Oct 27, 2010

Blog Oriented Design for the Web

All web sites are built in response to many needs

At its core, the web is an information medium. People visit sites in response to a need for information, and a site's success depends on how well it meets those needs. Note that the information can be "hard", such as flight times from London to Boston, or "soft", such as an impression of the quality of a company's product or service. Your site might hold information that people want, it might carry information you want to give them whether or not they are looking for it, or you may want to collect information from them. It might also be a channel for moving information between different users. (Or all of these things at the same time!) It's very important to understand that every kind of site exists to serve a variety of needs. Publishers have needs that lead them to publish (to earn money, to gather information, to promote a brand), and site visitors have needs of their own (to earn money, to succeed at work, to be entertained).

Pursuit of goals drives all behaviour

People visit web sites because they want to achieve something, a certain state, usually having got something or having done something.
As a commercial web site publisher, your business goals (strategic or tactical objectives) drive everything you do.
It’s your goals that influence whether you, as a web user, click on a particular link or take the time to look around a web page.
No-one goes on shopping sites for the fun of using the site’s interface. We do it to find bargains or to buy specific products. Those finds help us to feel a certain way (smart, fashionable, relaxed, excited). The site is simply a means to an end.

Case study: booking train tickets

There are many websites that allow me to book train tickets, but booking a ticket isn't really the goal I am trying to achieve when using these sites. I usually book train tickets late at night for travel the next morning. My goal is to get to sleep as soon as possible, relaxed in the knowledge that my ticket will be ready when I arrive at the station the next day. I achieve this goal by buying my ticket quickly and securely and getting confirmation that my order went through. Advertisers have long known that lifestyle goals drive most consumer spending decisions, from clothes to cars to bottled water. That's why advertising uses images of possible life states, the goals that consumers can reach through a purchase. Goal-oriented design is a design process that specifically and consciously sets out to let users achieve their goals.

Level Introductory through Advanced

It is required reading for anyone interested in interaction design, goal-oriented design and user interface design. It is written by Alan Cooper, one of my heroes and a founder of the discipline of interaction design, and Robert Reimann, another expert communicator who has worked with Alan Cooper on interaction design. About Face 2.0 is the second major edition of the essential text on creating clear and usable software. Its principles and examples apply to web pages and web applications as well as desktop applications. This book will give you a solid grounding in all aspects of interaction design, the basic discipline for tackling problems through goal-directed techniques and getting user interface design right. That does not mean it is a difficult book: it is very well written and easy to digest. I cannot recommend it enough.

Oct 20, 2010

Seo Addons for Drupal, Joomla and Wordpress

Drupal, Joomla and Wordpress are very popular and powerful content management systems. They are all written in PHP, support the MySQL DBMS and are built on a modular architecture that allows for extending them with additional functionality.

There are hundreds of free add-ons (modules, components, plugins) available for each of these three systems. Some of them are especially useful when it comes to optimizing your website for search engines.

Below you'll find a selection of SEO-related add-ons for Drupal, Joomla and Wordpress, listed in alphabetical order, including tools for search engine friendly URLs, meta tag generation, social bookmarking services, performance tuning, user tracking and statistical analysis. Add-ons related to search engine friendly URLs usually require mod_rewrite to be enabled on your Apache web server.

Have fun playing around with these tools and bear in mind that none of them is a replacement for the most powerful SEO technique, which is creating valuable content.

Drupal Modules

1. Seo Friend is meant to be used alongside existing Drupal SEO modules to make them more effective. This module does not replace functionality available in the SEO Checklist and SEO Compliance Checker modules.
2. Seo Compliance Checker checks node content for search engine optimization whenever it is created or modified. Each time a publisher creates or modifies a node, the module performs a set of checks and gives the user feedback on compliance with the rules.
3. Seo Checklist provides a checklist of Drupal SEO (Search Engine Optimization) best practices to help you maximize the presence of your Drupal website in major search engines like Google, Yahoo and Bing.
4. Block Cache creates cached versions of each block displayed on your website. Only one SQL query is executed for each cached block, which reduces server load and results in better performance and faster delivery of content.
5. Find URL Alias is a utility module that lets you search for particular URL aliases. This module requires the core path module which enables search engine friendly URLs for your site. Find URL Alias is especially useful when there are many aliases stored in your database. But keep in mind that changing URLs is something you should avoid.
6. Google Analytics integrates Google’s powerful web statistics tracking system with your website. The module enables you to selectively track users by role.
7. Meta tags allows you to define site-wide meta tags and specific tags for each piece of content (node). If you use the core taxonomy module (you really should), meta tags can be assigned automatically from the terms (tags) you use to categorize your content.
8. Pathauto automatically generates path aliases based on the module's settings. A very powerful module that lets you define different URL patterns based on content types. The latest version can also filter out common words. Requires the core path module.
9. RobotsTxt is useful if you run multiple Drupal sites from a single code base and want to have different robots.txt files for each of them.
10. Service links automatically adds links to social bookmarking and blog search services to your content. You can select which services you want to link to, restrict the display based on content (node) types and whether to display links in teaser and/or full page view.
11. URLify automatically generates the path alias for a piece of content based on its title using JavaScript. Requires the core path module. A lightweight alternative to Pathauto.
12. XML Sitemap generates an XML sitemap which complies with the sitemaps.org specification. The relative priority of each piece of content is calculated based on content type, number of comments, and promotion to front page. The values of each of these factors can be set in the admin section of the module.
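
To make this concrete, here is a minimal Python sketch of such a generator. It is not the module's actual code: the page list and priority values are hypothetical placeholders, and the script simply writes a sitemaps.org-compliant sitemap.xml for them.

    import xml.sax.saxutils as saxutils

    # Hypothetical pages as (URL, priority) pairs. The Drupal module derives
    # priority from content type, comment count and front-page promotion;
    # here the values are hard-coded for illustration.
    pages = [
        ("http://example.com/", 1.0),
        ("http://example.com/about", 0.5),
        ("http://example.com/blog/dog-toys", 0.8),
    ]

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, priority in pages:
        lines.append("  <url>")
        lines.append("    <loc>%s</loc>" % saxutils.escape(url))
        lines.append("    <priority>%.1f</priority>" % priority)
        lines.append("  </url>")
    lines.append("</urlset>")

    # Write the finished sitemap where crawlers expect to find it.
    with open("sitemap.xml", "w") as f:
        f.write("\n".join(lines))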

Joomla Modules, Components, Plugins

1. Content Search Seo Plugin is based on the official search content plugin and adds some SEO features. It generates the page title, meta description and meta keywords dynamically, according to the search keywords and search results.
2. SEO Canonicalisation Plugin enables site ‘canonicalisation’, which means that when a user hits your site under a domain other than your preferred one, they are redirected to the preferred domain.
3. SeoSimple simply takes the starting chunk of text from the page’s content and applies that as the value for the meta description tag in the page’s head.
4. Seo Generator automatically generates keywords and description by pulling text from the title and/or the content to help with SEO.
5. Seo Keyword link is very useful for improving the SEO of the internal pages of your site. You can define a limit on the number of links per keyword, choose nofollow or dofollow, and open links in the same window or a new window.
6. Advanced SEF Bot for Joomla 1.1.x is a plugin that enables search engine friendly URLs. Since Joomla’s standard URLs are not really meaningful this one is a must.
7. Dynamic gSitemap is a PHP script that dynamically creates an XML sitemap of your site when GoogleBot visits it.
8. JoomSEO is a plugin that dynamically creates meta tags, changes the title tag on the fly, adds heading tags to content titles and more. Some of the configurable features include: show, hide or override keywords, site name, and content title, adjust element order in the title and heading tag selection.
9. LinX is a link exchange component that lets you manage a reciprocal link directory. Links that are submitted to your site are automatically checked and only added if there is a link back to your site. A large number of links to your website certainly has an effect on PageRank, but what really matters are links from quality, trusted websites that have a target audience similar to your site's.
10. MetaTags NX creates meta description and keyword tags for your content on the fly. Keywords are generated based on their frequency in the content, and stopwords can be excluded (see the sketch after this list).
11. Redirect component lets you redirect old URLs to new ones and set their status codes. Remember, changing URLs often is a bad idea.
12. Website Validators Tool contains links to validation and site information services that help you make your website more standards-compliant, see how you rank, see who links to you, and more.
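
The frequency-based keyword generation that MetaTags NX performs is easy to illustrate. Below is a minimal Python sketch of the general idea, not the component's actual code; the stopword list is a tiny made-up sample.

    import re
    from collections import Counter

    # A tiny illustrative stopword list; a real component ships a much larger one.
    STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "are", "it", "for"}

    def extract_keywords(text, count=5):
        # Lowercase the text and split it into alphabetic words.
        words = re.findall(r"[a-z]+", text.lower())
        # Drop stopwords and very short words, then count what remains.
        freq = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
        # The most frequent terms become the meta keywords.
        return [word for word, _ in freq.most_common(count)]

    content = "Jim's Pet Shop sells dog toys. The dog toys in the shop are durable toys."
    print(", ".join(extract_keywords(content)))  # toys, dog, shop, ...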

Wordpress Plugins

1. All in One SEO Pack optimizes your Wordpress blog for search engines.
2. Seo Title Tag optimizes the title tags across your WordPress-powered blog or website. Not just your posts, not just your home page, but any and every title tag on your site.
3. Seo Slugs removes common words like ‘a’, ‘the’, ‘in’ from post slugs to improve search engine optimization (see the sketch after this list).
4. HeadSpace2 SEO is an all-in-one meta-data manager that allows you to fine-tune the SEO potential of your site.
5. Seo Friendly Images is a Wordpress optimization plugin that automatically updates all images with proper ALT and TITLE attributes. If your images do not have ALT and TITLE set already, SEO Friendly Images will add them according to the options you set. Additionally, this makes the post W3C/XHTML valid as well.
6. Seo Post Link makes your post link short and SEO friendly. It removes common words that are unnecessary for search engine optimization of your blog post link.
7. Seo Smart Links provides automatic SEO benefits for your site in addition to custom keyword lists, nofollow and much more.
8. Platinum Seo Pack optimizes your Wordpress blog for search engines (Search Engine Optimization).
9. Simple Submit Seo / Social Bookmarking Plugin is for adding submission links for Digg, Delicious, Buzz, and Stumble to pages and posts.
10. Automatic Seo Links frees you from placing your links manually: just choose a word and a URL, and this plugin will link all matches in the posts of your blog.
11. Seo for Paged Comments reduces SEO problems when using WordPress’s paged comments.
12. Seo No Duplicate helps you easily tell the search engine bots the preferred version of a page by specifying the canonical properly within your head tag.
13. 404 Seo Plugin gives you a customized, smart ‘Page Not Found(404)’ error message. It will automatically display links to relevant pages on your site, based on the words in the URL that was not found.
14. Add to Any adds links to a large number of social bookmarking sites to your posts.
15. GeneralStats is a statistics component that counts the number of users, categories, posts, comments, pages, words in posts, words in comments and words in pages. Useful for doing keyword research.
16. Google Sitemap Generator creates an XML sitemap of your website. The current version supports the homepage, posts, static pages, categories and archives. Priority is automatically assigned based on the number of comments.
17. Gregarious is a social bookmarking plugin for Digg, Reddit and Feedburner with update checks via AJAX.
18. Popularity Contest is a counter for posts, categories, archive views, comments, trackbacks, etc. to determine the most popular pages of your site.
19. Technorati Tagging Plugin adds Technorati tags to your posts and enables you to display a tag cloud.
20. WP-Cache is a page caching system that improves your website's performance. Cached pages are stored as static files, reducing server load and thus making your site faster and more responsive.
21. X-Valid attempts to convert posts and comments to valid XHTML. Read more on the benefits of Web Standards Compliance.
22. Search Engine Query tries to reduce the bounce rate of your blog and give visitors a better navigation experience.
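
Stripping stopwords from slugs, as Seo Slugs does, takes only a few lines. Here is a minimal Python sketch of the technique, an illustration rather than the plugin's code; the stopword list is a small made-up sample.

    import re

    # Small illustrative stopword list; the real plugin's list may differ.
    STOPWORDS = {"a", "an", "the", "in", "on", "of", "and", "or", "to"}

    def seo_slug(title):
        # Lowercase the title, keep alphanumeric words, drop stopwords, join with hyphens.
        words = re.findall(r"[a-z0-9]+", title.lower())
        return "-".join(w for w in words if w not in STOPWORDS)

    print(seo_slug("How to Get More Links to Your Website"))
    # -> how-get-more-links-your-website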

If you know other free SEO related add-ons for Drupal, Joomla and Wordpress that should be mentioned here simply post links to them in your comments. I’d like to read about your experiences using these add-ons.

The content of this post comes from seoaddons, so thanks for that.

How to Create Amazing Backlinks

Every so often, frustrated link builders post that they have tried everything to build quality backlinks: free directories, profile links, article submissions, comment spam, and so on. But nothing works. Despite all their low-value work, they cannot jump over their competitors in the SERPs, so they plead with others to share their secrets for building incredible, super fantastic backlinks. The problem is that these secrets do not exist. There are no magic shortcuts, no highly classified insider hacks for getting quality links to your website.

Building backlinks requires real effort, but that is the last thing these forum bums want to hear. They want a quick fix, and creating something of value is much harder than slapping a signature onto a forum post. So the message often falls on deaf ears.



The first step in getting backlinks is to discover the pages that already have great backlinks, and the patterns behind them. The logic is that good content attracts links: site owners feel compelled to share it with their audience. So in this first step you look for pages with lots of links, because those links show the content has proven its worth. To see which blog posts have attracted incoming links, follow this process. Note: throughout this blog post we will use the hypothetical example of Jim's Pet Shop, an online pet store looking to attract links, traffic and attention for its line of dog toys.

Oct 6, 2010

What is the Meta Robots Tag?

Have you ever wondered what that robots tag in your website is for? Maybe you're using Wordpress and you stumble upon a certain unfamiliar tag that says <meta name="robots" content="index">. What the heck is it!? Is it a robot that automates your meta tags? Is it some piece of magical SEO markup? Does it summon the Google robot to your page?

The meta robots tag tells search engines what to follow and what not to follow. It is a piece of code in the <head> section of your webpage. It's a simple piece of code that gives you the power to decide which pages you want to hide from search engine crawlers and which pages you want them to index and look at.



Another function of the meta robots tag is to tell search engine crawlers which links to follow and which links to stop at. When you have a lot of links going out of your website, you should know that you lose some Google juice, and as a result your PageRank drops. So what you want to do is keep that juice to yourself on some of those links: you tell the search engine crawlers not to follow the links going out of your site, because in following them they take some of your Google juice with them.

If you don't have a meta robots tag, though, don't panic. By default, search engine crawlers WILL index your site and WILL follow links. Let me make it clear that search engine crawlers following your links is not bad at all. Losing some of your juice won't hurt your site much in exchange for getting the attention of the other websites you're linking out to. In fact, I don't recommend using nofollow at all if you don't have too many outbound links.

Basically, the meta robots tag boils down to four main functions for the search engine crawlers:

* FOLLOW – a command for the search engine crawler to follow the links in that webpage
* INDEX – a command for the search engine crawler to index that webpage
* NOFOLLOW – a command for the search engine crawler NOT to follow the links in that webpage
* NOINDEX – a command for the search engine crawler NOT to index that webpage

Pretty simple isn’t it? Now you’re telling yourself “Heck is that all? I thought it was some crazy program that takes years and years to study.”

Well, there are a few more commands for the meta robots tag, but these four are the MAIN functions. These four are what the meta robots tag is mostly used for.

If you ask me, the meta robots tag is one of those little things in your site's SEO that you can use to control your Google juice. I personally don't use noindex, but I do sometimes use nofollow. Don't ask why. It's personal. Haha!

An example of a meta robots tag would look like this:

<meta name="robots" content="index">

What this tag does is tell crawlers to index the webpage it sits on. It's like telling someone who's going to get a glass of water to get a glass of water, because, again, by default search engines already index your site even if you don't use this code.

And you can also combine the commands if you so desire:

"meta name=”robots” content=”noindex,nofollow”"

For me, this code is a good thing to keep in mind, especially if you're trying to save up Google juice by applying nofollow to your outbound links. Other than that, it's not something you need to keep checking while you're optimizing your on-site SEO.
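
If you want to check which directives a given page declares, you can parse its meta tags directly. Here is a minimal Python sketch using only the standard library; the URL is a placeholder, and a missing tag is reported as the index,follow default described above.

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class RobotsMetaParser(HTMLParser):
        # Remembers the content of any <meta name="robots"> tag it sees.
        def __init__(self):
            super().__init__()
            self.directives = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
                self.directives = attrs.get("content", "")

    html = urlopen("http://example.com/").read().decode("utf-8", "replace")
    parser = RobotsMetaParser()
    parser.feed(html)
    # No tag at all means crawlers fall back to index,follow.
    print(parser.directives or "index,follow (default)")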

Tips for Keeps: We all want to know all the little things about SEO. This is something that might help in the future so try to remember it. This code isn’t developed for nothing. The most skilled SEOs know how to use this best.

Oct 4, 2010

How to Get More Links to Your Website...


1. Blog Comments – find blogs that are relevant to your industry and business and post appropriate comments with a link back to your website.

2. Directories – there are gazillions of websites that simply list the URLs of other websites (rather like the Yellow Pages, but online). Submit a link to your site on the ones that have a good PageRank, and avoid the ones that ask for payment or demand reciprocal links.

3. Articles – if you blog regularly or write articles, then use that great content and submit your articles to online article sites. Again, choose article sites that have a high ranking, like EzineArticles.

4. Events – if your business is a service that regularly hosts events, then list your events and link back to your site as you do so. Read this blog post for more.

5. Social Networks – use your social networking presence to link back to your site. These links count and are really easy to create.

6. Relationships – ask partners, clients and organisations you are a member of to place a link on their site back to yours.

7. Bookmarking – sites like Delicious or StumbleUpon allow you to submit sites that you like. Submit your own (carefully) to build valuable links back to your site. Read this article for more.

8. Press Releases - if your company regularly creates promotional pieces or press releases, use free press release sites: submit your press releases to them with a link back to your website. Here’s an article all about how to go about doing so.

9. Content - make sure that the content on your website or blog is great. The better it is the more likely people will find it and link back to it.

10. Advertising - consider advertising online: Craigslist can be a valuable way to do this – not only does it bring visitors to your site who might link to your content (see 9 above), but it also gives you the ability to link back to your site from another high-ranking site.

Oct 3, 2010

What is a Broken Link?

You are visiting a website that you have just discovered. The information contained in its pages is just what you need for a paper you are writing. While reading a page, you come across a note that a very important piece of information, the very one you need to make your paper insightful, is contained on another page. You click the link, but nothing! The link is dead and there is no other way for you to get to the page you need. The frustration mounts so much that you leave the website and vow never to go back to that frustrating site.

This is the kind of scenario that typically plays out on a website whose management has been lax and where broken links have become rampant. Among the many bad habits or mistakes a website can commit, having broken links is one of the most serious. Broken links bring with them so many negative effects that they can leave your online business or website reeling, and those effects are very hard to correct. Search engine bots are stopped dead by broken links because they think they have reached the end of the line. And as the scenario above illustrates, visitors will be turned off by a website with dead links because they will think the information is not available when, in fact, the data is there but has become inaccessible because of an error in the code.

A website that is filled with broken links suffers a lot in terms of damaged reputation. The website will be seen as unprofessional, and many may even think it is a shady operation. The same goes for the website owner, who may be seen as a person with a dubious reputation. These hits to your reputation alone can have very damaging effects on a business, and that is without counting the many visitors who will never return to the site because they were turned off by the broken links.

Webmasters and website owners should take it upon themselves to "clean house" diligently and regularly. All links should be actively checked for dead or broken hyperlinks. It is simply part of regular housekeeping: if you always keep your house clean, you should do the same with your website.

Unfortunately, many webmasters and website owners complain about the task of looking for broken links. They claim that it takes a lot of time to check each and every link to see whether it works. With the many responsibilities that webmasters and website owners have to juggle, it is quite understandable that checking for broken links does not rank high on their "to do" lists.

Fortunately, there are companies that realize checking for broken links can be quite a tedious task and have devised methods to make it easier and less time-consuming.

For example, xml-sitemaps.com has programmed a standalone script that not only creates sitemaps but also looks for broken links on a website and then tells webmasters or website owners which links are broken and which pages they appear on. Automating the task of checking for broken links is a great time saver for webmasters and website owners.
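
If you prefer to roll your own, a basic checker takes little code. The following Python sketch (standard library only; the start URL is a placeholder) fetches a single page, collects its links and reports any that do not answer with HTTP 200:

    from html.parser import HTMLParser
    from urllib.error import HTTPError, URLError
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        # Gathers the href of every <a> tag on the page.
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            href = dict(attrs).get("href")
            if tag == "a" and href:
                self.links.append(href)

    page = "http://example.com/"
    collector = LinkCollector()
    collector.feed(urlopen(page).read().decode("utf-8", "replace"))

    for link in collector.links:
        url = urljoin(page, link)  # resolve relative links against the page URL
        if not url.startswith("http"):
            continue  # skip mailto:, javascript:, and similar non-HTTP links
        try:
            status = urlopen(url).getcode()
        except HTTPError as err:
            status = err.code
        except URLError:
            status = None  # DNS failure, refused connection, etc.
        if status != 200:
            print("BROKEN: %s (status %s)" % (url, status))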