Dec 17, 2010

Most Important Item in SEO: Keyword Density

1. Choosing the Right Keywords to Optimize For

It seems like centuries ago that you could easily top the results for a one-word search string. Now that the Web is so densely populated with sites, it is next to impossible to achieve consistently top rankings for a one-word search string; achieving constant top ratings for two-word or three-word search strings is a more realistic goal. If you examine closely the dynamics of search results for popular one-word keywords, you might notice that a site can easily be in the first ten results one week and out of the first 30 the next: the competition for popular one-word keywords is so fierce that other sites quickly replace you.

Of course, you can include one-word strings in your keywords list, but if they are not backed up by more specific expressions, do not dream of high rankings. For instance, if you have a site about dogs, “dog” is a mandatory keyword, but if you do not optimize for more phrases, like “dog owners”, “dog breeds”, “dog food”, or even “canine”, success is unlikely for such a popular keyword. The examples given here are by no means the ultimate truth about how to optimize a dog site, but they are good enough to show that you need to think broadly when choosing your keywords.

Generally, when you start optimization, the first thing to consider is which keywords best describe the content of your site and are most likely to be used by users to find you. Ideally, you know your users well and can guess correctly what search strings they are likely to use. One issue to consider is synonyms: very often users will use a different word for the same thing. In the dog site example, “canine” is a synonym, and some users will certainly search with it, so it does not hurt to include it now and then on your pages. But do not rush to optimize for every synonym you can think of; search engines themselves have algorithms that include synonyms in the keyword match, especially for languages like English.

Instead, think of more keywords that are likely to be used to describe your site. Thinking thematically is especially good, because search engines tend to rate a page higher if it belongs to a site whose overall theme fits the keyword string. In this respect it is important that your site is concentrated around a particular theme – i.e. dogs. It might be difficult to think of all the relevant keywords on your own, but that is what tools are for. For instance, the Website Keyword Suggestions Tool below can help you see how search engines determine the theme of your website and which keywords fit that theme. You can also try Google's Keyword Tool to get more suggestions about which keywords are hot and which are not.

2. Keyword Density

After you have chosen the keywords that describe your site and are supposedly of interest to your users, the next step is to make your site keyword-rich and to have good keyword density for your target keywords. Keyword density is a common measure of how relevant a page is. Generally, the idea is that the higher the keyword density, the more relevant to the search string a page is. The recommended density is 3-7% for the major 2 or 3 keywords and 1-2% for minor keywords. Try the Keyword Density Checker below to determine the keyword density of your website.
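
If you want to check the math yourself, the usual formula is: keyword density = (number of times the keyword appears ÷ total number of words on the page) × 100. For example, a 400-word page that uses the phrase “dog adoption” 12 times has a keyword density of 12 ÷ 400 = 3% for that phrase, right inside the recommended range for a major keyword.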

Keyword Density Checker


Although there are no strict rules, try optimizing for a reasonable number of keywords; 5 or 10 is OK. If you attempt to optimize for a list of 300, you will soon see that it is simply not possible to maintain a good keyword density for more than a few keywords without making the text sound artificial and stuffed. And what is worse, there are severe penalties (including a ban from the search engine) for keyword stuffing, because it is considered an unethical practice that tries to manipulate search results.
3. Keywords in Special Places

Keywords matter not only in quantity but in placement as well: keywords in the page title, the headings, and the first paragraphs count for more than keywords at the bottom of the page. The reason is that the URL (and especially the domain name), file names and directory names, the page title, and the headings of the separate sections are more important than ordinary text on the page. Therefore, all else being equal, if you have the same keyword density as your competitors but also have keywords in the URL, this can boost your ranking considerably, especially with Yahoo!.
a. Keywords in URLs and File Names

The domain name and the whole URL of a site tell a lot about it. The presumption is that if your site is about dogs, you will have “dog”, “dogs”, or “puppy” as part of your domain name. For instance, if your site is mainly about adopting dogs, it is much better to name your dog site “dog-adopt.net” than “animal-care.org”, for example, because in the first case you have two major keywords in the URL, while in the second one you have no more than one potential minor keyword.

When hunting for keyword-rich domain names, don't get greedy. While from an SEO point of view it might seem better to have 5 keywords in the URL, just imagine how long and hard to memorize that URL would be. You need to strike a balance between keywords in the URL and site usability, which says that more than 3 words in a URL is too much.
You will probably not be able to come up with tons of good suggestions on your own. Additionally, even if you manage to think of a couple of good domain names, they might already be taken. In such cases a domain name suggestion tool can come in very handy.

File names and directory names are also important. Often search engines will give preference to pages that have a keyword in the file name. For instance http://mydomain.com/dog-adopt.html is not as good as http://dog-adopt.net/dog-adopt.html but is certainly better than http://mydomain.com/animal-care.html. The advantage of keywords in file names over keywords in URLs is that they are easier to change, if you decide to move to another niche, for example.
b. Keywords in Page Titles

The page title is another special place, because its contents usually get displayed in search engine results (including Google's). While the HTML specification does not require you to write anything in the <title> tag (you can leave it empty, and the title bar of the browser will read “Untitled Document” or similar), for SEO purposes you should not leave the <title> tag empty; write the page title in it instead.

Unlike URLs, with page titles you can get wordy. To go on with the dog example, the <title> tag of the home page of http://dog-adopt.net could include something like <title>Adopt a Dog – Save a Life and Bring Joy to Your Home, Everything You Need to Know About Adopting a Dog</title>, or even longer.
c. Keywords in Headings

Normally, headings separate a page into related subtopics. From a literary point of view it may be pointless to have a heading after every other paragraph, but from an SEO point of view it is extremely good to have as many headings on a page as possible, especially if they contain the keywords.

There are no technical length limits for the contents of the <h1>, <h2>, <h3>, ... tags, but common sense says that overly long headings are bad for page readability. So, as with URLs, you need to be wise about the length of headings. Another issue to consider is how the heading will be displayed. A Heading 1 (<h1>) generally means a larger font size, and in this case it is advisable to have fewer than 7-8 words in the heading; otherwise it might spread over 2 or 3 lines, which does not look good and should be avoided if possible.
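
To make this concrete, here is a minimal sketch of a keyword-rich heading structure, reusing the hypothetical dog-adoption example from earlier:

<h1>Adopt a Dog – Save a Life</h1>
<h2>Choosing a Dog Breed</h2>
<p>...</p>
<h2>A Dog Adoption Checklist for New Dog Owners</h2>
<p>...</p>

Each heading works a target keyword (“dog”, “adopt a dog”, “dog breeds”, “dog owners”) into a short, readable line.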

Dec 15, 2010

How People Share Content

We all know it’s good to share, right? Silicon Alley Insider recently shared some information with us on How People Share Content on the Web. The article took a look at some recent data released by AddToAny, a company that develops widgets that enable users to share content across various social media and other Web communication channels. It should come as no surprise that Facebook (24%), email (11.1%), and Twitter (10.8%) are among the most popular methods for sharing content on the Web today. This is significant information for any Web publisher or other organization that seeks to provide content for its users to share across the Web.

I think this information continues to support the argument that organizations need to open up more of their content for users. As you shift from free to paid content business models, you reduce the potential for information to be shared across all these channels. It's clear that we're in a sharing economy today when it comes to information. Unlike a child that's overly protective of his or her toys, it appears most of us grown-ups are comfortable with sharing. If you take a closer look at the data from AddToAny, you'll also see that social bookmarking and user-generated news sites like Delicious, Digg, StumbleUpon and Reddit are among the top sources for sharing information online. If you're not using a widget like AddToAny or Sociable (which we use on Journalistics), you could be missing a huge opportunity to share your content with a much larger audience.

Will Sullivan recently wrote a great post on this topic for Poynter Online that discussed it in greater detail. In his article, Will looks at information provided by search engine marketing guru Danny Sullivan (no relation) suggesting that Twitter traffic numbers can be underreported by analytics tools like Google Analytics. I've found this to be true with this blog. I regularly track the number of clicks and “retweets” generated from posts I share through Twitter. In the case of clicks, I find the trackable URL shortening services I use (Tr.im and Bit.ly) often report more referrals than Google Analytics, often at a ratio of 3:1. I think this is important to point out, since Twitter can be a huge source of referrals to your Web content. In the case of this blog, Twitter is consistently a top referral source, despite the numbers being underreported.

Back to the point of free versus paid content. Is it possible for users to share links to your premium content? Yes. Will it result in more people paying for subscriptions to your content? I have no doubt. However, you’ll reach a much larger audience if you make your content free for sharing. As a result, you’ll continually grow your audience. That much larger audience creates more opportunities for generating revenue than subscriptions alone. There are countless examples of this approach working for publishers of Web content. In my case, I know it’s true. We continually see an increase of more than 100% month-over-month in readers of our Web content. If I started to charge for content, I have no doubt our audience would atrophy and people would stop sharing our content. A larger audience has far more value than a smaller paid subscriber audience.

As a consumer of online content, I am also an active sharer of information I find interesting. I am 10 times more likely to share content I know my audience can easily access and share without a subscription. It’s the classic pay-it-forward model. When I find great information on a subscription site, I don’t share it. It’s too much of a burden on my audience to jump through hoops to read the article. I realize organizations need to find new business models that will help them achieve sustainability, I just can’t help but think paid content models will lose big in this sharing economy we’re in today.

Source...

Essential Tools for Marketing

Press Release Distribution Services

Marketwire
The most bang for your buck from an actual wire service, Marketwire’s prices are lower than PR Newswire and Businesswire. This newer service is built for powerful online exposure, and you’ll enjoy the full online distribution with any geographical AP wire distribution. (Sometimes you can get statewide wire distribution for nearly the same cost as only your local metropolitan area.) It’s great for building inbound links – just choose the SEO Enhanced option.

PRLog
PRLog is a good-performing free press release distribution outlet; its press releases rank really well, and for a really long time, if they are written with SEO copywriting best practices. Press releases include three links, though they are URL-based (starting with http) rather than anchor text. PRLog also lets you create your own newsroom where all your press releases reside, as well as an “about us” page and product showcase area. See ours here.

PitchEngine
A new PR-for-social-media site that promises to let you create and share press releases easily and for free and syndicate content to Google News. Lets you include HTML in your press release, so you can use keyword text anchor links. The site is marketing itself quite aggressively and will likely build a big presence quickly. The only catch is your release will disappear off the site after 30 days if you aren’t a paid member ($50/month for your own press room).
Social Networks

Facebook
Create a page for your business. Feed your blog in. Start a group. Get fans. Advertise to targeted users if your products appeal to the Facebook crowd (which is basically everyone nowadays). See using the new Facebook business page layout to learn more. Stop by our page and become a fan, too!

MySpace
Take a second look at this medium for social networking. According to MarketingProfs, more than half of MySpace.com users are 35 or older. Explore using MySpace for your business.

LinkedIn
Like a virtual Rolodex. Build your professional profile, link up with other professionals, join groups or even start a group. Participating in Q&A’s related to your profession is a great way to build credibility and visibility.

Ning
Build your own social network around your business. You may even get your network into the search engine results pages. Learn more about using Ning for business.

Read “Utilize Social Media to Gain Additional Exposure for Your Site” for more information about social networks and how they can drive targeted traffic to your site.
Social Bookmarking

Digg
Getting your content on the home page of Digg is one way to bump up your web site’s traffic by thousands within minutes. This can result in valuable links to your site. Start with this beginner’s guide to Digg.

StumbleUpon
Build friends and send them your articles to rate. More thumbs up will get your article shown to more people outside your network and can result in thousands of visitors every day. Tips for using StumbleUpon.

Reddit
Even if your content gets buried on Digg, it can flourish on Reddit – which can be a catalyst for jumping to the home page of other social bookmarking sites. Learn more about the types of topics that do well on Reddit.
Email Marketing

AWeber
AWeber makes it easy to start building your email marketing list, if you haven't already. For less than $20/month, you can build unlimited newsletter lists, send unlimited email blasts, and send unlimited autoresponder messages to up to 500 subscribers per list. (Then it's $29/month up to 2500 subscribers.) It also offers a recurring 30% commission, a pretty good affiliate program for a service you'll appreciate enough to recommend to others. (Disclosure note: the link above is our affiliate link. We've been using the service for 3 years now, after trying out Constant Contact and researching about 20 other providers! Most either do autoresponders or email blasts/newsletters, not both.)
Blogs

Your own blog
Write great content relevant to your business area that people will find useful. Use it to link to deep pages on your site to help them get indexed in the search engines. Build your thought leadership and let your customers get to know your business better. Try WordPress for an easy-to-use platform that’s also search engine friendly.

Others’ blogs
Read and comment on other blogs in your industry. Use your comment signature to link back to your blog or web site. Build relationships online and spread the link love from your own blog to others’.

Twitter
Micro-blogging. Update your status daily or a couple times a week. Use keywords in your posts and profile to help gain followers on Twitter quickly. Link to your unique content in your updates and take advantage of the multitude of new applications created to help you manage your Twitter experience.
Affiliate Marketing

Post Affiliate Pro
Traditionally links generated through affiliate marketing have not been helpful for search engine optimization - until Post Affiliate Pro, that is. This easy-to-use affiliate program lets you set up a referral program in minutes and keeps your links simple and search engine friendly.
Directories

The Open Directory
A staple of the SEO crowd, this directory can be tricky to get into but well worth it for the link juice it passes along to your site.

Yahoo! Directory
This directory will set you back a couple hundred bucks a year, but it's well worth it for inclusion.

Best of the Web Directory
This directory has been around a while and can pass along some good PageRank to your site.

Niche directories
Find the directories in your industry that pass along link juice. Some examples to get you started: www.sbdgraphics.com for ad agencies, web developers, printers and other graphics professionals, www.sbdpro.com for small businesses and businesses that serve them, www.cpapro.com for the accounting industry, www.seoalpaca.org for alpaca breeders, and so on.

Technorati
Claim your blog at Technorati to make sure it’s indexed in the blog search engines and have your updates broadcast across the network.
Your Own Web Properties

Create a knowledge center
Build a content area on your site where you can add articles regularly. This can be as formal as white papers or case studies, but it can also work with less formal articles, as long as they further your company’s thought leadership position and credibility. They will also boost your search engine rankings if you contribute regularly and ensure your site architecture is optimized. See how we’re doing this with SEO Advantage’s new knowledge center.

PowerReviews
People are going to look up user reviews whether on your site or elsewhere – might as well take advantage of the user-generated content for additional search engine visibility. You’ll also rank higher in trust with your efforts at transparency.

SurveyMonkey
Voting/polling/surveys. A tool of engagement that lets you gauge interest, measure customer satisfaction or just provide some fun. (People love to give their opinions.)

Awards & Contests
Enter them and host them. Winning an industry award can add to your credibility, and giving them out can get you lots of press coverage and links. Get creative.

Search Engine Optimization
Make sure your site architecture is optimized for maximum exposure in the search engines. Create link-building campaigns. Add to your content regularly. Enlist the services of a search engine optimization company to guide your efforts or handle implementation completely.
Research

Google
Designate the keywords you’d like to keep track of, and Google will send you alerts of news and pages indexed on those topics “as it happens” with Google News Alerts. You can also enter multiple terms at Google Trends to compare the general level of buzz around those terms.

AllTop
An “online magazine rack”, this site aggregates content and organizes it. Select your topics and have the latest content delivered to you, or just browse the site when you’re in research mode. Also, be sure to submit your site so it shows among the related content.

WordTracker
Find keywords with a more comprehensive tool than free pay-per-click research tools, which may skew results. A free trial will give you an idea of how it works.

Yahoo! Pipes
See who’s talking about your company or brand by pulling together RSS feeds from different sources around the Web using the beta Yahoo! Pipes. This handy video shows you how.

Analytics

You need to measure all your online marketing efforts. If you’re looking for a free analytics package, try Google Analytics or Yahoo! IndexTools. (Which is better?) However, you’ll probably want some help implementing and interpreting your analytics package to get the most from it – ask your SEO company if they offer this.
About SEO Advantage

SEO Advantage, Inc. is an online marketing/search engine optimization firm that helps businesses turn their web presence into a powerful revenue generation medium. Our clients enjoy dominance on Google, Yahoo and Bing through a suite of unique pay-for-performance search engine optimization and online marketing services implemented by a multidisciplinary team of SEO engineers, copywriters, and web designers. You'll find us referenced in books such as Writing Web-Based Advertising Copy to Get the Sale and the BusinessWeek bestseller The New Rules of Marketing & PR, as well as the popular ebook The Small Business Blogging Blueprint. Find out more today. Visit www.seo-advantage.com, call 1-800-366-1639 or email us here.

Source....

Dec 11, 2010

Canonical URL

A webpage can be accessed through two or more different URLs. The most common case is the www vs non-www URL. For example, http://domain.com/about.htm and http://www.domain.com/about.htm will (on every website I have encountered) lead to the same web page. Other things can also give a web page more than one valid URL (“valid” meaning that typing the URL leads to a specific page), such as session IDs and trailing slashes, but those will be discussed in future posts.

How Does it Operate?

The canonical tag is part of the HTML header of a web page, the same section where you'd find the title tag and the meta description tag. In fact, the tag isn't new; like nofollow, it simply uses a new rel parameter.
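
For example, placing the following line in the <head> section of a duplicate page declares www.seomoz.org/blog (the URL used in the original announcement of the tag) as the preferred version:

<link rel="canonical" href="http://www.seomoz.org/blog" />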

This would tell Yahoo!, Live & Google that the page in question should be treated as though it were a copy of the URL www.seomoz.org/blog and that all of the link & content metrics the engines apply should technically flow back to that URL.

Nov 30, 2010

History of SEO



In this article I will cover the experimental beginnings of the Internet, the commercialization of this technology today, and developments that threaten the future of the Internet.
Before talking about SEO, I would like to define what the Internet is, who governs it, and what the economic impact of this technology is. The Internet is made up of all the data networks using IP that operate as a seamless network for their collective users. [Krol 3] This means that federal, commercial and institutional networks all compose parts of the Internet. The network is connected by telephone lines, cable lines or satellite links, which carry signals from server computer to server computer until the electronic data reaches the destination computer. Among the bodies that oversee the Internet is the Internet Society (ISOC), whose purpose, according to Ed Krol, is to promote the global exchange of information through Internet technology. [Krol 4] Another public body of the Internet is the Internet Architecture Board (IAB). [Krol 5]
The concept of SEO was not born until around 1996, and the industry has grown steadily since then into what it is today. At the time, the Internet was only just finding its rhythm: downloading a website over a 56k modem was a test of patience, and in the United Kingdom the Internet was just beginning to be marketed to the public. Until 1996 the field was considered small. In 1996-97 the only search engine of any fame was AltaVista; Google was not yet known or used, and search was still a very narrow concept. Then Google appeared, and now Google is the dominant source for searching eyes.
Virtually all good things come to an end. The strategies that were once standard in Search Engine Optimization (SEO) finally ended when individuals with fewer scruples than a box of cornflakes manipulated the system to their advantage. Some of the early “optimization” methods amounted to simple alphabetical listings, like a phone book, but by 1996 website programmers were trying their luck at placing specific keywords in particular parts of a page and were learning manipulations that had some influence on a site's ranking. By 1997, the variety of algorithms in use allowed programmers to crack the ranking code at engines such as Excite, and these programmers were able to produce results for their clients at will. What began as a way of helping clients ultimately led to abusive schemes that sought to corner the rankings to the exclusion of any other competitor.

Nov 3, 2010

How to Automatically Track Your Google Positions in Microsoft Excel

If you have a website, it is a good idea to keep track of your search engine positions for the key phrases people search for that bring you traffic. For SEO professionals this is especially true.

Some people use commercial products for this task, others use online services, but there is a free way to do it. Well, free if you already own a copy of Microsoft Excel, that is!

Before we get into the solution, my apologies to the guys at Webmaster World for singling you out. That particular website ranks nicely in all the test terms, so it became useful to use as an example!

How to Use the Spreadsheet

[Screenshot: the rankings spreadsheet]

  1. If you want to use this spreadsheet, first download it from here and open it up in Excel.
  2. Write your website's host name in cell B2.
  3. Enter your search terms in cells B4, C4, D4, and so on (as many as you like).
  4. Select the example rows 5-12 and press Del to clear the contents. (If you use the right-click delete approach the chart will become smaller, so resize it back.)
  5. Press Ctrl-Shift-U to refresh the results.

How it Works

If you look at the macro code, there is a fair amount going on under the hood, but it is fairly straightforward.

Essentially what we are doing here is “scraping” the search result then looking within the returned content for specific strings (our links). This approach can be used for a lot of useful purposes so it is worth investigating.

[Screenshot: the macro code]

The Macro Code

The main macro subroutine AddCurrentRankingsRow first retrieves the website URL, and locates the data in the sheet. Then it adds a new line for today's date, and works on the term columns:

term = sheet.UsedRange.Cells(4, col).Text

rank = GetCurrentRanking(term, myurl, 3)

sheet.Cells(newRow, col).Formula = rank

For each term column, the subroutine fetches the term itself, then looks at the SERPs (Search Engine Result Pages) to find the rankings of our website, and finally writes it to the respective cell. The subroutine GetCurrentRanking figures out the ranking by iterating the SERPs as long as our website does not appear in the results. When our website appears in the results, it calculates and returns the ranking:

While pagenum < maxPages  ' maxPages is an assumed page-depth limit; the loop condition was garbled in the original post
    pagenum = pagenum + 1
    url = BuildSERPURL(term, start)   ' build the SERP URL for this term, starting at result number "start"
    page = FetchPage(url)             ' fetch the SERP's HTML
    If FindRank(page, myurl, count) Then
        GetCurrentRanking = start + count   ' found our site: rank is the page offset plus its position on this page
        Exit Function
    End If
    start = start + count             ' not found: move on to the next page of results
Wend

GetCurrentRanking uses three handy but simple utility functions:

  • BuildSERPURL - This generates the URL of a SERP for a specific term, starting at a certain result number.
  • FetchPage - Uses Microsoft's WinHttp library to do an HTTP ‘GET’ request and fetch the SERP's HTML contents.
  • FindRank - Finds the position of our website in the organic results in a page.

FindRank is specific to Google results. It disregards the paid advertisements and counts result links. The organic result links are in the form …

...

… so the function just extracts the URL from those links. This function can be easily adapted to other search engines like Bing or Ask.com, but it will require some programming tweaks to work.

How to Run the Macro Automatically

You may want to run the macro automatically, without the need to press Ctrl-Shift-U. In order to do so, add the following subroutine after all the code:

Private Sub Workbook_Open()

AddCurrentRankingsRow

End Sub

This will run the macro every time you open the file, which means you will always see the most updated data.

All of the above information is from Search Engine People.

Oct 27, 2010

Blog Oriented Design for the Web

All web sites are built in response to many needs

Basically, the Web is an information tool. People publish things in response to information needs, and success depends on meeting those needs. Note that the information can be “hard”, such as flight times from London to Boston, or “soft”, such as the perceived quality of a company's product or service. Your site might hold information people want, information you want them to have whether or not they are looking for it, or you may wish to obtain information from them; it might also be a way to transfer data between different users (or all of these things at the same time!). It is very important to understand that all types of sites try to satisfy a variety of needs. Publishers have needs that lead them to publish (to earn money, to gather information, to promote a brand). Site visitors have needs too (to earn money, to succeed at work, to be entertained).

Pursuit of goals drives all behaviour

People visit websites because they want to achieve something: to reach a certain state, usually by getting something or getting something done.
As a commercial web site publisher, your business goals (strategic or tactical objectives) drive everything you do.
It’s your goals that influence whether you, as a web user, click on a particular link or take the time to look around a web page.
No-one goes on shopping sites for the fun of using the site’s interface. We do it to find bargains or to buy specific products. Those finds help us to feel a certain way (smart, fashionable, relaxed, excited). The site is simply a means to an end.

Case study: booking train tickets

There are many websites that let me book train tickets, but booking is not the goal I am trying to achieve when I use these sites. I usually buy train tickets late at night for travel the next morning. My goal is to get to sleep as soon as possible, relaxed in the knowledge that my ticket will be ready when I arrive at the station the next day; a site achieves this goal for me by letting me book my tickets quickly and safely and giving me feedback that my request succeeded. Advertisers have long known that lifestyle aspirations drive most consumer spending decisions, from clothes to cars to bottled water. That is why advertising uses images of possible lifestyles, the goals that consumers can reach through a purchase. Goal-focused design means designing specifically and consciously to let users achieve their goals.

Level: Introductory through Advanced

About Face 2.0 is required reading for anyone interested in interaction design, goal-directed design, and the design of user interfaces. It was written by Alan Cooper, one of my heroes and the founder of the discipline of interaction design, with Rob Reimann, another expert communicator who has worked with Cooper on interaction design. It is the second major release of the basic text on creating clear and usable software, and its principles and examples apply to web pages and web applications as well as desktop applications. This book will give you a solid grounding in all aspects of interaction design, the core discipline for addressing design problems through goal-based techniques and getting the user interface right. That does not mean it is a difficult book: it is very well written and easy to consume. I cannot recommend it highly enough.

Oct 20, 2010

Seo Addons for Drupal, Joomla and Wordpress

Drupal, Joomla and Wordpress are very popular and powerful content management systems. They are all written in PHP, support the MySQL DBMS and are built on a modular architecture that allows for extending them with additional functionality.

There are hundreds of free add-ons (modules, components, plugins) available for each of these three systems. Some of them are especially useful when it comes to optimizing your website for search engines.

Below you'll find a selection of such SEO-related add-ons for Drupal, Joomla and Wordpress, listed in alphabetical order, including tools for search engine friendly URLs, meta tag generation, social bookmarking services, performance tuning, user tracking and statistical analysis. Add-ons related to search engine friendly URLs usually require mod_rewrite to be enabled on your Apache web server.

Have fun playing around with these tools and bear in mind that none of them is a replacement for the most powerful SEO technique, which is creating valuable content.
Drupal Modules

1. Seo Friend is meant to be used alongside existing Drupal SEO modules to make them more effective. This module does not replace functionality available in the SEO Checklist and SEO Compliance Checker modules.
2. Seo Compliance Checker checks node content for search engine optimization upon its creation or modification. Whenever a publisher creates or modifies a node, the module performs a set of checks and gives the user feedback on compliance with the rules.
3. Seo Checklist provides a checklist of good Drupal SEO (Search Engine Optimization) best practices. Maximize the presence of your Drupal website in the major search engines like Google, Yahoo, Bing, etc.
4. Block Cache creates cached versions of each block displayed on your website. For each cached block only one SQL query is executed, reducing server load and resulting in better performance and faster delivery of content.
5. Find URL Alias is a utility module that lets you search for particular URL aliases. This module requires the core path module which enables search engine friendly URLs for your site. Find URL Alias is especially useful when there are many aliases stored in your database. But keep in mind that changing URLs is something you should avoid.
6. Google Analytics integrates Google’s powerful web statistics tracking system with your website. The module enables you to selectively track users by role.
7. Meta tags allows you to define site-wide meta tags and specific tags for each piece of content (node). If you use the core taxonomy module (you really should), meta tags can be assigned automatically using the terms (tags) you use to categorize your content.
8. Pathauto automatically generates path aliases based on the module's settings. A very powerful module that lets you define different URL patterns based on content types. The latest version also allows filtering of common words. Requires the core path module.
9. RobotsTxt is useful if you run multiple Drupal sites from a single code base and want to have different robots.txt files for each of them.
10. Service links automatically adds links to social bookmarking and blog search services to your content. You can select which services you want to link to, restrict the display based on content (node) types and whether to display links in teaser and/or full page view.
11. URLify automatically generates the path alias for a piece of content based on its title using JavaScript. Requires the core path module. A lightweight alternative to Pathauto.
12. XML Sitemap generates an XML sitemap which complies with the sitemaps.org specification. The relative priority of each piece of content is calculated based on content type, number of comments, and promotion to front page. The values of each of these factors can be set in the admin section of the module.

Joomla Modules, Components, Plugins

1. Content Search Seo Plugin is based on the official search content plugin and adds some SEO features. It generates the page title, meta description and meta keywords dynamically, according to the search keywords and search results.
2. SEO Canonicalisation Plugin enables site ‘canonicalisation’: when a user hits your site through a domain other than your preferred one, they are redirected to the preferred domain.
3. SeoSimple simply takes the starting chunk of text from the page’s content and applies that as the value for the meta description tag in the page’s head.
4. Seo Generator automatically generates keywords and description by pulling text from the title and/or the content to help with SEO.
5. Seo Keyword Link is very useful for improving the SEO of internal pages on your site. You can define a limit on the number of links per keyword, choose NoFollow or DoFollow, and open links in the same window or a new window.
6. Advanced SEF Bot for Joomla 1.1.x is a plugin that enables search engine friendly URLs. Since Joomla’s standard URLs are not really meaningful this one is a must.
7. Dynamic gSitemap is a PHP script that dynamically creates an XML sitemap of your site when GoogleBot visits it.
8. JoomSEO is a plugin that dynamically creates meta tags, changes the title tag on the fly, adds heading tags to content titles and more. Some of the configurable features include: show, hide or override keywords, site name, and content title, adjust element order in the title and heading tag selection.
9. LinX is a link exchange component that lets you manage a reciprocal link directory. Links that are submitted to your site are automatically checked and only added if there is a link back to your site. A large number of links to your website of course has an effect on the PageRank but what really matters are links from quality and trusted websites that have a similar target audience as your site.
10. MetaTags NX creates meta description and keyword tags for your content on the fly. Keywords are generated based on their frequency in the content and stopwords can be excluded.
11. Redirect component lets you redirect old URLs to new ones and set their status codes. Remember, changing URLs often is a bad idea.
12. Website Validators Tool contains links to validation and site information services that help you make your website more standards compliant, see how you rank, see who links to you, and more.

Wordpress Plugins

1. All in One SEO Pack Optimizes your Wordpress blog for Search Engines.
2. Seo Title Tag optimizes the title tags across your WordPress-powered blog or website. Not just your posts, not just your home page, but any and every title tag on your site.
3. Seo Slugs removes common words like ‘a’, ‘the’, ‘in’ from post slugs to improve search engine optimization.
4. HeadSpace2 SEO is an all-in-one meta-data manager that allows you to fine-tune the SEO potential of your site.
5. Seo Friendly Images is a Wordpress optimization plugin which automatically updates all images with proper ALT and TITLE attributes. If your images do not have ALT and TITLE set already, SEO Friendly Images will add them according to the options you set. Additionally, this helps make the post W3C/xHTML valid as well.
6. Seo Post Link makes your post link short and SEO friendly. It removes common words that are unnecessary for search engine optimization of your blog post link.
7. Seo Smart Links provides automatic SEO benefits for your site in addition to custom keyword lists, nofollow and much more.
8. Platinum Seo Pack Optimizes your Wordpress blog for Search Engines (Search Engine Optimization).
9. Simple Submit Seo / Social Bookmarking Plugin is for adding submission links for Digg, Delicious, Buzz, and Stumble to pages and posts.
10. Automatic Seo Links saves you from placing your links manually: just choose a word and a URL, and this plugin will replace all matches in the posts of your blog.
11. Seo for Paged Comments reduces SEO problems when using WordPress’s paged comments.
12. Seo No Duplicate helps you easily tell the search engine bots the preferred version of a page by specifying the canonical properly within your head tag.
13. 404 Seo Plugin gives you a customized, smart ‘Page Not Found(404)’ error message. It will automatically display links to relevant pages on your site, based on the words in the URL that was not found.
14. Add to Any adds links to a large number of social bookmarking sites to your posts.
15. GeneralStats is a statistics component that counts the number of users, categories, posts, comments, pages, words in posts, words in comments and words in pages. Useful for doing keyword research.
16. Google Sitemap Generator creates an XML sitemap of your website. In the current version homepage, posts, static pages, categories and archives are supported. Priority is automatically assigned based on the number of comments.
17. Gregarious is a social bookmarking plugin for Digg, Reddit and Feedburner with update checks via AJAX.
18. Popularity Contest is a counter for posts, categories, archive views, comments, trackbacks, etc. to determine the most popular pages of your site.
19. Technorati Tagging Plugin adds Technorati tags to your posts and enables you to display a tag cloud.
20. WP-Cache is a page caching system that improves your website's performance. Cached pages are stored as static files, reducing server load and thus making your site faster and more responsive.
21. X-Valid attempts to convert posts and comments to valid XHTML. Read more on the benefits of Web Standards Compliance.
22. Search Engine Query tries to reduce the bounce rate of your blog and provide visitors with a better navigation experience.

If you know other free SEO related add-ons for Drupal, Joomla and Wordpress that should be mentioned here simply post links to them in your comments. I’d like to read about your experiences using these add-ons.

All of this post's content comes from seoaddons, so thanks for that.

How to Create Amazing Backlinks

In post after post, frustrated link builders say they have tried everything to build quality backlinks: free directories, profile links, article submissions, comment spam, etc. But nothing works. Despite all their low-value work, they cannot jump over their competitors in the SERPs, so they plead with others to share their secrets for building incredible, amazing, super-fantastic backlinks. The problem is that these secrets do not exist. There are no magic shortcuts, no highly classified insider hacks for obtaining quality links to your website.

Building backlinks requires great effort, but that is the last thing the forum bums want to hear. They want a quick solution, and creating something of value is much harder than slapping a signature onto a forum post. So the message often falls on deaf ears.



The first step in getting backlinks is to discover the wonderful places that already have great backlinks and link patterns. The logic is that good content attracts links: site owners feel compelled to share it with their audience. So in this first step you find pages with lots of links, because those pages have demonstrably got the content-to-links relationship right. To see which blog posts have attracted incoming links, follow this process. Note: for this entire blog entry, we will use a hypothetical example of Jim's Pet Shop, an online store for pets that is looking to attract links, traffic and attention for its line of dog toys.

Oct 6, 2010

What Is the Meta Robots Tag?

Have you ever wondered what the robots tag in your website is for? Maybe you're using Wordpress and you stumble upon a certain unfamiliar tag that says <meta name="robots" content="index">. What the heck is it!? Is it a robot that automates your meta tags? Is it a piece of magical SEO tag? Does it summon the Google robot to your page?

The meta robots tag is a tag that tells search engines what to follow and what not to follow. It is a piece of code in the <head> section of your webpage. It's a simple code that gives you the power to decide which pages you want to hide from search engine crawlers and which pages you want them to index and look at.



Another function of the meta robots tag is that it tells search engine crawlers which links to follow and which links to stop at. When you have a lot of links going out of your website, you should know that you lose some Google juice, and as a result your PageRank can drop. So what you may want to do is keep that juice to yourself for some of the links, and tell the search engine crawlers not to follow the links going out of your site, because in following them they also take some of your Google juice with them.

If you don't have a meta robots tag, though, don't panic. By default, the search engine crawlers WILL index your site and WILL follow links. Let me make it clear that search engine crawlers following your links is not bad at all. Losing some of your juice won't affect your site much in exchange for getting the attention of the websites you're linking out to. In fact, I don't recommend using nofollow at all unless you have a lot of outbound links.
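
For reference, nofollow on a single outbound link is just a rel attribute on the anchor itself (the URL here is purely a placeholder):

<a href="http://example.com/" rel="nofollow">some other site</a>

The meta robots NOFOLLOW command listed below applies to every link on the page at once, while this per-link form lets you pick and choose.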

Basically, the meta robots tag boils down to four main commands for the search engine crawlers:

* FOLLOW – a command for the search engine crawler to follow the links in that webpage
* INDEX – a command for the search engine crawler to index that webpage
* NOFOLLOW – a command for the search engine crawler NOT to follow the links in that webpage
* NOINDEX – a command for the search engine crawler NOT to index that webpage

Pretty simple isn’t it? Now you’re telling yourself “Heck is that all? I thought it was some crazy program that takes years and years to study.”

Well, there are some more commands for the meta robots tag, but these four are the MAIN functions. These four are what the meta robots tag is mostly used for.

If you ask me, the meta robots tag is one of those little things in your site's SEO that you can use to control your Google juice. I personally don't use noindex, but I do sometimes use nofollow. Don't ask why. It's personal. Haha!

An example of a meta robots tag code would look like this:

"meta name =”robots” content=”index”"

What this tag does is tell crawlers to index the webpage it is on. It's like telling someone who's going to get a glass of water to get a glass of water, because, again, by default search engines already index your site even if you don't use this code.

And you can also combine the commands if you so desire:

"meta name=”robots” content=”noindex,nofollow”"

For me, this code is a good thing to keep in mind – especially if you’re trying to save up Google juice by applying nofollow to your outbound links. Other than that, it’s not something you’d want to keep on checking when you’re optimizing your on-site SEO.

Tips for Keeps: We all want to know all the little things about SEO. This is something that might help in the future so try to remember it. This code isn’t developed for nothing. The most skilled SEOs know how to use this best.

Oct 4, 2010

How to Get More Links to Your Website


1. Blog Comments – find blogs that are relevant to your industry and business and post appropriate comments with a link back to your website.

2. Directories – there are gazillions of websites that simply list the URLs of other websites (rather like the Yellow Pages, but online). Submit a link to your site on the ones that have good PageRank, and avoid the ones that ask for payment or reciprocal links.

3. Articles – if you blog regularly or write articles then use that great content and submit your articles to online articles sites. Again, choose articles sites that have a high ranking like Ezine articles.

4. Events – if your business is a service that regularly hosts events then list your events and link back to your site in so doing. Read this blog post for more.

5. Social Networks – use your social networking presence to link back to your site. These links count and are really easy to create.

6. Relationships – ask partners, clients and organisations you are a member of to place a link on their site back to yours.

7. Bookmarking – sites like Delicious or Stumble Upon allow you to submit sites that you like. Submit your own (carefully) to build valuable links back to your site. Read this article for more.

8. Press Releases - if your company regularly creates promotional pieces or press releases, use free PR release sites: submit your press releases to these sites with a link back to your website. Here’s an article all about how to go about doing so.

9. Content - make sure that the content on your website or blog is great. The better it is the more likely people will find it and link back to it.

10. Advertising - consider advertising online: Craigslist can be a valuable way to do this – not only does it bring visitors to your site who might link to your content (see 9 above), but it also gives you the ability to link back to your site from another high-ranking site.

Oct 3, 2010

What Is a Broken Link?

You are visiting a website that you have just discovered. The information contained in its pages is just what you need for a paper you are writing. While reading a page you come across a note saying that a very important piece of information, the very one you actually need and that would make your paper insightful, is on another page. You click the link, but nothing! The link is dead, and there is no other way for you to reach the page you need. The frustration mounts so much that you leave the website and vow never to go back.

This is the kind of scenario that usually surrounds a website whose management has grown lax and where broken links have become rampant. Among the many bad habits or mistakes a website can commit, having broken links is one of the most serious. Broken links bring with them many negative effects that may leave your online business or website reeling, and these effects are very hard to correct. Search engine bots are stopped dead by broken links, because they think they have reached the end of the line. And, as the scenario above illustrates, visitors are turned off by a website with dead links: they assume the information is not available when in fact the data is there but has become inaccessible because of an error in the code.

A website that is filled with broken links suffers a lot in terms of damaged reputation. The website will be seen as unprofessional, and many may even think it is a shady operation. The same goes for the website owner, who may be seen as a person with a dubious reputation. These hits to your reputation alone can have very damaging effects on a business, and we are not even counting the many visitors who will never return because they were turned off by the broken links.

Webmasters and website owners should take it upon themselves to "clean house" diligently and regularly. All of the links should be actively checked if there are any dead or broken hyperlinks. It is just part of regular housekeeping. If you always keep your house clean then so should you with your website.

Unfortunately, many webmasters and website owners complain about the task of looking for broken links. They claim that it takes a lot of time to check each and every link to see if it is working or not. With the many responsibilities that webmasters and website owners have to juggle, it is quite understandable that they would not prioritize checking broken links in their "to do" lists.

Fortunately, there are companies that also realize that checking for broken links can be quite a tedious task and have devised different methods in order to make this task easier to do and consume less time.

For example, xml-sitemaps.com has programmed a standalone script that not only creates sitemaps but also looks for broken links in a website and then informs webmasters or website owners what links they are and to which pages the links are associated with. This automation of the task of checking broken links is a great time saver for webmasters and website owners.

Sep 24, 2010

Use of Backlinks


Backlinks enable you to keep track of other pages on the web that link to your posts. For instance, suppose Alice writes a blog entry that Bob finds interesting. Bob then goes to his own blog and writes a post of his own about it, linking back to Alice's original post. Now Alice's post will automatically show that Bob has linked to it, and it will provide a short snippet of his text and a link to his post. What it all works out to is a way of expanding the comment feature such that related discussions on other sites can be included along with the regular comments on a post.

The backlinks setting can be found under the Settings | Comments tab, and consists of a single, simple option to turn it on or off.

Our default templates are already set up with the necessary code for backlinks. However, if you have a custom template, or one of our templates from before this feature was launched, you will need to add the code yourself. Instructions for that are here.

Sep 21, 2010

List of SEO Tools



This list of tools is very useful for you. Take a look....

Adsense Calculator
Google AdSense affiliates can use this calculator to gain a better understanding of what affects their earnings by experimenting with values.

AdSense Preview
This preview utility will give you a sense of which ads would be placed on a given page.

Advanced Meta Tag Generator
This tool helps you to add meta tags to your site.

Alexa Rank Comparison Tool
The Alexa Traffic History Graph allows you to create a traffic history graph for any site.

Check Server Headers
Check your server to make sure the proper HTTP Status Codes (200, 301, 302, 304, 307, 404, 410) are being returned in the server headers.

Class C Checker

This Class C Checker tool allows you to find out whether some sites are hosted on the same Class C IP Range.

Code to Text Ratio
This tool will help you discover the percentage of text in a web page (as compared to the combined text and code).

CPM Calculator
This calculator measures the ROI (return on investment) of the CPM (cost per thousand) impressions advertising model.

Domain Age Check
Use this tool to find the age of a domain.

Domain Typo Generator

Enter a domain name into the box, and this tool will generate a list of suggestions of likely human misspellings and typos for the given domain.

Future PageRank
This tool will query Google's various data centers to check for any changes in PageRank values for a given URL.

Google Dance
This tool will query Google's three main web servers, showing the different statuses during updates. Watch Google Dance.

Google Keyword Suggestions

The keyword suggestion tool for Google will help you choose relevant and popular terms related to your selected key term.

Google Search for Multiple Datacenter
This tool searches your keyword/phrase in different Google data centers.

Google Suggest Scraper
Shows frequently searched-for phrases starting with the words and letters in your query.

Google vs Yahoo
This tool will run a search query in Google and Yahoo search engines and then graphically compare the results.

Indexed Pages
This tool will return the total number of indexed pages for a URL from the major search engines (Google, Yahoo, MSN, AltaVista, and AllTheWeb).

Keyword Cloud

This tool provides a visual representation of keywords used on a website.

Keyword Density
This tool will analyze your chosen URL, and return a table of keyword density values for one-, two-, or three-word key terms.

Keyword Difficulty Check
Use the Keyword Difficulty Check Tool to see how difficult it would be to rank for specific keywords or keyword phrases.

Keyword Optimizer
Enter a list of keywords and this tool will remove any duplicate entries and re-order the list alphabetically.


Keyword Position Check for Multiple Datacenter
This tool will help you to find the position of your site in Google for a specific keyword/phrase.

Keyword Typo Generator
Enter a keyword or key term into the box above and this tool will generate a list of suggestions for likely human misspellings and typos.

Link Popularity
Enter a valid URL, and this tool will query all the major search engines (Google, Yahoo, MSN, and Teoma) and then return that URL's total link count for each one.

Link Price Calculator
This tool can help you determine the approximate amount you should be paying per month for a text link.

Meta Analyzer
This tool will analyze a website's meta tags. Analyzing a competitor's keyword and description meta values is a good way to find ideas for key terms and more effective copy for your site.

Meta Tag Generator
If you're new to web development and search engine optimization, you may find this tool useful to ensure that your meta tags are correctly formed.

Multiple Datacenter Link Popularity Check
This tool will give a back link count for a URL from multiple Google data centers.

Page Comparison
Compares the page titles, meta information, and common phrases occurring on different pages.

Page Size
This tool will help you to determine HTML web page size.

PageRank Lookup
This tool streamlines the process of checking PageRank for your sites. Enter a list of URLs and it will return the PageRank value for each one.

PageRank Search
Enter your Google search here, and our tool will search Google and display the PageRank next to each resulting answer.

Robots.txt Generator
Use this tool to generate a simple robots.txt file for your website, which allows you to hide files or directories that you don't wish to be spidered.

ROI Calculator
This calculator measures the ROI (return on investment) of a CPC (cost per click) campaign.

Search Engine Comparison
This tool allows you to perform your own comparisons, and displays the results visually, making it easy to see both the rankings and comparative positions of pages in search engine results.

Search Engine Keyword Position
This tool checks the search engine result pages of Google, Yahoo, and MSN to see what position your site holds for a particular keyword phrase.

Site Link Analyzer
This tool will analyze a given web page and return a table of data containing columns of outbound links and their associated anchor text.

Spider Simulator
This tool simulates a search engine spider by displaying the contents of a web page in exactly the way the spider would see it.

URL Redirect Check
This tool checks for valid HTTP 301 headers.

URL Rewriting
This tool converts dynamic URLs to static URLs. You will need to create an .htaccess file to use this tool.

Sep 17, 2010

SEO FRIENDLY WEBSITE DESIGN TIPS

1. Avoid placing the main menu in the left-hand column of a website. If that is unavoidable, an alternative is to place some keyword-rich text at the top of the page or above the left-hand menu, so that this text is the first thing search engines read.


2. Use the Title Tag: the title is the most important element of a web page and should be written with care. It must include the page's most important keywords. We often see pages whose title tag contains only the company name, which is wrong from an SEO point of view. Titles are the first thing search engines crawl, so include the keywords you want to rank for. A quick way to inspect these tags is sketched after tip 3.


3. Meta Tags: meta tags include the meta keywords and the meta description. The meta keywords tag can hold all the targeted keywords and key phrases of the page, and the meta description tag holds a brief summary of the web page.
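
As a quick way to review tips 2 and 3 together, here is a minimal sketch that fetches a page and prints its title and meta tags. It assumes the third-party requests and beautifulsoup4 packages, and the URL is only a placeholder.

    # Sketch: print a page's title tag and meta keywords/description.
    # Assumes `pip install requests beautifulsoup4`; the URL is a placeholder.
    import requests
    from bs4 import BeautifulSoup

    html = requests.get("http://www.example.com/").text
    soup = BeautifulSoup(html, "html.parser")

    print("Title:", soup.title.string if soup.title else "(missing!)")

    for name in ("keywords", "description"):
        tag = soup.find("meta", attrs={"name": name})
        print(name + ":", tag["content"] if tag and tag.has_attr("content") else "(missing!)")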


4. Think twice about how you use graphics. Make them relevant to your content, and give each one an alt attribute with relevant keywords, both for search engines to read (they cannot read graphics) and for your visitors, so they have something to read while waiting for the graphics to load.


5. Do not use only images to link out. Always use text links to link to the vital content on your website. Spiders can follow image links, but they like text links more.


6. Avoid using frames. Some search engines cannot spider framed pages at all, and even the engines that can may have problems spidering them and sometimes fail to index the page.


7. Avoid overly complex tables when laying out your page; keep them simple for the spiders. Some engines find it hard to navigate through to the other pages on your website if the navigation bar is too complicated.


8. Heading Tags: every web page should contain one main heading, written in the HTML H1 tag and holding the page's most targeted keyword. Less important or secondary keywords can then be written in H2 and H3 tags. The sketch after tip 9 shows how to audit headings and image alt text together.


9. Image Optimization: to optimize images, use the alt attribute. Write important keywords or key phrases inside the alt text to describe the image. Search engines cannot read the image itself, but a good alt text describes it to them, so they can effectively read the image through its alt text.
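
Here is a minimal sketch auditing tips 8 and 9: it lists a page's headings and flags images with missing alt text. It assumes the same requests and beautifulsoup4 packages as the earlier sketch, again with a placeholder URL.

    # Sketch: list h1-h3 headings and flag images without alt text.
    import requests
    from bs4 import BeautifulSoup

    soup = BeautifulSoup(requests.get("http://www.example.com/").text, "html.parser")

    for h in soup.find_all(["h1", "h2", "h3"]):
        print(h.name.upper() + ":", h.get_text(strip=True))

    for img in soup.find_all("img"):
        if not img.get("alt"):
            print("Missing alt text:", img.get("src"))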


10. Sitemap: a sitemap is very important, especially for a dynamic site or a site with thousands (or hundreds of thousands) of pages. Sites with a large number of pages and dynamic sites cannot be crawled easily by search engines. An XML sitemap overcomes this problem: it helps search engines crawl all the web pages of a website very easily. A minimal generator is sketched below.
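
As an illustration, this sketch generates a bare-bones XML sitemap with Python's standard library. The URLs are placeholders, and a real generator would usually add fields such as lastmod and changefreq.

    # Sketch: generate a bare-bones XML sitemap using only the standard library.
    import xml.etree.ElementTree as ET

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"  # the standard sitemap namespace
    urlset = ET.Element("urlset", xmlns=NS)

    for page in ["http://www.example.com/", "http://www.example.com/dogs.html"]:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)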

11. Headlines are rated as more important than the rest of the web page by search engines. To take advantage of this, put your keywords in the page headline. Since the default header tag (H1) renders quite large, you can style it to make it smaller.

12. Every page should contain "title" and "description" tags with excellent keywords describing the page content. Keep the title to no more than 9 words and the description to no more than 20 words to stay within the limits of most search engines.

13. Avoid Flash when possible. To date, search engines cannot read Flash; it also slows page loading and makes people leave. If you really have a reason to use Flash, keep it small (e.g. as a Flash header) and leave the rest of the page for keyword-rich content.


14. Use external Cascading Style Sheets and JavaScript files to reduce page size and make the download time much quicker. This lets the spider index your web page faster and can help your ranking.


15. Use standard HTML.




Sep 13, 2010

What You Must Know About the Google Algorithm



PageRank is a probability distribution used to represent the likelihood that a person randomly clicking on links will arrive at any particular page. PageRank can be calculated for collections of documents of any size, and understanding the algorithm helps with your SEO work. Several research papers assume that at the beginning of the computational process the distribution is evenly divided among all documents in the collection. The PageRank computation requires several passes through the collection, called "iterations", to adjust the approximate PageRank values so that they more closely reflect the theoretical true value.
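
To make the iterations concrete, here is a minimal sketch of the computation on a toy three-page link graph. The evenly divided starting values follow the description above; the damping factor of 0.85 is the value suggested in the original PageRank paper.

    # Minimal iterative PageRank sketch on a toy link graph.
    # links[page] = the pages that `page` links out to.
    links = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
    }

    d = 0.85                                # damping factor
    n = len(links)
    pr = {page: 1.0 / n for page in links}  # even starting distribution

    for iteration in range(20):             # several passes ("iterations")
        new_pr = {}
        for page in links:
            # Rank flowing in from every page that links to `page`,
            # divided by each linking page's number of outbound links.
            incoming = sum(pr[src] / len(outs)
                           for src, outs in links.items() if page in outs)
            new_pr[page] = (1 - d) / n + d * incoming
        pr = new_pr

    print(pr)  # the values converge toward the theoretical PageRank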

Sep 8, 2010

What is Black Hat SEO???




Many search engine operators, such as Google, MSN, and Yahoo, publish policies and guidelines which document what they feel are appropriate and inappropriate SEO techniques.

Black hat SEO "exploits weaknesses in the algorithm by breaking the webmaster guidelines". It is primarily about "cheating" and "manipulating" search engines, and about "misrepresenting content to search engines".



Here are some techniques that identify black hat SEO:

Keyword stuffing: cramming a large number of keywords into your title tag or page copy marks you as a black hat SEO.
Invisible text: every word on your website must be visible. Hiding text from visitors while showing it to spiders is a classic black hat trick, and it matters for indexing.
Doorway pages: a doorway page is basically a "fake" page that the user will never see. It exists purely for search engine spiders and attempts to trick them into indexing the site higher.
Spamming the search engines: do not spam search engines with mass postings and submissions.


Advantages of WHITE HAT SEO:

1. White hat SEO refers to legitimate and proper ways of increasing traffic.
2. White hat SEO has proved very effective in the long run.
3. White hat SEO always focuses on the quality of web content.
4. White hat SEO includes regular tracking of visitors and keeps you prepared for any sort of challenge.
5. The most popular white hat SEO techniques are original and relevant content creation, article submission, directory submission, requesting site feedback, building back links, business blogs, site maps, RSS feeds, forum contributions, and so on.

Disadvantages of using Black Hat SEO:

We discuss the disadvantages first because of the very bad results black hat SEO can lead to:

* Once search engines identify that you are using black hat SEO, your website will immediately be dropped from the search engine.

* The PageRank of the website will be dropped to 0.

* The search engine may remove the site from its indices entirely (see Wikipedia).

Sep 6, 2010

SEO Friendly Website Design



I decided to write this post because it is clear that some clients, no matter how clear you aim to be, think that buying a website from Hobo, because we are a search engine optimization company, means a lifetime supply of free traffic from Google.
We've compiled a list of the most important things an SEO friendly website design needs in order to have a solid foundation for its search engine optimization campaign. Whether you're an amateur web designer attempting to integrate SEO into your website or a professional web designer looking to refine your approach to creating SEO friendly website design, this guide has something for everyone.

We have a few suggestions for you that will really make your website SEO friendly:

" The foundation of the foundation: keyword research
" Researched keywords in anchor text (the text that contains a hyperlink)
" Don't use Flash (and how to survive if you absolutely must use it)
" Don't use frames (and how to survive if you absolutely must use them)
" Researched keywords in andlt;titleandgt; tag
" CSS drop down navigation (no Javascript or Flash); CSS stylesheets
" Density of researched keywords in document text; don't hide text
" Researched keywords in headings andlt;H1andgt;, andlt;H2andgt;, andlt;H3andgt; etc.
" Images (including researched keywords in 'alt' tags, no text in images)
" Researched keywords in URL (filenames and folders)

Sep 5, 2010

Twitter Moves to OAuth: The OAuthcalypse Is Nigh




Twitter is killing support for basic user authentication in third-party apps on Tuesday morning, the company says. Instead, Twitter will now require all third-party app developers to use OAuth for user authentication.
Some bloggers have given the event the catchy name, “OAuthcalypse” — a bit of a mouthful, but so is “user authentication protocol” — the implication being that when basic authentication is switched off, it will break old software and leave users in the dark. But since Twitter has given developers ample warning of the change, the switch will only lock out a small number of apps.
This is a planned move Twitter first announced in December, and the company has posted a help page on its developer site with some resources meant to ease the transition to OAuth.
The Twitter API team has been dialing down the number of requests an app can make using the basic authorization method. That number will hit zero at 8AM Pacific time Tuesday.
The only disadvantage is that old apps that haven’t updated to use OAuth will stop working this week. All of the popular ones (Seesmic, Tweetdeck, etc.) have already updated.
Twitter has been recommending developers use OAuth as an authentication method for some time.
Almost all of the biggest social services, including Facebook and Yahoo, use OAuth to connect their social services together and to let users share photos, status updates and links in multiple places.
Twitter's move mirrors a broader trend on the social web, where basic authentication is being ditched for the more secure OAuth when services and applications connect users' accounts.
In basic authentication, a website or app will say, “Hey, do you want to share whatever you’re doing here with your friends on Twitter? Give me your Twitter username and password and I’ll hook up your accounts.” By passing along your info, you’re giving that app or website unlimited access to everything in your Twitter account. Pretty dangerous, and not secure.
In OAuth authentication, the website or app will send you to Twitter where you sign yourself in, then Twitter will tell the website or app “Yeah, they are who they say they are.” The website or app only gains the ability to do certain things with your account — post, read, reply, search — while staying locked out from the more sensitive stuff.
The biggest advantage of OAuth is you don’t have to tell your Twitter password to anyone other than Twitter. Also, OAuth connections are token-based, so once a connection is established, you can change your Twitter password without having to re-enter it into the website or app.
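
For developers, here is a minimal sketch of an OAuth-signed request to Twitter. It assumes the third-party requests and requests-oauthlib packages; the credentials are placeholders you would obtain from Twitter's developer site, and the endpoint URL is only illustrative.

    # Sketch: make an OAuth 1.0a signed request to the Twitter API.
    # pip install requests requests-oauthlib; all credentials are placeholders.
    import requests
    from requests_oauthlib import OAuth1

    auth = OAuth1(
        client_key="CONSUMER_KEY",
        client_secret="CONSUMER_SECRET",
        resource_owner_key="ACCESS_TOKEN",
        resource_owner_secret="ACCESS_TOKEN_SECRET",
    )

    # The app never sees the user's password: requests are signed with tokens,
    # so the user can change their Twitter password without breaking the app.
    resp = requests.get("https://api.twitter.com/1/statuses/home_timeline.json",
                        auth=auth)
    print(resp.status_code)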
In fact, Facebook’s new Like buttons and its Social Graph API, launched in April, use the newer OAuth 2.0 to handle user authentication.
OAuth 2.0 is a simplified version of OAuth. Twitter plans to eventually move to OAuth 2.0 for its entire platform, and Tuesday’s switch is part of that broader transition.
Twitter was originally going to move to OAuth in June, but the transition was delayed because of the increased volume of tweets around the World Cup.




Sep 3, 2010

Best SEO Tips for Updating Websites



Kindly find the SEO tips below. Enjoy!



• Proper Meta Data – Relevant, Organized Title, Description, Keywords & Robots
• H1 Heading – Relevance & H2-H5 Support
• In-Content Keyword Placement – Density, Prominence, 1st Sentence
• On-Page Keyword Placement – 1st Part of Body, IDs, Last 10 Words, Site-Wide – Refer to above.
• Image Optimization – Img Folder, Img Name, Alt Attribute
• URL Structuring / Page Names – Site Structure, URL Names, Folder Locations
• Internal Linking & Navigation – nofollow, JavaScript, Navigation, Anchor Text, title Attribute, Site-Wide
• Google Webmaster Tools Suggestions – Verify, HTML Suggestions, Crawl Errors, Broken Links
• 301 Redirecting & Canonicalization – Server Type, www/non-www, Old Pages (see the sketch after this list)
• 404 Header Check & Optimization
• Code-to-Text Ratio – Trim Excessive Coding
• HTML Validation & CSS Optimization
• Add Blog/RSS Feed
• HTML Site-Mapping – Sitemap Links, Structure, Pages
• XML Site-Mapping & Submissions – XML Blog Feed, Sitemaps, Submit to Google & Yahoo!
• Robots.txt – Follow/NoFollow, Reference XML Sitemap
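
As referenced in the 301 item above, here is a minimal sketch that checks the www/non-www canonicalization redirect and fetches the robots.txt file. It assumes the third-party requests package; example.com is a placeholder domain.

    # Sketch: check the non-www -> www 301 redirect and fetch robots.txt.
    import requests

    resp = requests.get("http://example.com/", allow_redirects=False)
    print(resp.status_code)               # expect 301 to the canonical host
    print(resp.headers.get("Location"))   # e.g. http://www.example.com/

    robots = requests.get("http://www.example.com/robots.txt")
    print(robots.text)                    # confirm rules and the XML sitemap reference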

Sep 1, 2010

What Is SEO: An Introduction




Search Engine Optimization (SEO) is often considered the more technical part of Web marketing. This is true because SEO does help in the promotion of sites, and at the same time it requires some technical knowledge – at least familiarity with basic HTML. SEO is sometimes also called SEO copywriting, because most of the techniques used to promote sites in search engines deal with text. Generally, SEO can be defined as the activity of optimizing Web pages or whole sites in order to make them more search engine-friendly, thus getting higher positions in search results.

Although SEO helps to increase the traffic to one's site, SEO is not advertising. Of course, you can be included in paid search results for given keywords but basically the idea behind the SEO techniques is to get top placement because your site is relevant to a particular search term, not because you pay.

One of the basic truths in SEO is that even if you do everything that is necessary, this does not automatically guarantee you top ratings; but if you neglect the basic rules, this certainly will not go unnoticed. Also, if you set realistic goals – e.g. getting into the top 30 results in Google for a particular keyword, rather than being number one for 10 keywords in 5 search engines – you will feel happier and more satisfied with your results.

SEO can be a 30-minute job or a permanent activity. Sometimes it is enough to do some generic SEO in order to get high in search engines – for instance, if you are a leader for rare keywords, then you do not have a lot to do in order to get decent placement. But in most cases, if you really want to be at the top, you need to pay special attention to SEO and devote significant amounts of time and effort to it. Even if you plan to do some basic SEO, it is essential that you understand how search engines work and which items are most important in SEO.

Aug 30, 2010

All about Facebook




Facebook is a social networking website launched in February 2004 that is operated and privately owned by Facebook, Inc., with more than 500 million active users as of July 2010. Users can add people as friends and send them messages, and update their personal profiles to notify friends about themselves. Additionally, users can join networks organized by workplace, school, or college. The website's name stems from the colloquial name of books given to students at the start of the academic year by university administrations in the US with the intention of helping students to get to know each other better. Facebook allows anyone who declares themselves to be aged 13 or older to become a member of the website.

Facebook has met with some controversy. It has been blocked intermittently in several countries, including Pakistan, Syria, the People's Republic of China, Vietnam, and Iran. It has also been banned at many workplaces to discourage employees from wasting time using the service. Privacy has also been an issue, and it has been compromised several times. Facebook settled a lawsuit regarding claims over source code and intellectual property. The site has also been involved in controversy over the sale of fans and friends.

Facebook was founded by Mark Zuckerberg with his college roommates and fellow computer science students Eduardo Saverin, Dustin Moskovitz, and Chris Hughes. The website's membership was initially limited by the founders to Harvard students, but was expanded to other colleges in the Boston area, the Ivy League, and Stanford University. It gradually added support for students at various other universities before opening to high school students and, finally, to anyone aged 13 and over.

SEO Or PPC? Which One Is Right For You?

The tension started (this time) when an article appearing on DMNews.com questioned whether or not SEO techniques were a legitimate need for the prospective search engine marketer. Dave Pasternack of Did-It.com wrote the article. One of Pasternack's points is that SEO is a "fix-it-and-forget-it" exercise that doesn't need constant monitoring; an idea that left a number of SEO providers shocked and disappointed.
One such SEO provider, Greg Boser, took issue with Pasternack's article but kept quiet... that is, until another article appeared at ClickZ saying essentially the same thing as Pasternack's work. Kevin Lee, also of Did-It.com and SEMPO, wrote the second article. In his writing, Lee indirectly compared search optimizers to spammers, saying they took a scatter-shot approach to improving rankings. However, in Lee's defense, it is never clear whether he is talking about those who spam search engines or the SEO industry in general.