Saturday, May 12, 2012

Good And Bad SEO Practices

Search engine optimization (SEO) is a set of practices developed to help you gain high traffic and PageRank while simultaneously building a useful, functional website that responds to the needs of your potential users. While some website owners get their sites indexed and ranked by following these guidelines, others look for ways to climb up the search rankings by devising strategies to deceive search engine spiders.

Most search engines have their own specific rules about which SEO practices are legitimate and which are not. While they disagree on some borderline issues, they do share a set of basic principles.


Good SEO Practices

• creating logically-structured pages (see the sketch after this list)
• logical paragraph use
• including a site map
• providing clear links to each page
• using descriptive meta tags that are not overly long
• validating your code
• providing information-rich content that visitors would find useful
• creating your website content based on the users' needs as if search engines didn't exist
• using spell-check
• avoiding duplicate content
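
A minimal sketch of what such a logically-structured page could look like, with a descriptive (and not too long) title and meta description, clear headings and paragraphs, and plain links to other pages including a site map. The site name and content here are invented purely for illustration:

<!DOCTYPE html>
<html lang="en">
<head>
  <title>Oak Furniture Care - Cleaning and Oiling Tips</title>
  <meta name="description"
        content="Practical advice on cleaning, oiling and repairing handmade oak furniture.">
</head>
<body>
  <h1>Caring for Handmade Oak Furniture</h1>
  <p>Oak surfaces last for decades if they are cleaned and oiled
  regularly. The sections below cover the basics.</p>

  <h2>Cleaning</h2>
  <p>Wipe the surface with a slightly damp, lint-free cloth and
  dry it immediately.</p>

  <h2>Oiling</h2>
  <p>Apply a thin coat of furniture oil once or twice a year.</p>

  <!-- clear links to the rest of the site, including a site map -->
  <p><a href="/sitemap.html">Site map</a> |
     <a href="/contact.html">Contact</a></p>
</body>
</html>

Note that nothing here is written for the spider: the same markup is simply easy for both visitors and search engines to parse.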


Bad SEO Practices

• using invisible text and links to create false keyword density (see the sketch after this list)
• using non-compliant HTML to manipulate relevancy (e.g.: a title that doesn't have anything to do with the page content)
• using CSS to manipulate relevancy (e.g.: hiding elements and revealing them only by executing code)
• sending automated requests to search engines
• using doorway pages created just for search engines
• using machine-generated code to produce keyword-rich content
• cloaking - presenting different content to users than to search engine spiders
• linking to spammers and other bad websites
• participating in linking schemes to increase PageRank
• creating pages that install viruses and other badware
• using unauthorized software to submit pages, check rankings, etc.
• re-submitting the website without any reason
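
To make the first item concrete, here is the kind of hidden-text trick search engines treat as spam. The keywords and styles are invented for illustration, and this is shown only as an example of what NOT to do:

<!-- BAD: keyword-stuffed text hidden from visitors but readable by spiders -->
<div style="display:none">
  cheap flights cheap flights cheap hotels last minute flights cheap flights
</div>

<!-- BAD: white text on a white background, equally invisible to visitors -->
<p style="color:#ffffff; background-color:#ffffff">
  cheap flights cheap hotels cheap car rental cheap flights
</p>

Both variants create a keyword density the visitor never sees, which is exactly the mismatch that automated spam detection looks for.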


Borderline Issues

• inserting comments in the HTML code
• using invisible form elements to hold keyword values
• using HTML elements for keyword stacking (e.g.: image alts; see the sketch after this list)
• using machine-generated code for usability reasons (e.g.: in order to check for browser versions)
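
As a hypothetical illustration of the image-alt case, compare a genuinely descriptive alt text with a keyword-stacked one. The file name and keywords are invented; whether the second variant counts as abuse is exactly the kind of judgment call left to the search engine:

<!-- legitimate: the alt text simply describes the image -->
<img src="oak-table.jpg" alt="Handmade oak dining table">

<!-- borderline: the alt attribute is stacked with keywords instead -->
<img src="oak-table.jpg"
     alt="oak table cheap furniture oak furniture buy oak table furniture">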

These borderline techniques are controversial and ambiguous. Most of the time it is very difficult to tell whether they were used to trick the spiders or purely for usability purposes, and it is ultimately the search engine that has to make the call.

Major search engines are constantly fighting illicit techniques for getting websites indexed and inflating PageRank, and they keep developing new and improved automated solutions to detect them. They even encourage users to report any website that appears to abuse SEO guidelines. Using illegitimate ways to climb up the search rankings may therefore result in your website being banned: once it has been removed from the search engine's index, it will no longer appear in search results.

Google, like other search engines, is trying to foster a proactive attitude among website owners and to convince them that legitimate SEO practices are more efficient. One question you should always ask yourself when building new features and writing content is whether you would still do it if search engines didn't exist at all. This is the best way to drive traffic to your website and earn a better ranking.

Copyright © 2006 Andreas Obermueller
