The landscape of SEO is ever-changing, as search engines tweak their algorithms to serve their users the most relevant and authoritative content. It’s no surprise, therefore, that previously successful techniques cease to work effectively and need to be rethought. Then again, some SEO techniques were never a good idea in the first place.
Certain unscrupulous individuals have always used tricks and gimmicks to chase a higher page ranking than their sites deserve; such sites are generally known as “webspam.” Search engines know about webspam and will always try to combat it.
Here are the top SEO techniques to avoid. Some simply don’t work, while others will actively harm your page ranking and your brand.
1. Automated backlink building
This falls into the “never a good idea to begin with” category. Automated backlink building uses robots (small autonomous programs) that crawl across the Web, dropping links to your site into comment boxes, forums, and other places that allow public posting without registration. Some automated backlink builders generate fake Web content as well, placing links back to your page in machine-generated articles. Modern search engines can easily detect these tricks and will actively penalize your site if you use them.
2. Keyword stuffing
This involves attempting to artificially boost your page rank for certain keywords by cramming them into your site in a haphazard manner, with no regard for readability. This technique has been declining in effectiveness since the turn of the century and is now actively toxic to your search performance. Identify relevant, useful keywords and optimize for those, including them in moderation and always with regard for the needs of a human reader.
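To make the contrast concrete, here is a hypothetical illustration (the keyword “running shoes” and the copy are invented for this example): a stuffed passage next to a natural one.

```html
<!-- Keyword-stuffed: unreadable to humans, a red flag to search engines -->
<p>Buy running shoes. Our running shoes are the best running shoes.
Running shoes for sale. Cheap running shoes, discount running shoes.</p>

<!-- Natural: the keyword appears once, where it genuinely belongs -->
<p>Our running shoes are designed for comfort over long distances,
with cushioned soles and breathable uppers.</p>
```

The second version still targets the keyword, but it does so while serving the human reader first.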
3. Abusing metadata
Many years ago, search engines used to look at a site’s metadata and uncritically factor it into their evaluation of a page’s subject and its ranking. As soon as early “black hat” SEOs realized this, they began to cram every keyword and possible search term they could think of into a page’s metadata in the hopes of securing an artificially high ranking. The upshot of this was that search engines began to disregard large chunks of websites’ metadata. Today, metadata stuffing is used by search engine algorithms as one of the red flags they look for in identifying poor-quality websites.
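As a sketch of what this abuse looks like in practice (the tags and content below are invented for illustration), compare a stuffed keywords tag with an honest description tag:

```html
<!-- Metadata stuffing: a grab-bag of every term the page might rank for -->
<meta name="keywords" content="shoes, cheap shoes, buy shoes, free shoes,
best shoes, shoes online, running, marathon, fitness, weight loss, deals">

<!-- Honest metadata: a concise, accurate summary of the actual page -->
<meta name="description" content="Independent reviews of running shoes,
updated monthly by our testing team.">
```

The stuffed version is exactly the pattern search engines learned to disregard, and now treat as a quality signal against the page.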
Ultimately, the main reason that sites fall foul of search engines’ anti-webspam measures is that they look as if they’re employing “black hat” SEO techniques. The best way to avoid being mistaken for a webspammer is not to be one. Provide your visitors with quality content, engage in responsible promotion, and follow the guidelines for SEO best practice offered by the search companies themselves.