Learn how and why NOT to practice unethical or bad Search Engine Optimization

 

 

In the early days of search engines, building a website was a fairly organic process. You listed keywords that matched what your site was about, you made sure the various search bots could access all your pages, you kept things clean and simple, and you hoped for the best.

Today, everyone searches on Google. This means you ought to spend a lot of time thinking specifically about how to rank highly and broadly on Google. Unfortunately, this means you need to do Search Engine Optimization. I say "unfortunately" on purpose, because some websites focus too much on SEO instead of on the quality the visitor should receive, which is a great pity.

During those same early days, many site owners discovered how to manipulate the search engines into giving a higher ranking through artificial and unethical tricks. In those days the search engine algorithms were fairly transparent, and it was not difficult to 'fool' the engines. By doing so, these owners also misled visitors, because sites were shown in high-ranking positions they did not 'deserve'.

But as with the fight against computer viruses, search engines started to fight back (especially Google; remember the famous Florida update of November 2003) with specific algorithms, filters and penalties to stop spammy sites from being indexed. It became harder and harder to use so-called "black hat" (unethical) techniques.

Today your site will be penalized when search engines discover that it is using spam techniques. That means a ranking penalty or a complete ban from the index.

It is therefore amazing that many websites still use 'old' SEO spam techniques to chase a higher ranking: sometimes by accident, but often on purpose.

In this chapter you will find a list categorized by web design sections. Use the navigation on the left to jump to the different sections.

 

Keyword stuffing

Any artificially inflated keyword density is keyword stuffing, and with it you risk getting banned from search engines. If you repeat your keywords (or phrases) on a page so close together that the content becomes unreadable to the human eye, that is keyword stuffing.

Risk: a ranking penalty, or a ban from the index.
Solution: use the same keyword (or phrase) at most three or four times on one page and use synonyms; keep the content reading 'naturally' for human eyes.
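There is no official density threshold, but you can get a feel for how repetitive a page is with a simple word-frequency check. This sketch, and its sample text, are purely illustrative:

```python
import re

def keyword_density(text, keyword):
    """Fraction of all words on the page that are the given keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A blatantly stuffed snippet: 4 of its 10 words are "cheap"
page = "cheap shoes cheap shoes buy cheap shoes online cheap shoes"
print(f"{keyword_density(page, 'cheap'):.0%}")  # prints "40%"
```

A density that high would be unreadable in real prose; natural text rarely repeats one word in almost half of its positions.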


 

Keyword dilution

If you put too many of the same keywords (or phrases) in your <title>, page heading, subheadings and filename, and then repeat them many times on the same page, that is keyword dilution.

Risk: a ranking penalty, or a ban from the index.
Solution: give every main keyword its own page, use synonyms within a page, and keep the content reading 'naturally' for human eyes.


 

Duplicated content

When you have the same content on several pages of your site, this will not make your site look larger, because the duplicate content penalty (see: Google penalties) kicks in. In some cases (with less, or no, penalty risk) duplicated content involves pages that also appear on other sites, for example through article syndication. In that case you will not get penalized for the duplicated content, but the page on your own site where this content is published will rank low, unless you are the original author of the article.

Risk: a ranking penalty, or a ban from the index.
Solution: fill your site with unique content only.
 


 

Illegal content

When you use other people's copyrighted content without their permission, or content that promotes illegal activities, you can get banned from the search engines.

Risk: get banned from the index.
Solution: simply don't do this!


 

Invisible text

This is also mentioned in the Wrong Web design chapter.
Once again: do not use the same font color as the background color to hide stuffed keywords, and that includes so-called 'smart' CSS tricks.

Risk: get banned from the index.
Solution: simply don't do this!


 

Blocking Internet Archive may be a spam signal for Google

According to Matt Cutts (head of Google's webspam team), spammers often block archive.org from crawling or storing their pages. Blocking the Internet Archive is therefore a potential spam signal to search engines.


 

Dynamic URLs

A dynamic URL is a URL generated by the software you are using for your website, such as blog software, a CMS or web shop software.
Dynamic URLs have long names (mostly numbers) with lots of code and special characters, which makes them difficult to crawl.
Crawlers prefer static URLs, and for SEO reasons you should put one or two keywords in the filename, directly related to the topic of that page.

Risk: difficult crawling, and no keywords in the URL, which means a lower ranking (as part of the total page optimization).

Solution: you can use a tool to rewrite dynamic URLs into search-engine-friendly URLs. Much blog software (like WordPress) already has a built-in tool for this. Search for "rewrite dynamic URL" and you will find plenty of information on how to do it.
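As a sketch of what such a rewrite looks like on an Apache server with mod_rewrite enabled (the file name and parameter here are hypothetical, not from any specific software):

```apache
RewriteEngine On
# Serve the static-looking /widgets/42 from the real dynamic script,
# so crawlers and visitors only ever see the clean URL
RewriteRule ^widgets/([0-9]+)$ product.php?id=$1 [L,QSA]
```

Rules like this belong in the site's .htaccess file or server configuration; test them carefully, because a wrong pattern can make pages unreachable.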

 


Repetition of keywords in internal anchor text (particularly in footers) is troubling

Again, Matt Cutts noted that keyword usage in the anchor text of many internal links is seen as potentially manipulative (particularly in the footer of a website).
So use anchor text as 'guidance' for your users, not for search engine robots, and don't show a bunch of keyword links to other pages in your footer.


 

Bans in robots.txt

The main purpose of a robots.txt file is to tell crawlers what to crawl and what not to crawl on your website. If you accidentally tell a crawler not to crawl your site (or an important part of it), you understand that you will have a big problem.

Risk: if you make a mistake, your complete site, or part of it, will not be indexed.
Solution: always double-check the directives you put in the robots.txt file. Google's Webmaster Tools includes a checker that lets you test which URLs are restricted by your robots.txt file.
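For example, a minimal robots.txt that blocks only two directories looks like this (the directory names are hypothetical):

```text
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```

Beware of one-character mistakes: a bare "Disallow: /" blocks your entire site, while an empty "Disallow:" blocks nothing at all.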


 

Cloaking

Cloaking is an outright deceptive technique.
Crawlers see one page (highly optimized for a certain keyword, of course) while the visitor sees another version of the same page. The server decides whether the request comes from a crawler or a visitor based on the IP address and/or user-agent.
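To illustrate the mechanism only (never deploy this), user-agent cloaking boils down to a branch like the following; the signature strings are just examples:

```python
def serve_page(user_agent):
    """Cloaking in a nutshell: choose the content based on who is asking.
    Shown purely to illustrate the technique this section warns against."""
    crawler_signatures = ("Googlebot", "Bingbot", "Slurp")
    if any(sig in user_agent for sig in crawler_signatures):
        return "keyword-stuffed version served only to crawlers"
    return "normal version served to human visitors"
```

Search engines detect exactly this kind of branching, for example by crawling from unannounced IP addresses with a normal browser user-agent and comparing the two responses.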

Risk: get banned from the index.
Solution: simply don't do this!


 

Doorway pages

Doorway pages are designed primarily for search engines, not for human beings. These pages are highly optimized for a certain topic, and when a visitor clicks on a doorway page (from a search results page), he is redirected (automatically, e.g. via cloaking, or by clicking a link) to another page.


 

Wrong redirects

When not applied properly, redirects can hurt a lot.
A redirect means that when a visitor requests a page, he is sent on to another page. This technique was very popular when cloaking and doorway pages were still in fashion. Those two are less popular now, because search engines can filter out these techniques and you get penalized for using them. Still, some websites use this black hat technique, mostly via the meta refresh, JavaScript, PHP, ASP and other methods, to mislead the robots and the human eye. If you still use redirects for unethical SEO purposes, you risk being penalized or banned from the index.

But there are also legitimate reasons to redirect: for example, a domain that redirects to another domain, or a page that no longer exists pointing to its replacement. For these types of redirect it is important to use the correct technique. In fact there is one technique that is completely safe: the so-called 301 redirect. It is the most efficient and search-engine-friendly method of webpage redirection; the status code 301 is interpreted as "moved permanently".
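As a sketch, on an Apache server a 301 redirect can be configured in the .htaccess file with a single line (the paths and domain here are hypothetical):

```apache
# Permanently redirect an old page to its replacement
Redirect 301 /old-page.html https://www.example.com/new-page.html
```

The same effect can be achieved in PHP with header("Location: ...", true, 301), or in the server configuration itself; whichever you use, make sure the response status is really 301 and not a temporary 302.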


Check out correct techniques to do a redirect

 

 

Risk 1: when you use redirects for black hat SEO, you can get banned from the index.
Risk 2: even when you redirect for legitimate, search-engine-friendly reasons, test carefully before implementing it permanently. If the target page (the one you are redirecting to) doesn't exist, you will have a big problem.


Here is a tool to check a successful 301 redirect

 

 


 

Bad Linking

Many outgoing links

Google does not like pages that consist mainly of links or look like a single link list. Having many outgoing links does not bring you any ranking benefit and can even make your situation worse. For example, if you have a dedicated page of outgoing links, never name it "links.htm". You could also consider putting a meta NOINDEX or NOFOLLOW on that page. That way, a visitor can still reach the page through a link on your site, but search engines will not index the page itself or follow the links to all those external sites (and value or devalue them).
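The meta robots tag mentioned above goes in the <head> of the page; this is the standard syntax:

```html
<!-- Tell crawlers not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```

You can also use only one of the two directives, e.g. "noindex, follow", if you do want the links on the page to be followed.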

Risk: a lower page ranking.
Solution: select carefully the sites you link to, and make each link part of the content of a page. That way it looks natural, to the search engines AND the human eye, that the link adds value.

Single-pixel links or hidden links (foreground and background colors the same)

One of the 'old' techniques (but still used): when a link is only one pixel wide, it is invisible to the human eye. No one will click on it, and the link is obviously an attempt to manipulate search engines.

Risk: you can get banned
Solution: Don't do this!


Read Google's Link Guidelines

 

 


 

Bad neighbors

Any site that uses unethical SEO techniques can be considered a bad neighborhood. This means that if you link to one of those sites, you will be treated as one of them. As in real life: if you choose outlaws for friends, you are considered to be one of them.
So outbound links to link farms and other suspicious sites will do your site harm, in the form of a penalty or even a ban from the index.
But if a so-called bad neighborhood links to you (non-reciprocally), it will do your site no harm. If it did, it would be an easy way to get rid of your competitors :-)

 

Risk: a lower page ranking, or even a ban.
Solution: select carefully the sites you link to, and check each site well before linking to it.
More importantly: check all your outbound links regularly to see whether the site you link to is still the same. I once had a link to a certain domain. After some time the original owner quit his web business and someone else took over the domain (this happens a lot), and the old domain turned into a spammy porn site. Google did not like that, because the link no longer had any relevant connection with my website; besides, such sites very often use unethical SEO techniques.

 

Check if an outgoing link is part of a bad neighborhood
This tool will scan the links on your website, and on the pages that your website is linking to, and flag possible problem areas.

 


 

Cross-linking

Also named "Interlinking".
The main purpose is to gain popularity through many small sites (often owned by the same person, hosted on the same IP address, or located in bad neighborhoods).
Example: site A links to site B, site B links to site C, and site C links to site A. This is the simplest scheme; far more complex ones are possible.

Risk: if you do this wrong (and many do!), you leave footprints that search engines can easily detect. Once detected (and one day they will be), you will be penalized in some way.


 

Paid links

I quote Google:

"Some SEOs and webmasters engage in the practice of buying and selling links that pass PageRank, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites. Buying or selling links that pass PageRank is in violation of Google's webmaster guidelines and can negatively impact a site's ranking in search results."


Read more about Google's guideline for paid links

 

 


 
