Search engine rankings are extremely competitive, and Website owners are under pressure to do all they can to gain visibility in search results.
Those pressures come from many quarters: there are branding restrictions, style guidelines, legal issues, navigation needs, sales conversion demands, site interaction demands and more.
The fact remains, though, that search engines were designed for information purposes. This presents hurdles to businesses that try to exploit the search engines in order to attract users who seek information, then try to sell them something. To overcome these hurdles, many businesses use increasingly ruthless tactics -- tactics that lead them into dishonest territory -- to gain those higher search rankings.
Exploiting the Engine
The exploitation of search engines today is a serious issue, but, like it or not, most businesses see it as something that must be done -- an online business imperative. To exploit a search engine, however, most organizations must exploit a search engine optimization company. In these arrangements, exploitation, or the gaining of something for nothing, becomes the central theme for interaction between client and SEO provider.
Thousands of SEO providers are now in business, each promising better rankings than the next. For many of these service providers, quality is not an issue. What matters is making promises that beat the competition and win them the client. Faced with these enormous and often unreasonable pressures, ethical SEOs will withdraw from an optimization project. Unethical SEOs, however, will take on the project, saying, "No problem. I'll take care of it."
"Taking care of" an impossible situation means spamming. The client's demand for the impossible and expectation of something for nothing pushes the SEO or Webmaster to that sorry path of search engine spamming. This approach involves the study and nurturing of a growing list of tricky spam techniques.
The best way to defuse the issue is to bring these methods to light. If everyone knows about a spamming technique, it will cease to work. This is the way to defeat search engine spam, and it is the purpose of this article.
Search engines value popular, content-rich sites; however, many Website owners either can't or don't want to spend the money required to create that type of content and popularity. The needed resources, such as researchers, Web content developers, copywriters and skilled SEOs, aren't available, or are beyond the financial resources of the company.
This is the something for nothing scenario that launches all spam projects.
Now, it seems Traffic Power's clients are suing the company, but the damage is done. We still have to wonder who the guilty parties are.
In reality, when a site utilizes spam tactics, it is the client who's ultimately responsible, not the SEO provider. The client has control over a Website and its deployment. When spamming occurs, the Website owner is solely responsible.
The Lure of Top Listings
It is well publicized by some shady operators that rankings are cheap and easy to get. That lie -- and the expectation it generates -- forces some SEOs to offer a guarantee of top 5 rankings. This, in turn, puts pressure on all SEO providers to provide similar guarantees.
Besides angering search engine companies, such guarantees are misleading. Top rankings can't be put on a schedule like an advertising buy. Organic search engine results are not for sale, and it is this element of honesty that ensures their continued popularity: that which cannot be bought is trustworthy.
When SEOs can't achieve rankings on schedule, they are forced to refund perhaps thousands of dollars. Since many are barely able to pay their bills, they can't afford to return that money. This sets the stage for SEO spamming.
There are spammers who don't care one way or another -- they don't mind cheating, as they have no sense of ethics. There are also large SEO companies tasked with creating rankings for clients -- rankings that just shouldn't be attempted. They want to automate the SEO process in order to increase revenues. Search engines, in contrast, want to rid their indexes of automated material of any kind.
The Website owner's greed, combined with the search engine spammer's opportunism, sets the stage for an unholy union. Here's just one example of a spamming site I've seen.
Spammingsite1.com used several types of spam to achieve strong results:
* mouse-activated redirects
* hidden table cells stuffed with keywords within <h1> tags
* links from contrived Websites
The end users saw a different page than the search engine indexed. The search engine was tricked by these tactics, and, as is the case with all instances of spamming, lost control of the product it served to search users.
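To make these tactics concrete, here's a simplified sketch of the kind of markup such a site might use. This is a hypothetical reconstruction for illustration only -- the URL, keywords and function name are invented, not taken from the actual site:

```html
<!-- Hypothetical reconstruction of a spammed page, for illustration only -->
<html>
<head>
<script type="text/javascript">
// Mouse-activated redirect: a search engine spider never moves a mouse,
// so only human visitors get bounced to the real sales page.
// The spider indexes the keyword-stuffed page below instead.
function bail() {
  window.location = "http://www.example.com/sales-page.html";
}
</script>
</head>
<body onmousemove="bail()">
<table>
  <!-- Hidden table cell: invisible to visitors, but its <h1>
       keywords are read and weighted by the spider -->
  <tr><td style="display:none">
    <h1>cheap widgets discount widgets buy widgets widget sale</h1>
  </td></tr>
</table>
<p>Innocuous-looking content that the search engine indexes...</p>
</body>
</html>
```

The result is exactly the split described above: the spider and the human visitor see two different pages.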
Spammingsite1 was a leader in the search results -- but only because of spam. A check of the sites that linked to Spammingsite1 revealed a list of dubious-quality sites with which no legitimate site owner would have wanted to be associated. One of those sites belonged to a growing list of open directory copies -- sites that draw all their content from the Open Directory Project. Copies of Open Directory listings represent a huge problem for Google.

The Perils of New Content Types
As Google and Yahoo! venture into spidering new types of Web content, they run the risk of being tricked by the complexity of the code itself. Spammers succeed by staying ahead of the technical filtering capabilities of search engines.
Search engines apply content filters as they spider sites, and afterward, in what's called post-processing. This sophisticated filtering is wonderful; however, it's also limited by the imagination, foresight and programming of the engineers. Spammers can trick the system by exploiting cracks in the filters.
Sometimes innocent sites are penalized because they appear to have some characteristics of spamming. Is your site one of them? Why might a legitimate link to your site not be recognized? It probably looks like a paid link to the search engine. This is another huge problem for search engines: their filters are so complex that they become almost uncontrollable, and innocent sites are incorrectly penalized.
Search engines can only see and know so much about any given Website and its owners. One SEO's content and links are another's spam, so it's difficult to make statements about who the spammers are. The problem is further complicated by the fact that search engines have different listing and content assessment guidelines.
There are, of course, numerous tactics that are considered spam. Below are some of the most common spamming techniques being used right now -- tactics that should be avoided.
* Publishing Empires
* Networked Blogs
* Domain Spam
* Duplicate Domains
* Links inside No Script Tags
* Dynamic Real Time Page Generation
* HTML invisible table cells
* DHTML layering and hidden text under layers
* Humongous machine-generated Websites
* Link stuffing
* Invisible text
* Link Farms
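Several of the markup-level tactics in this list require only a few lines of HTML. The snippet below is a hypothetical illustration -- the URLs and keywords are invented -- of how links inside noscript tags, invisible table cells and invisible text typically appear in a spammed page's source:

```html
<!-- Hypothetical illustrations of three markup-level spam tactics -->

<!-- 1. Links inside noscript tags: hidden from most visitors,
        but followed and counted by search engine spiders -->
<noscript>
  <a href="http://www.example.com/partner1.html">cheap widgets</a>
  <a href="http://www.example.com/partner2.html">discount widgets</a>
</noscript>

<!-- 2. Invisible HTML table cell stuffed with keywords -->
<table><tr><td width="0" height="0" style="visibility:hidden">
  widgets widget sale buy widgets online
</td></tr></table>

<!-- 3. Invisible text: white text on a white background -->
<body bgcolor="#ffffff">
  <font color="#ffffff">widgets widgets widgets widgets</font>
</body>
```

None of these elements is visible to a human visitor, yet each adds keyword weight or link value in the eyes of a spider that fails to filter it.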
Let's discuss each of these in more detail.
When a publisher builds a vast array of interlinked Websites, it can generate high PageRank and subsequent rankings. This form of spam is difficult for a search engine to penalize, since the links are legitimate. Any single business entity has the right to interlink its own Websites. The company can create further overlap between the sites' content themes so that the links are truly valued by search engines.
This kind of activity is exemplified by one of the Internet's largest publishers. The business has 120+ Web properties, all of which are carefully linked to the others. Perform a search on one of these sites, and you're virtually guaranteed to see one of the company's other Web properties in the search results.
Many of the most successfully ranked sites use this system -- this form of spamming is extremely widespread. Perpetrators basically collect PageRank and link reputation within their network, then use it creatively to dominate the best keyword phrases. Search engines haven't found a way to stop this technique, but they'll have to: this form of spamming is a major threat to the quality of search results.
Wikis are Web repositories to which anyone can post content. They can be a great way to present and edit ideas without close censorship, and have proven extremely successful for the creation, management, and maintenance of projects that require input from users around the globe.
However, despite their considerable advantages, the often un-scrutinized nature of wikis makes them ripe for abuse. Like a link farm, a wiki's links are free for all. Ironically, the open, collaborative qualities that make wikis valuable also fit neatly with popularity-based search engines. Some wikis boast a very high PageRank, which can make the wiki an attractive place from which to gain a link to your site. But without close human control, users may simply add their links as a means to take advantage of the wiki's PR. Until another user of the wiki removes the link, the linked site enjoys the benefits of this unscrupulous activity. The search engine spammers have control.
Blogs can be a source of precise, up-to-date and technically detailed information, presented by specialists and experts. Blogs are thus very valuable to info-hungry searchers, and are extremely popular.
However, some spammers start a blog and stuff it full of garbage content, such as comments on what they thought at 5:15, along with a link or two and some keyword-rich text. Keyword-rich musings don't present real value to deceived searchers. Worse still, blogs often operate in a free-for-all link structure that further validates the linked sites in search engine indexes.
Like blogs, forums can be a rich source of relevant information.
Unfortunately, some forum participants make comments in forums only in an effort to publish links back to their own sites. This may be acceptable if the user provides help or assistance to another forum member. Indeed, they should gain credit for that information, which they may have worked hard to discover.
However, when the posts become excessive and consist solely of glib or irrelevant comments, the value of the link -- or, indeed, of the whole forum -- can be put into question. Some forum owners start forums only in the hope that they will raise search engine rankings.
Probably the most popular spam technique today involves creating and hosting a number of Websites. These sites rarely have any intrinsic value other than providing ranking support for the owner's main Website.
I've had several former clients who had used this technique -- and had been penalized for it. Once they got rid of the duplicates completely, their rankings were repaired.
Why can't Google detect two exact duplicate Websites that only differ on domain names? Why would Google give these same sites first and second rank for the very same phrase? This happens all too frequently and is due to Google's preoccupation with linking between topically related sites.
Domain spam is usually the result of a corporation's attempt to have Web sites for each of its company departments or subsidiaries. Those with many subsidiaries get a big boost from these domains. Realizing this, spammers are increasingly encouraging clients to have sites hosted on different IP addresses and even in different geographical locations.
Google's link pattern detection has difficulty with this practice, and is currently failing to cope with it. Google's new emphasis on authority sites actually makes matters worse, as an authority can gain trust it really doesn't deserve.