Google Gets Tough(er) on Spam
From the High Rankings newsletter
Search engine spammers beware -- Google is out to get you. Sources inside Google report that the search engine is stepping up its efforts to catch spammers.
Google has recently made changes to automatically check for common spam techniques while indexing pages, which greatly increases its ability to catch and penalize spammers.
One thing they're starting to catch automatically is invisible text on the page. This old spam technique is used to artificially boost a page's relevance for keywords by repeating them over and over again. To hide this repetition from human visitors, spammers place the text in the same color as the page background, rendering it "invisible" to the user. That's obviously a big no-no, and no credible SEO would ever use such a tactic.
I recently worked with a client who took a do-it-yourself approach to search engine optimization. While I encourage my clients to learn about search engine optimization, and always communicate what I'm doing with their sites, a little knowledge can be a dangerous thing. In this case, my client (on her own and against my recommendations) decided to go beyond professional optimization techniques and added invisible text and hidden links to her home page. The result? Within a week the site dropped from a top-10 ranking to no ranking at all.
So what should you do if you -- or an unprofessional SEO company you hired -- have used improper SEO techniques on your site? Well, the first thing is to clean up the spam! If you immediately correct the problem, Google generally penalizes the site for 30 days, although it can be longer. However, if you did a really bad thing (like cloaking), your site might be permanently banned.
Overall, I think Google's changes are good. But I have to admit I'm a little nervous about their detecting spam through automated programs and not by hand. From my time at NetMechanic, I learned a lot about the pros and cons of using automated programs to review Web pages. Automated programs follow a rigid set of rules that may not adequately reproduce the common sense we humans use when reviewing a page. There are some techniques that are completely innocent that may be vulnerable to Google's automated spam-checking method.
One situation that worries me is Google declaring all hidden links as bad and automatically checking every page for them. I agree that most invisible links do fall into the spam category, but not all. If you look at http://www.cnn.com/ you will find an invisible GIF link telling you to "Click here to skip to main content." Is this spam? Absolutely not. What CNN is doing is an accessibility technique called "skip navigation" to make their site friendlier to people with disabilities.
This handy technique allows people using a page reader to jump past endless lists of navigation and jump directly to content. This is a good use of a hidden link. I believe that Google would support this use and would encourage Web accessibility. It would be nice if Google could provide some reassurance to Webmasters that these "good use" techniques won't accidentally be penalized. Or maybe Google could provide instructions on how to use skip navigation in such a way as to not get into trouble. (Google, are you listening?)
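To make the technique concrete, here is a minimal sketch of what a skip-navigation link can look like in a page's HTML. The class name and anchor id are illustrative choices, not a standard; the key idea is a link placed before the navigation that jumps to an anchor at the start of the main content, hidden visually (for example with CSS) but still announced by screen readers:

```html
<!-- Skip link placed first in the body, before the navigation. -->
<!-- The "skip-link" class would be styled in CSS to move the link
     off-screen, so sighted visitors never see it but page readers
     announce it first. -->
<a href="#main-content" class="skip-link">Skip to main content</a>

<!-- A long navigation block that screen-reader users can now bypass. -->
<ul>
  <li><a href="/news/">News</a></li>
  <li><a href="/weather/">Weather</a></li>
  <li><a href="/sports/">Sports</a></li>
</ul>

<!-- The anchor the skip link jumps to. -->
<div id="main-content">
  <h1>Today's Top Story</h1>
  <p>Article text begins here...</p>
</div>
```

Because the link is real, functional, and there for the benefit of visitors rather than search engines, it is exactly the kind of "hidden" link that should not be treated as spam.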
Another automated spam check looks for duplicate pages (or near duplicate content) on a site. Google has gone on record as saying "Don't have duplicate pages."
Google's stance on duplicate content is justified. Some conniving individuals make "doorway" pages that have duplicate content, but different keywords on each page. When found through the automated tools, these sites may now be penalized.
Most companies don't purposely duplicate content on their site, but it could be happening without their realizing it. For example, many marketing departments create specialized "landing pages" for their PPC ads to test different messages or to track their ads on different properties. Creating custom landing pages for ads is an effective way to improve conversions - you can create a custom page that reinforces the message in your ad and moves the visitor to the next step in the buying cycle. I wholeheartedly support these custom landing pages.
The problem is that many landing pages contain content nearly identical to other pages on the site. If a company isn't aware that Google frowns on duplicate content, this widespread practice could be hurting the company's site.
You can be smart about using your landing pages by placing them in a separate directory on your server and excluding that directory through your robots.txt file, so that Google won't index them by mistake.
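As an illustration, here is what that robots.txt entry might look like. The directory name is a made-up example -- use whatever directory actually holds your landing pages on your own server:

```
# Hypothetical example: PPC landing pages are kept in /landing/
# so that one rule keeps all of them out of the search engines.
User-agent: *
Disallow: /landing/
```

The file must be named robots.txt and sit at the root of your domain (for example, http://www.yoursite.com/robots.txt) for crawlers to find it.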
[Jill's comment: In my experience, since there are so many legitimate reasons why there could be duplicate content on pages, Google generally ignores it rather than penalizes for it. However, like Chris, I also recommend that it be excluded through the robots.txt file to be on the safe side.]
Google's move to automated spam checking puts the burden on the Webmaster to know whether they are doing something that might be construed as spam. It's smart to know which things the search engines frown upon, and stay away from anything that might get you tarred with the spammer label.
Basically, if you follow the advice Jill and other professional SEOs provide, you should have no problems. If you don't know what to look for or don't have the time, hire a professional SEO to review your site for potential problems.