What Google could do to stop negative SEO

Perhaps you’ve heard about “negative SEO” or “Google bowling,” where your competitors use spam techniques seemingly on your behalf to knock your site down. Ethically challenged search marketers are once again giving the industry a black eye by intentionally using spam techniques to get their competitors penalized. And what can you do about it? Not much, except complain to Google that this is the wrong way to operate. Google has choices, and we need to scream until it makes some better ones.

This is a story that I have been following for several years, but I haven’t written about it because I’d seen little evidence that it was happening. Forbes wrote an article on it last year, but I still wasn’t convinced. It seemed like a theoretical problem, and I thought the search engines had it under control. From what I am hearing now, I no longer think that is the case. Google bowling is certainly not widespread, but I believe it is possible to pull off and that it is a threat to honest marketers.

First, let me say that I totally understand why Google penalizes sites that benefit from spam techniques. (Throughout this article, I will refer to Google, but there is every reason to believe that all search engines need to worry about the same problems.) Search spam has grown out of control in recent years, as spammers use duplicate content, fake blog comments, fake sites, and paid links (among other tactics) to try to get their sites boosted. Google has long identified spammy content and spammy links and ignored them for ranking purposes.

But, just as with e-mail spam, simply ignoring spam did not hurt the spammers. They just threw more and more spam at Google, hoping that some of it would stick. So Google needed to up the ante.

Just as it does when it finds other spam techniques (such as hidden text or cloaking), Google began to penalize sites for receiving spammy links or posting duplicate content. I am sure this had the desired effect of tamping down spam techniques, because getting caught could now actively hurt the spammer rather than merely being a missed opportunity. Google is trying to turn a neutral consequence for the spammer (“oh well, that one didn’t work”) into a negative one (“damn, I’ll have to close that site”), which makes perfect sense.

But it has opened the door to nefarious characters perpetrating spam against you, the honest marketer. This is a new search marketing problem. In the past, no one but you could get access to your site to create spammy content or cloaking violations, so if Google caught your site crossing those lines, it was clear that you actually did it, and your site could be punished without reservation. But link spam and duplicate content happen off your site, so your competitors can use those very techniques precisely to have Google catch “you” and penalize your site.

And there’s not much you can do about this, except complain to Google. Even that much is hard: you won’t know why your rankings dropped, you don’t know exactly what links Google is seeing to your site or what duplicate pages are out there, and you certainly don’t know which ones Google thinks are spammy.

Rather than waiting to complain to Google after your site is hit, I think we should complain now. We need Google to come up with a way for honest marketers to protect themselves. Unfortunately, the possibilities I can come up with are few.

Let’s start with duplicate content. Spammers now intentionally create duplicate content and aggressively promote their copy in social media, so that Google sees your original as the copy. Google sees both copies at about the same time and judges the more popular one to be the original.
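
To make the failure mode concrete, here is a toy sketch of the kind of tie-breaking I am describing. This is a simplification for exposition, not Google’s actual duplicate-detection logic, and every name and number in it is invented:

```python
# Toy illustration (not Google's algorithm): when two copies of a page are
# first seen at nearly the same time, breaking the tie by popularity lets
# a heavily promoted scraped copy beat the honest original.
def pick_original(copies, window_hours=24):
    """copies: list of (url, first_seen_hour, popularity_score)."""
    earliest = min(first_seen for _, first_seen, _ in copies)
    # Copies that appeared within the window are treated as simultaneous...
    contenders = [c for c in copies if c[1] - earliest <= window_hours]
    # ...and the most popular contender is declared the "original."
    return max(contenders, key=lambda c: c[2])[0]

print(pick_original([
    ("https://honest.example/post", 0, 10),    # the real original
    ("https://scraper.example/copy", 3, 500),  # promoted copy wins the tie
]))  # -> https://scraper.example/copy
```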

I can see two ways to combat this. One would be to promote your content yourself, so that spammers can’t easily outdo you. This takes a lot of effort and is kind of cheesy; I don’t think that Digging your own blog entry is good form.

Another way to do battle is to set up web feeds (RSS or Atom) for all of your content. When you publish, your feed can ping Google before the spammer’s copy shows up, which should indicate to Google that yours is the legitimate one. I don’t know whether Google uses this information to flag the legitimate version of the content today, but maybe it should.
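
As a sketch of how such a ping might look, here is the standard weblogUpdates XML-RPC convention that blog platforms use. The endpoint below is Google’s blog search ping service; the site name and URL are placeholders, and you would substitute whatever ping targets your publishing platform supports:

```python
# Minimal sketch of pinging Google the moment you publish, using the
# standard weblogUpdates.ping XML-RPC convention. The endpoint shown is
# Google's blog search ping service.
import xmlrpc.client

def ping_on_publish(site_name: str, site_url: str) -> bool:
    server = xmlrpc.client.ServerProxy("http://blogsearch.google.com/ping/RPC2")
    response = server.weblogUpdates.ping(site_name, site_url)
    # By convention the response is a struct like
    # {'flerror': False, 'message': '...'} when the ping is accepted.
    return not response.get("flerror", True)

if __name__ == "__main__":
    if ping_on_publish("My Blog", "https://example.com/blog/"):
        print("ping accepted: Google saw your copy first")
```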

But that’s not the really tough one. The really difficult negative SEO technique is link spam. Google doesn’t want to tell you when it penalizes you for bad links, and it doesn’t want to tell you which links are the problem. The reason is simple: the real spammers would love to know what Google knows. In the game of cat and mouse, they could try lots of different types of spam links and drop the kinds that Google detects. If Google revealed what it detects, the game would become rather easy, wouldn’t it?

So Google can’t afford to be open about which problems trigger the penalties. What else is left? I think that Google needs to allow your site to refuse a link.

Just as Google asks that paid links be coded with “nofollow” to avoid triggering the penalty, why shouldn’t the receiver be allowed to say “noaccept”? If the link recipient states that Google should ignore that link, then isn’t that an indication that no spam effort is underway?
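
To make the idea concrete, here is one hypothetical shape the mechanism could take: the site owner publishes a list of refused link sources, and the engine drops those links before they carry any weight, for good or ill. The file format and every name below are invented purely for illustration; nothing like this exists today:

```python
# Hypothetical sketch of the proposed "noaccept" mechanism. The file
# format and all names here are invented for illustration only.
from urllib.parse import urlparse

REFUSALS = """
# one refused source per line: a full URL, or a whole domain
http://spammy-directory.example/links/page7.html
domain:link-farm.example
"""

def parse_refusals(text):
    urls, domains = set(), set()
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith("domain:"):
            domains.add(line[len("domain:"):])
        else:
            urls.add(line)
    return urls, domains

def link_accepted(source_url, urls, domains):
    # A refused link should count for nothing: no benefit, no penalty.
    host = urlparse(source_url).hostname or ""
    return source_url not in urls and host not in domains

urls, domains = parse_refusals(REFUSALS)
print(link_accepted("http://link-farm.example/page3.html", urls, domains))  # False
print(link_accepted("https://legit-fan.example/review", urls, domains))     # True
```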

Now, I understand that this is easier said than done. First, Google does not today show all of your links, either with the link: operator or in Webmaster Tools. If Google wants to continue this practice of showing you mere samples of your links, it needs to make sure that each sample includes the links it considers questionable, which means updating its sampling algorithm.
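
Here is a sketch of what that updated sampling might look like: a stratified sample that always surfaces the flagged links, padded with a random draw from the rest. The “questionable” set is assumed to come from Google’s own undisclosed classifiers; all names are illustrative:

```python
# Sketch of stratified link sampling: flagged (questionable) links always
# make the sample; the remainder is a random draw from the clean links.
import random

def sample_links(all_links, questionable, k):
    flagged = [link for link in all_links if link in questionable]
    clean = [link for link in all_links if link not in questionable]
    sample = flagged[:k]  # questionable links take priority
    if len(sample) < k:
        sample += random.sample(clean, min(k - len(sample), len(clean)))
    return sample

links = [f"http://site{i}.example/" for i in range(100)]
bad = {links[7], links[42]}
print(sample_links(links, bad, k=10))  # both flagged links are guaranteed in
```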

Second, marketers would need an interface that allows them to examine their links and refuse the benefit of any of them. I think that Google Webmaster Central could easily be updated to accommodate this. Because Webmaster Central access is tied to a verified ID, it’s clear that the rightful owner of the site is refusing the links (so this could not itself become a new form of negative SEO, with competitors refusing your good links).
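
The ownership guard is the whole point, so here is a minimal sketch of it; the account names, data structures, and function are all invented for illustration:

```python
# Sketch: refusals are only accepted from an account that has verified
# ownership of the target site, so rivals cannot refuse your good links.
VERIFIED_OWNERS = {"example.com": {"mike@example.com"}}

def refuse_link(account, site, source_url, refusals):
    if account not in VERIFIED_OWNERS.get(site, set()):
        raise PermissionError("only a verified owner may refuse links")
    refusals.setdefault(site, set()).add(source_url)

refusals = {}
refuse_link("mike@example.com", "example.com", "http://link-farm.example/", refusals)
# refuse_link("rival@evil.example", "example.com", ...)  # raises PermissionError
```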

Third, Google would need to update its ranking algorithms so that the benefit of the links you refuse really disappears. If Google did not do that, the smart play would be to go in and refuse all links: that would save you from spam penalties without hurting your rankings. Instead, refusing a legitimate link must hurt your ranking, so that you have an incentive to refuse only the spammy ones.
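
To see the incentive argument concretely, here is a textbook PageRank-style computation (a simplification, not Google’s production algorithm) in which refused links simply vanish from the link graph. Refusing the spam link leaves your rank roughly intact, while refusing a legitimate link costs you real ranking power:

```python
# Simplified PageRank in which refused links are removed from the graph
# before ranking; illustrates the incentive argument above.
def pagerank(edges, refused, damping=0.85, iterations=50):
    edges = [e for e in edges if e not in refused]  # refused links vanish
    nodes = {n for edge in edges for n in edge}
    out_degree = {n: sum(1 for src, _ in edges if src == n) for n in nodes}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new_rank = {n: (1 - damping) / len(nodes) for n in nodes}
        for src, dst in edges:
            new_rank[dst] += damping * rank[src] / out_degree[src]
        rank = new_rank
    return rank

graph = [("fan", "you"), ("spammer", "you"), ("you", "fan")]
print(pagerank(graph, refused=set())["you"])                  # both links count
print(pagerank(graph, refused={("spammer", "you")})["you"])   # spam refused: fine
print(pagerank(graph, refused={("fan", "you")})["you"])       # real link refused: hurts
```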

Fourth, there needs to be an easy way to keep up with links that are newly added and links whose characteristics change. Google should provide an RSS feed or a sortable interface that lets marketers examine new or changed links, so that you get a fresh chance to refuse the spammy ones.
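
A sketch of that bookkeeping: diff two snapshots of your inbound-link report to surface anything you have not yet reviewed. The field names here are invented for illustration:

```python
# Sketch of keeping up with new and changed inbound links by diffing two
# snapshots of a link report.
def diff_links(previous, current):
    """Each snapshot maps source URL -> attributes (anchor text, flags...)."""
    new = {url: attrs for url, attrs in current.items() if url not in previous}
    changed = {url: attrs for url, attrs in current.items()
               if url in previous and previous[url] != attrs}
    return new, changed

old_snapshot = {"http://a.example/p": {"anchor": "widgets"}}
new_snapshot = {"http://a.example/p": {"anchor": "cheap pills"},  # anchor changed
                "http://b.example/q": {"anchor": "widgets"}}      # brand-new link
new, changed = diff_links(old_snapshot, new_snapshot)
print("review new:", sorted(new), "review changed:", sorted(changed))
```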

If this is sounding very complicated, well, it is. I am not clever enough to know if this idea is bullet-proof. Perhaps the spammers can think of some hole in it. It’s a lot of work for Google, but worse, it is a lot of work for you and me. I just can’t think of any alternative.

Google could make the work lighter by keeping the sample of links you must accept or refuse as small as possible, but I can’t think of any way to reduce the effort beyond that. It seems to me that, like click fraud, this is one of those immensely scary problems that could kill the goose that lays Google’s golden eggs. Even if negative SEO is not widespread (yet), Google needs to nip this in the bud.

What do you think? Are there holes in my proposal to Google? If there are, let’s plug them or come up with an alternative. But we must come up with something to end this madness. Search marketing has a bad enough reputation without this. Let’s work together so that honest search marketers still have a chance to succeed on their own merits.

Mike Moran

Mike Moran works with Converseon, an AI-powered consumer intelligence technology and consulting firm. He is also a senior strategist for SoloSegment, a marketing automation software solutions and services firm, and served as a member of the Board of Directors of SEMPO. Mike spent 30 years at IBM, rising to Distinguished Engineer, an executive-level technical position. He held various roles in his IBM career, including eight years at IBM’s customer-facing website, ibm.com, most recently as the Manager of ibm.com Web Experience, where he led 65 information architects, web designers, webmasters, programmers, and technical architects around the world. Mike’s newest book is Outside-In Marketing, written with world-renowned author James Mathewson. He is co-author of the best-selling Search Engine Marketing, Inc. (with fellow search marketing expert Bill Hunt), now in its Third Edition, and the author of the acclaimed internet marketing book Do It Wrong Quickly: How the Web Changes the Old Marketing Rules, named one of the best business books of 2007 by the Miami Herald. Mike founded and writes for Biznology® and writes regularly for other blogs. In addition to his broad technical background, Mike holds an Advanced Certificate in Market Management Practice from the Royal UK Charter Institute of Marketing and is a Visiting Lecturer at the University of Virginia’s Darden School of Business. He also teaches at Rutgers Business School. He was a Senior Fellow at the Society for New Communications Research and is now a Senior Fellow of The Conference Board. A Certified Speaking Professional, Mike regularly makes speaking appearances, including keynote appearances worldwide.
