
Perhaps you’ve heard about “Negative SEO” or “Google bowling,” where your competitors use spam techniques seemingly on your behalf to knock down your site. Ethically challenged search marketers are once again giving the industry a black eye by intentionally using spam techniques to get their competitors penalized. And what can you do about it? Not much, except complain to Google that this is the wrong way to operate. Google has choices, and we need to make noise until it makes better ones.

This is a story I have been following for several years, but I haven’t written about it because I’d seen little evidence that it was actually happening. Forbes wrote an article on it last year, but I still wasn’t convinced. It seemed like a theoretical problem, and I thought the search engines had it under control. From what I am hearing now, I no longer think that is the case. Google bowling is certainly not widespread, but I believe it is possible to pull off, and that makes it a threat to honest marketers.

First, let me say that I totally understand why Google penalizes sites that benefit from spam techniques. (Throughout this article, I will refer to Google, but there is every reason to believe that all search engines need to worry about the same problems.) Search spam has grown out of control in recent years, as spammers use duplicate content, fake blog comments, fake sites, and paid links (among other tactics) to try to get their sites boosted. Google has long identified spammy content and spammy links and ignored them for ranking purposes.

But, just as with e-mail spam, simply ignoring spam did not hurt the spammers. They just threw more and more spam at Google, hoping that some of it would stick. So Google needed to up the ante.

Just as it does when it finds other spam techniques (such as hidden text or cloaking), Google began to penalize sites for receiving spammy links or posting duplicate content. This had the desired effect, I am sure, of tamping down spam techniques, because now getting caught could actively hurt the spammer rather than merely being a missed opportunity. Google is trying to turn a neutral consequence for the spammer (“oh well, that one didn’t work”) into a negative consequence (“damn, I’ll have to close that site”), which makes perfect sense.

But it has opened the door to nefarious characters perpetrating spam against you, the honest marketer. This is a new kind of search marketing problem. No one else could get access to your site to plant hidden text or set up cloaking, so if Google caught your site crossing those lines, it was clear that you actually did it, and your site could be punished without reservation. But link spam and duplicate content can themselves be faked by outsiders, so your competitors can use those techniques precisely to have Google catch you and penalize you.

And there’s not very much you can do about this, except complain to Google. Even that is hard: you won’t know why your rankings dropped, you don’t know exactly which links Google sees pointing to your site or which duplicate pages are out there, and you certainly don’t know which ones Google thinks are spammy.

Rather than waiting to complain to Google after your site is hit, I think we should complain now. We need Google to come up with a way for honest marketers to protect themselves. Unfortunately, the possibilities I can come up with are few.

Let’s start with duplicate content. Spammers now intentionally create duplicate copies of your content and aggressively promote their copy in social media, so that Google sees your original as the copy. Google sees both versions at about the same time and judges the more popular one to be the original.

I can see two ways to combat this. One would be to promote your content yourself, so that spammers can’t easily outdo you. This takes a lot of effort and is kind of cheesy; I don’t think that Digging your own blog entry is good form.

Another way to do battle is to set up Web feeds (RSS or Atom) for all of your content. When you publish, your feed can ping Google before the spammer’s copy appears, which should indicate to Google that yours is the legitimate version. I don’t know whether Google uses this information today to flag the legitimate version of the content, but maybe it should.
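To make the ping idea concrete, here is a minimal sketch of the standard weblogUpdates XML-RPC ping that publishing platforms have long used to announce new content. The endpoint URL and site details are only illustrative (the address shown was Google’s Blog Search ping service); substitute whatever ping services your platform actually supports.

```python
# Minimal sketch: announce a new post via the standard weblogUpdates XML-RPC ping,
# so crawlers see the original copy as early as possible.
import xmlrpc.client

# Illustrative endpoint: Google's historical Blog Search ping service.
PING_ENDPOINT = "http://blogsearch.google.com/ping/RPC2"

def ping_new_post(site_name: str, site_url: str) -> bool:
    """Send a weblogUpdates.ping and report whether the service accepted it."""
    server = xmlrpc.client.ServerProxy(PING_ENDPOINT)
    response = server.weblogUpdates.ping(site_name, site_url)
    # A weblogUpdates response is a struct with flerror=False on success.
    return not response.get("flerror", True)

if __name__ == "__main__":
    ok = ping_new_post("Biznology", "https://www.biznology.com/")
    print("Ping accepted" if ok else "Ping rejected")
```

In practice, most blogging platforms send pings like this automatically when you hit publish, so the real task is usually just confirming that the feature is turned on and that your feed updates promptly.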

But that’s not the really tough one. The really difficult negative SEO technique is link spam. Google doesn’t want to tell you when it penalizes you for bad links, and it doesn’t want to tell you which links are the problem. The reason is simple: the real spammers would love to know what Google knows. In the game of cat and mouse, they could try lots of different types of spam links and drop the kinds that Google detects. If Google revealed what it detects, the spammers’ job would become rather simple, wouldn’t it?

So Google can’t afford to be open about which problems trigger the penalties. What else is left? I think that Google needs to allow your site to refuse a link.

Just as Google asks that paid links be coded with “nofollow” to avoid triggering the penalty, why shouldn’t the receiver be allowed to say “noaccept”? If the link recipient states that Google should ignore that link, then isn’t that an indication that no spam effort is underway?
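For reference, here is how the existing “nofollow” hint looks in markup and how a crawler-style parser reads it. The receiver-side “noaccept” is only the proposal above; it has no markup or API today, so nothing in this sketch implements it.

```python
# Minimal sketch: detect which anchors in a page carry rel="nofollow",
# the hint Google already asks publishers to put on paid links.
from html.parser import HTMLParser

SAMPLE_HTML = '<a href="https://example.com/sponsor" rel="nofollow">Our sponsor</a>'

class NofollowChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.nofollow_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attr_map = dict(attrs)
        rel_values = (attr_map.get("rel") or "").lower().split()
        if "nofollow" in rel_values:
            self.nofollow_links.append(attr_map.get("href"))

checker = NofollowChecker()
checker.feed(SAMPLE_HTML)
print(checker.nofollow_links)  # ['https://example.com/sponsor']
```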

Now, I understand that this is easier said than done. First, Google does not today show you all of your links, either with the link: operator or in Webmaster Tools. So if Google wants to continue showing you only a sample of your links, it needs to make sure that the sample includes some of the links it considers questionable, which means updating its sampling algorithm.

Second, marketers would need an interface that allowed them to examine their links and refuse the benefit of any of them. I think that Google Webmaster Central could be easily updated to accommodate this. Because Webmaster Central is tied to a verified owner’s ID, it’s clear that the rightful owner of the site is refusing the links (so this could not become a new form of negative SEO in which your competitors refuse your good links).

Third, Google would need to update its ranking algorithms so that the benefit of any link you refuse really disappears. If Google does not do that, then the best move would be to simply refuse all links, which would protect you from spam penalties without hurting your rankings. Instead, refusing a legitimate link must hurt your ranking, so that you have an incentive to refuse only the spammy ones.

Fourth, there needs to be an easy way to keep up with links that are newly added and links whose characteristics change. Google could provide an RSS feed or a sortable interface that lets marketers examine new or changed links, so you get a fresh chance to refuse the spammy ones.
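To give a feel for the marketer-side chore these four points add up to, here is a hypothetical sketch. It assumes you have downloaded the external-links export that Webmaster Tools already offers as a CSV with the linking URL in the first column (an assumption about the file layout), and that you keep a hand-maintained list of domains whose links you would refuse. The refusal submission itself does not exist anywhere today; this only illustrates the triage work the proposal would require.

```python
# Hypothetical sketch: triage an exported list of inbound links against a
# hand-maintained set of domains you would refuse. No "refuse this link"
# mechanism actually exists; the output is just the list you would submit.
import csv
from urllib.parse import urlparse

# Illustrative domains only; the hard, human work is deciding what goes here.
REFUSED_DOMAINS = {"spammy-directory.example", "scraper-farm.example"}

def build_refusal_list(links_export_csv):
    """Read an exported links CSV (linking URL assumed in column 1) and
    collect the links that come from refused domains."""
    refusals = []
    with open(links_export_csv, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            linking_url = row[0]
            domain = urlparse(linking_url).netloc.lower()
            if any(domain == d or domain.endswith("." + d) for d in REFUSED_DOMAINS):
                refusals.append(linking_url)
    return refusals

if __name__ == "__main__":
    for url in build_refusal_list("external_links.csv"):
        print("refuse:", url)
```

Even in this toy form, the hard part is deciding which domains belong on that refuse list, which is exactly the judgment problem raised in the comments below.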

If this is sounding very complicated, well, it is. I am not clever enough to know whether this idea is bulletproof; perhaps the spammers can find some hole in it. It’s a lot of work for Google, but worse, it is a lot of work for you and me. I just can’t think of any alternative.

Google could make it easier by keeping the sample of links you must accept or refuse as small as possible, but I can’t think of any way to reduce the work beyond that. It seems to me that, like click fraud, this is one of those immensely scary problems that could kill the goose that lays Google’s golden eggs. Even if negative SEO is not widespread (yet), Google needs to nip this in the bud.

What do you think? Are there holes in my proposal to Google? If there are, let’s plug them or come up with an alternative. But we must come up with something to end this madness. Search marketing has a bad enough reputation without this. Let’s work together so that honest search marketers still have a chance to succeed on their own merits.


About Mike Moran

Mike Moran has a unique blend of marketing and technology skills that he applies to raise return on investment for large marketing programs. Mike is a former IBM Distinguished Engineer and a senior strategist at Converseon, a leading social consultancy. Mike is the author of two books on digital marketing, an instructor at several leading universities, as well as a Senior Fellow at the Society for New Communications Research.

17 replies to this post
  1. You say:
    “We need Google to come up with a way for honest marketers to protect themselves… Unfortunately, the possibilities I can come up with are few.”
    I suppose there could be an opposite of the “nofollow” command, within Webmaster Tools, where you list any domains with inbound links that you do NOT wish your site to be associated with. This has the secondary benefit of allowing you to “sculpt” your link profile, but if you really are associated with a site in some way… say yours and SELand, then you can’t easily “noaccept” one link, as presumably on the whole you’d appreciate the association.
    So my suggestion is a “NoAccept” list, at the domain level only, which you maintain in Webmaster Tools. Google can help by suggesting the spammiest links that it THINKS are pointing to your site, to help you choose.

  2. I think that’s close to what I am suggesting, Dixon. Perhaps you’re right that it could be done on a site basis rather than an individual link basis, but that is the idea I have in mind. But as I explained, Google will not show you the spammiest links, because then it would be showing the spammers what it knows. Instead it would have to show you a sample of linking domains, some spammy and some not, and let you choose which ones to accept links from, which it then extrapolates into its judgment about whether your site ought to be penalized.

  3. In my opinion your suggestions have two weaknesses.
    First, sorting out the good from the bad links (or domains) would mean even more work than you expect. I work as an online marketer on a professional basis, and even for me it is difficult to judge a link or a domain at first sight. Usually you have to do some research into whether a link or a domain is trustworthy or not. For example, the company I work at sends out press releases on a regular basis. Some authority websites publish this content with a link to us, because of the news value provided. On other domains the release gets published in order to build some crappy MFA landing pages. Sorting out these domains by hand would cost us a lot of extra work. Furthermore, Google will not help us in selecting which links to accept, as you state correctly. So I also do not know whether a link from a mediocre domain is going to hurt or benefit my domain. Do I accept the link or do I dismiss it? Maybe I throw away a great opportunity to increase my ranking, just because I am afraid to accept additional links?!
    Second, so far Google uses Webmaster Tools mainly to provide additional information to savvy webmasters who know their way around. Google is, for good reason, very reluctant to make Webmaster Tools a place where you make important choices for your domain(s). The option to influence one’s sitelinks is a careful first step in this direction. The reason for this is probably that the majority of webmasters are not aware that something like Webmaster Tools even exists. Especially over here in Germany, where I am located, even some of the biggest and best-ranking websites do not make use of Google’s tools. If the well-being of a domain relies heavily on actions taken within Webmaster Central, Google will need to communicate this feature much more strongly than it has so far. Still, there will be people who honestly do not know about Webmaster Tools and others who simply pretend not to. How can Google hand out penalties when most people do not know, or cannot know, how to avoid them?
    Despite this criticism, I think you are talking about a very important topic that needs further consideration. I am not able to provide a better solution for overcoming negative SEO at this moment, but I will spend some time thinking. Maybe one day I will come up with some proposals of my own.

  4. Thanks, Gunnar, for your well-thought-out criticisms. I think you are right about both of them, but I can’t think of any alternative. I agree that choosing which links to accept could be quite difficult, but today there are sites being penalized with nothing they can do about it. Difficult beats impossible.
    I also agree that Webmaster Tools is a well-kept secret to many people, but so are many practices in search marketing. (It helps me sell a lot of books, fortunately.) I think you’d have to do this through Webmaster Tools because you have to know the person’s identity.
    Thanks again for the feedback; I am looking forward to any possible alternatives from you (or from anyone else).

  5. I like the points you make, but they were at the end of a long and at times irrelevant-seeming stream of thought.
    For the sake of usability and kindness to your readers, why not add some paragraph headings so that people who aren’t interested in reading all the fluff can just skim the headlines and find the meat of the article?

  6. Thanks, Farhead, for being clear and concise with your point. :-)
    I will try to be briefer. My wife complains about the same thing; smart woman.

  7. The problem with negative SEO is who decides that it is negative. We are very white hat, but that doesn’t make us experts on positive and negative… Does it?

  8. Google Webmaster Tools does tell you every link to your site. You just can’t see it on screen. To get the full list, you have to download it from the external links section.

  9. Yes, you can download a complete list from Google Webmaster Tools, but I am not sure that list is an exhaustive one.
    I think that many links aren’t shown in that list.

  10. PLEASE, GOOGLE, ADD THIS “NOACCEPT” FUNCTION!!! I am a victim of Google bowling as we speak, and it sucks. How can I remove these links without contacting 1,000 sites manually? The links were added intentionally to sabotage my site!

  11. Mike, I would also love to see the “noaccept” tag supported. If we could point out the bad links to Google, it could greatly reduce the amount of this horrible activity that goes on.

  12. For some reason, Google results seem to favor negative content. I’m not sure we will ever know why, but it sure causes serious distress for those caught up in what others say. I wish there were a way to separate out results that seem to have a majority of negative comments. Any ideas? Maybe those types of results could have a separate heading tag or color code?

  13. Search engines use a variety of mathematical formulas to determine relevance, and spamdexing seeks to undermine that. Spam falls into two basic forms: content spam and link spam.

  14. I am not a fan of attacking others with SEO, such as spamming another site with bad links. But at the same time, I don’t get how Google can allow that to happen and hurt people’s rankings. It just means competitors can throw spam links at each other.
