Google to Penalize Over SEO’d Sites

Some breaking news from SXSW spotted by Barry Schwartz: Google’s Matt Cutts said that they are actively working on an algorithm update to target what Matt called “overly-optimized” sites, where aggressive SEO has given them a somewhat unfair ranking advantage, sometimes over sites that have better content.

Here is the direct quote from Matt Cutts:

We don’t normally pre-announce changes but there is something we are working in the last few months and hope to release it in the next months or few weeks. We are trying to level the playing field a bit. All those people doing, for lack of a better word, over optimization or overly SEO – versus those making great content and great site. We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, whether they throw too many keywords on a page, or whether they exchange way too many links or go well beyond what you normally expect in a particular area. It is an active area where we have several engineers on my team working on this right now.

What Does This Mean for SEO?

From parsing Matt Cutts’ statement, it sounds very much like they’re targeting two things: too many keywords stuffed on the page, and excessive reciprocal or 3-way linking.

On the keyword stuffing front, this is a common trap that beginner SEOs often fall into — and it often works — where a page repeats the keyword they're trying to rank for again and again and again, often to the point that it doesn't sound natural. In other words, the quality of the text on the site is actually worse because of the keyword stuffing. In this case it's pretty obvious that the site is being SEO'd heavily, and those SEO tactics are detrimental to the quality of the site. What Google is saying is that perhaps that site doesn't deserve to rank as highly as a site with equally good or better content that hasn't used these over-aggressive tactics.

So sites with excessive on-page optimization — those guys that talk about keyword density formulas — are likely to see a ding in rankings. This isn't a penalty, per se; it's just removing a benefit that sites really shouldn't have gotten in the first place.
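Google hasn't published how it measures any of this, but the "keyword density" idea those formulas revolve around is simple to sketch. Here's a rough illustration of the naive metric — occurrences of a phrase per hundred words — with made-up example text; this is what the density crowd computes, not anything Google has confirmed using:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Naive keyword density: occurrences of the phrase per 100 words."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Slide a window of the phrase's length across the word list.
    hits = sum(
        1
        for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits / max(len(words), 1)

# A deliberately over-stuffed snippet (hypothetical example):
text = ("Cheap widgets for sale. Our cheap widgets are the cheapest "
        "widgets. Buy cheap widgets today from the cheap widgets store.")
print(keyword_density(text, "cheap widgets"))  # 20.0
```

A 20% density like this reads as unnatural to a human immediately, which is exactly the kind of signal an "over-optimization" classifier could pick up on.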

The excessive link exchange angle is more interesting to me. For a long time we've known that Google can detect reciprocal links and even use them as a spam signal. It sounds like they may be turning the dial up on that signal, or incorporating it into a Panda-like algorithm. This is a delicate process because a lot of niches have heavy inter-linking between sites, where all of the authoritative authors in the industry know each other and link to each other — and those links are natural, not spam.
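Detecting reciprocal links in a crawl is conceptually straightforward, which is part of why this signal is so plausible. A minimal sketch, treating the link graph as a set of directed site-to-site edges (the domain names here are placeholders — this illustrates the idea, not Google's actual pipeline):

```python
def reciprocal_pairs(links):
    """Given directed (source, target) link edges between sites,
    return the unordered pairs of sites that link to each other."""
    edges = set(links)
    return {
        tuple(sorted((a, b)))
        for (a, b) in edges
        if (b, a) in edges and a != b  # the reverse edge also exists
    }

links = [
    ("a.com", "b.com"), ("b.com", "a.com"),  # reciprocal pair
    ("a.com", "c.com"),                      # one-way editorial link
]
print(reciprocal_pairs(links))  # {('a.com', 'b.com')}
```

The hard part isn't finding the pairs — it's deciding which ones are spammy, since (as noted above) a dealer linking to its manufacturer and back is perfectly natural. Presumably that's where the "several engineers" come in.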

It will be interesting to see how this plays out, and whether they’ll be targeting other low-quality linkbuilding tactics.

Matt Cutts also commented that this is not Google attacking SEO or the SEO industry — instead it's targeting "people who take it too far." He said they're trying to make sure that if you're white hat you're not affected, but if you're going way beyond the pale you'll get hit.

A lot of people seem to think that Google hates SEO, and that’s not the case… SEO can often be very helpful.


3 Comments

  • Great post Brian! I remember the days of “keyword density” being a big deal and I was glad when it went away. So much better to focus on useful, readable content!

    I wonder though, how Google is going to decide what is “excessive” reciprocal linking…I have a lot of industrial manufacturers I work with who have plenty of reciprocal links with their dealers/distributors. Since that is the majority of some of their links, I’m hoping that Google is taking legit links into account.

    Comment by Anna

  • I think one of the interesting implications about this statement is that Matt Cutts is basically acknowledging that things like keyword stuffing and reciprocal linking are still effective, at least in some cases, causing some sites to rank better than they should.

I have a hunch that if the majority of links come from reciprocal linking, that may be a problem — from Google's point of view, that's a site that doesn't have any "votes" from independent, unbiased sites, only from people it's associated with. Or possibly "excessive" will only mean sites with hundreds or thousands of reciprocals, or sites whose reciprocals are all from grossly unrelated pages, link lists, etc.

    Comment by Brian

From what I am seeing, people are already using this to hurt their competition. I recently read a few case studies where pointing certain links at a competitor dropped them off the first page pretty quickly. This has been repeated using certain auto-linking setups. This is going to be really annoying. Now you can actually be affected by negative SEO work, which can certainly damage a site. It sounds like Google is making a big mistake with this.

    Comment by John
