Google Panda Update

Google makes over 500 changes to its ranking algorithm every year. Most of these are minor and nearly invisible, but occasionally Google releases a major update or supplemental algorithm that significantly reshuffles rankings and can cause a well-ranking site to vanish from the top search engine results.

The Google Panda update has caused a lot of frustration among site owners and SEOs over the last year. The updates are designed to improve search engine results and punish shady tactics, but every change also catches some innocent sites along with the spam.

Here are the details about how Google’s Panda update works, what it targets, and how to recover from Panda penalties. As always, you can contact the Ecreative team for sustainable ranking help, including recovering from Panda and building a strategy for long-term SEO success.

The Panda Update

Google first launched the Panda update on February 24, 2011 to target low-quality sites and improve the quality of search results. Sites that fall victim to Panda typically see a site-wide ranking penalty on nearly all search phrases, with nothing other than the name of the site capable of ranking in the first several pages of Google’s search results.

Sites hit by Panda see rankings fall from first-page positions down to the 80s or even below 100. You can verify that Panda has affected you by using analytics to match the drop-off in Google-specific search traffic with one of the dates the Panda update ran. We maintain a list of the dates Panda has run that you can compare against.
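
To make that date check concrete, here is a minimal Python sketch. The traffic numbers are made up, the `panda_suspect` helper is purely illustrative, and the 50% drop threshold is our own assumption, not a Google figure; the two run dates shown (February 24 and April 11, 2011) are documented Panda rollouts.

```python
from datetime import date

# Hypothetical daily Google organic sessions exported from your analytics
sessions = {
    date(2011, 4, 10): 1180,
    date(2011, 4, 11): 1210,
    date(2011, 4, 12): 430,   # sharp overnight drop
    date(2011, 4, 13): 410,
}

# Two documented Panda run dates (compare against the full list we maintain)
panda_dates = [date(2011, 2, 24), date(2011, 4, 11)]

def panda_suspect(sessions, panda_dates, drop_threshold=0.5, window_days=2):
    """Return days where traffic fell by more than drop_threshold
    day-over-day within window_days of a known Panda run."""
    suspects = []
    days = sorted(sessions)
    for prev, cur in zip(days, days[1:]):
        dropped = sessions[cur] < sessions[prev] * (1 - drop_threshold)
        near_run = any(abs((cur - p).days) <= window_days for p in panda_dates)
        if dropped and near_run:
            suspects.append(cur)
    return suspects

print(panda_suspect(sessions, panda_dates))  # [datetime.date(2011, 4, 12)]
```

A drop that lines up with a run date this cleanly is strong (though not conclusive) evidence of a Panda hit; a drop on an unrelated date points to some other cause.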

The Panda update targets the following characteristics of a website:

  • Duplicate content: identical content on many pages of the site without canonical tags, or identical content on the site that exists on other sites on the internet. Ecommerce sites that copy & paste their product descriptions from the manufacturer can fall victim to this, since those descriptions are often copied throughout the web.
  • Thin content: many pages with very little text content. Many pages with only a sentence or two can trigger Panda, as these pages tend not to provide much value to users. In the past, some SEOs tried making sites with thousands of nearly empty pages in an attempt to game the algorithm.
  • Excessive ads: too many ads is one of the low-quality signals that Panda targets.
  • Little above-the-scroll content: if a site doesn’t have enough content above the scroll (the portion of the page that can be read without scrolling down), it is considered a poor user experience. In particular, devoting too much top-of-page space to ads, or to content blocks that look like ads to Google, can trigger Panda.
  • Too many blocks: if too many users have chosen to block the site using the block feature in Chrome, this can also trigger a Panda penalty.

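As a rough illustration of auditing for the thin-content trigger above, the sketch below counts the visible words on each page and flags pages that come up short. It uses only Python’s standard library; the 100-word threshold is our own rule of thumb (matching the guidance later in this article), not a number Google publishes.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect a page's visible text, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.skip = False
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False
    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

def word_count(html):
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

# Hypothetical crawl results: URL -> page HTML
THIN = 100  # our assumed minimum word count per page
pages = {
    "/widget-a": "<html><body><p>Blue widget. Ships fast.</p></body></html>",
}
thin_pages = [url for url, html in pages.items() if word_count(html) < THIN]
print(thin_pages)  # ['/widget-a']
```

Run against a full crawl of your site, a long list of flagged URLs is a sign that thin content may be contributing to the penalty.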
Panda is not a built-in part of Google’s ranking algorithm. Instead it is a separate algorithm that runs periodically. The advantage of this is that you can compare rankings drops to the last date Panda ran to confirm if a rankings loss is due to Panda. The downside is that even if you correct the problems overnight, you’ll have to wait a month or two for the next time Panda runs to see the recovery from the penalty.

Panda Recovery

If you think your site may have been hit by a Panda penalty, the first step is to verify that it’s a Panda-related penalty and not some other ranking drop. First check your Google Webmaster Tools for warnings from Google that might indicate another kind of penalty, such as a Penguin hit or a manual action. Then verify that the date of the rankings or traffic drop lines up with a date that Panda ran.

If you’ve verified Panda, then you or your SEO team needs to do a complete site audit against the list of Panda triggers above and determine which aspects of your site may be causing the penalty. By far the most common cause of a Panda penalty is duplicate content. Spot-check your content by running Google exact-match searches to see if that content is duplicated across the internet. This will also reveal internal duplicate content issues.
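
An exact-match search is simply a snippet of your page copy wrapped in double quotes. As a small sketch (the `exact_match_query` helper and the sample snippet are illustrative), this builds such a query URL so you can spot-check a sentence from a product description:

```python
from urllib.parse import quote_plus

def exact_match_query(snippet):
    """Wrap a snippet in quotes so Google returns only exact matches."""
    return "https://www.google.com/search?q=" + quote_plus(f'"{snippet}"')

# A sentence lifted from one of your product descriptions
snippet = "Durable anodized aluminum housing resists corrosion"
print(exact_match_query(snippet))
```

If the results show the same sentence on the manufacturer’s site or on competitors’ sites, that page is a duplicate-content candidate; if it shows up on several of your own URLs, you have an internal duplication problem to fix with canonical tags or a rewrite.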

Make sure that almost every page of your site has a substantial amount of unique content (at least 100 words on the low end) and that your site uses canonical tags to avoid internal duplicate issues. Ensure that the site has no more than 3 ads on a page, and that those ads don’t push the unique page content below the scroll; a substantial amount of content should be visible, not just a line or two. This also means looking for site template content blocks that could appear to be ads to Googlebot, such as subscribe forms and internal banners.
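
To audit canonical tags across your pages, you can check each page’s head for a rel="canonical" link. Here is a minimal sketch using Python’s standard html.parser; the page markup is a made-up example:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Record the rel="canonical" URL on a page, if one is present."""
    def __init__(self):
        super().__init__()
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "link" and attr_map.get("rel") == "canonical":
            self.canonical = attr_map.get("href")

# Hypothetical product page markup
html_page = """<html><head>
<link rel="canonical" href="https://example.com/products/widget-a">
</head><body>Product details here.</body></html>"""

finder = CanonicalFinder()
finder.feed(html_page)
print(finder.canonical)  # https://example.com/products/widget-a
```

Pages that print None have no canonical tag, so any URL variations (tracking parameters, sort orders, pagination) can look like internal duplicate content to Google.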