Recently at the SMX Advanced conference, Google’s Matt Cutts revealed that a new update to the Google Panda algorithm has been approved and will roll out shortly. This update targets scraper sites that outrank the original sources for the very content they have stolen and republished as if it were their own, a common complaint in the wake of the Panda update.
This upcoming algorithm change is being referred to as Panda 2.2.
In the same Q&A, Matt Cutts stated that Google has not made any manual exceptions to the Panda algorithm. A common theory among SEO professionals was that Google had manually redeemed certain popular or newsworthy sites that were negatively impacted by Panda, perhaps deservedly so. Instead, Cutts said that the tweaks Google made to the Panda algorithm after it went live could account for the bounce back of some sites; they did not manually make certain sites “immune” from Panda.
Matt Cutts also noted that Panda was not specifically targeted at site usability, but that webmasters should pay attention to usability because it’s good practice, not because Google says so.
Then again, Google has a history of trying to build what it considers good website practices into its algorithm, because those are the sites that offer a better user experience. Google’s goal, after all, is to deliver the best websites to users, since that keeps people using Google, which in turn enables it to sell more ads.