Early in 2011, Google launched Panda, a search results algorithm which filtered out websites with thin, low quality content. This was the start of a series of major quality control checks. Google Panda stripped search results pages (SERPs) of poorly constructed, spammy content, enabling higher quality websites to rise to the top.
One of Google’s primary targets with Panda was ‘content farms’: sites which churned out low quality content that tended to rank simply because of the sheer quantity of copy. For Google, always striving to deliver high quality results for an optimal user experience, this was a huge concern. Through the Panda algorithm, Google dealt a serious blow to content spammers and effectively removed content farms from its results.
Since its launch, the algorithm has become one of Google’s core ranking signals. It is under constant development, becoming increasingly sophisticated in its evaluation of what counts as low quality content and raising the bar for websites wishing to rank well.
How does Panda work?
Recognising shallow content and low quality websites may be easy for a user, but for search engines it’s a very tricky process, especially on the vast, varied web. Amit Singhal, Vice President of Search at Google, told Wired magazine that in order to solve this complex issue, the process had to be as scientific and mathematical as possible. Google researchers then devised a rigorous set of questions (some of which are listed here) which human testers used to review a selection of domains. From these questions and reviews, the Google team derived a set of ranking signals which formed the definition of what would be considered low quality content.
Google is constantly changing and advancing the signals and metrics it uses to determine a website’s value. This enables Google to stay on top of what is considered good and bad content, and continuously provide excellent user experience.
What does Panda target?
The Google Panda update targets websites with the following:
- “Thin” onsite content: Domains lacking quality content across many pages tend not to provide a valuable user experience. This could mean pages with just a few sentences, or pages with an inarticulate mass of words. Spelling and grammar matter!
- Duplicate content: If there is a high volume of duplicate content (pages with very similar or exactly the same content), then this may be a signal of search engine manipulation. In the past, domains may have created duplicate pages all targeting the same keyword to try and increase their chances of ranking for that term.
- Machine-generated content: Also known as ‘spun content’, this is copy automatically produced by software to fill webpages with keyword rich, but ultimately poor quality, information.
- Excessive onsite adverts: Pages which are inundated with adverts compromise user experience.
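To illustrate why near-identical pages are easy for an algorithm to flag, here is a minimal sketch of one classic near-duplicate check: comparing the overlap of word “shingles” between two pages. This is purely illustrative (the sample page texts are invented), not a description of Google’s actual implementation:

```python
def shingles(text, k=3):
    """Split text into overlapping k-word 'shingles'."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical pages differing by a single keyword swap:
page_a = "our widget is the best widget for cleaning windows quickly"
page_b = "our widget is the best widget for cleaning cars quickly"
print(jaccard_similarity(page_a, page_b))  # high overlap despite the edit
```

A score near 1.0 marks pages as near-duplicates; real systems use far more signals, but the principle of measuring textual overlap is the same.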
Panda is not a penalty
While there have certainly been instances of entire domains plummeting in rankings as a result of Panda updates, the algorithm is not a penalty. It is still possible for sites which have been targeted by Panda to rank well overall. This is because rather than devaluing entire domains with spammy content, it simply demotes the offending pages to lower SERP positions.
A Moz post discussing Google Panda quotes a Google spokesperson who confirmed this:
“The Panda algorithm may continue to show such a site for more specific and highly-relevant queries, but its visibility will be reduced for queries where the site owner’s benefit is disproportionate to the user’s benefit.”
Preventing Google Panda from negatively impacting your site is actually quite simple. The best way to reliably and sustainably improve SEO is to produce high-quality, unique content, and not cut any corners along the way.
Digital marketers are always striving to build campaigns which pander to Google’s algorithm preferences. From content strategy to technical website construction, there are a number of considerations to keep in mind:
The page must provide the user with the information, solution or service promised.
If it doesn’t, either the content, the design, or the onpage features aren’t up to scratch. This will drive people away from the site, causing a high bounce rate, poor user experience and ultimately reduced visibility.
A domain should have strong external links as references.
One way to show search engines that you have collated reliable information into a high quality informational resource is by referencing your sources. If Google can see that you are striving to provide your user with accurate information, it understands you are a responsible content curator and will reward you with boosted visibility. These links can also help search engines make associations between your site and the high quality, topic-relevant sites being referenced.
Backlinks are vital.
Building a great backlink portfolio (a collection of reputable websites with links pointing to your site) is an essential part of any SEO strategy. Panda plays an important role in this, favouring pages with a high number of external sources linking to them, particularly from pages which mention the linked-to content in a positive light. These positive, relevant backlinks show that you are a valuable resource and that other people regard you as such.
Panda reads your reviews.
If you have a service or product which can be reviewed by independently verified sources, then make sure you are offering users the chance to assess it on your site. 5-star ratings presented in a format Google can understand will help the search engine recognise the quality of your service, and therefore of your user experience.
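One machine-readable format for ratings is schema.org structured data, embedded in a page as JSON-LD. As a sketch only (the product name and figures below are invented for illustration), such a snippet could be generated like this:

```python
import json

# Illustrative values only -- substitute your real product and review data.
review_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "bestRating": "5",
        "reviewCount": "128",
    },
}

# The JSON output is placed in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(review_markup, indent=2))
```

Google’s own documentation and testing tools should be consulted before deploying markup like this, as eligibility rules for rich results change over time.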