So 2016 is now in full swing and Google is not sitting quietly. There are rumours that the next iteration of the Google Penguin algorithm is set to be released in March of this year. So what does this mean for your business or that of your clients?
Well, first, let's remind ourselves of what the Google Penguin update focuses on and what past updates have done to SEO.
The Penguin update was first launched in 2012 and was designed to reduce webspam. What is webspam? Well, in the past, SEO agencies and specialists realised that Google's rankings (and those of competing engines) were largely based on popularity. As such, large amounts of money were spent artificially inflating traffic to client sites using false and misleading blogs and third-party domains.
In addition, keyword stuffing, hiding content through scripting and other underhanded or ‘black hat’ techniques were squarely in the crosshairs. Google relentlessly punished these old-school techniques to try to force publishers and businesses to build quality, relevant, shareable and engaging content to drive traffic.
With the launch of the initial Penguin update, websites that had invested in black hat techniques instantly saw themselves relegated from the top natural search rankings. For many businesses, disavowing these old links was the only way forward, but even after webmasters completed this task, many sites took months to regain traction.
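For anyone unfamiliar with the mechanics, disavowing is done by uploading a plain-text file through Google's disavow tool. Lines beginning with # are comments, a `domain:` prefix disavows every link from that domain, and any other line is treated as a single URL. A minimal sketch (the domains and URLs below are placeholders, not real offenders):

```text
# Paid links from a blog network – owner did not respond to removal requests
domain:spammy-blog-network.example

# A single low-quality directory page linking to us
http://cheap-directory.example/listing/our-site.html
```

Google treats the file as a strong suggestion rather than an instant fix, which is part of why recovery took so long for many sites.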
This is how the Google Webmaster Tools platform became a ‘must have’ for businesses. This extension of the Google platform gave business owners, developers and agencies notifications of a variety of website issues – including low-quality links, mobile accessibility, 301 and 302 redirect issues and more. It also provided a central location to submit XML sitemaps directly to Google.
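Those XML sitemaps follow the sitemaps.org protocol: a `<urlset>` of `<url>` entries, each with a required `<loc>` and optional hints about freshness and relative importance. A minimal example (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child element -->
    <loc>http://www.example.com/</loc>
    <lastmod>2016-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Submitting a file like this through Webmaster Tools helps Google discover and re-crawl pages faster, which matters when you are waiting for a penalty reassessment.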
So what is in store for this new update?
Well, no one is 100% sure yet, however speculation in this area is often right. At present, there are two main areas where we need to focus our attention:
1. Penguin and Panda are working closer and closer together
Recent updates to Google Panda have seen the continued use of algorithms to help determine what a ‘valuable’ website is. Google has published information on what counts as a high-quality site that will rank well in SERPs:
Our site quality algorithms are aimed at helping people find “high-quality” sites by reducing the rankings of low-quality content. The recent “Panda” change tackles the difficult task of algorithmically assessing website quality. Taking a step back, we wanted to explain some of the ideas and research that drive the development of our algorithms.
Below are some questions that one could use to assess the “quality” of a page or an article. These are the kinds of questions we ask ourselves as we write algorithms that attempt to assess site quality. Think of it as our take at encoding what we think our users want.
Of course, we aren’t disclosing the actual ranking signals used in our algorithms because we don’t want folks to game our search results; but if you want to step into Google’s mindset, the questions below provide some guidance on how we’ve been looking at the issue:
- Would you trust the information presented in this article?
- Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
- Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
- Would you be comfortable giving your credit card information to this site?
- Does this article have spelling, stylistic, or factual errors?
- Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
- Does the article provide original content or information, original reporting, original research, or original analysis?
- Does the page provide substantial value when compared to other pages in search results?
- How much quality control is done on content?
- Does the article describe both sides of a story?
- Is the site a recognized authority on its topic?
- Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
- Was the article edited well, or does it appear sloppy or hastily produced?
- For a health related query, would you trust information from this site?
- Would you recognize this site as an authoritative source when mentioned by name?
- Does this article provide a complete or comprehensive description of the topic?
- Does this article contain insightful analysis or interesting information that is beyond obvious?
- Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
- Does this article have an excessive amount of ads that distract from or interfere with the main content?
- Would you expect to see this article in a printed magazine, encyclopedia or book?
- Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
- Are the pages produced with great care and attention to detail vs. less attention to detail?
- Would users complain when they see pages from this site?
Writing an algorithm to assess page or site quality is a much harder task, but we hope the questions above give some insight into how we try to write algorithms that distinguish higher-quality sites from lower-quality sites. [Source]
2. Blogging sites may cause you damage
Many websites have seen an uplift in SEO benefits from reaching out to third-party bloggers and driving traffic from niche, keyword-rich sites. This, however, may come at a cost in the future. As Panda continues to assess the value of published content as well as the sites that host it, any negative assessments may impact your engagement with these sites.
Many SEO specialists have invested large sums of money developing these sites. In most cases, they have built them by duplicating content from a number of sources but, by centralising it at a single URL, have gained traction with Google as a credible source. As Panda traces content through canonical links and content dates, these sites might find themselves penalised – and their loss can be your loss.
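A canonical link is simply a `<link>` element in a page's `<head>` that tells search engines which URL is the original version of the content; when it points at another domain, Google can follow the trail back to the true source. A minimal illustration (the URL is a placeholder):

```html
<head>
  <!-- Signals that the original version of this content lives at another URL -->
  <link rel="canonical" href="http://www.original-source.example/article" />
</head>
```

Sites republishing content without honest canonical tags are exactly the kind of duplication Panda is built to detect.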
3. Real-time Penguin
Google has admitted that its updates, including Penguin, may have been a little aggressive (some webmasters have seen penalties lasting up to 2 years!) and, given that ‘re-running’ Penguin is a difficult task, Google has mentioned that it is looking to introduce a real-time Penguin algorithm. This could mean faster assessments but hopefully reduced sentences for offenders (possibly 12 months or less).
So should I be worried about Penguin 4?
Well, yes, we should all be worried about the impact that changes to Google's search algorithms have on our sites and those of our clients; but for the most part, this update will primarily impact businesses that are buying linked traffic. It is a good idea to make sure that the sites you are working with are independent and that the content being published is unique and relevant.
If you are buying traffic from an SEO expert, beware. You could be the next URL hit with the dreaded Google penalties.