10 Years of Google Updates – The Impact


It was almost a decade ago that Google set out to dramatically improve the quality of the results its search tool supplied. In that time, Google has succeeded in dominating the search engine space and has not given up on its ambition to make search better. Each year, Google releases 500-600 “tweaks” to its worldwide algorithms to ensure users are presented with the most relevant and highest-quality search results possible.

While many of these updates are minor, several have represented seismic shifts. This article walks through the most significant Google algorithm updates and provides a summary strategy for staying compliant with each of them.

Panda

Panda represents the very first update that fundamentally changed search engine optimization forever. The updated ranking criteria in Panda started the white-hat SEO revolution that continues to this day. The Panda algorithm update penalized sites with low-quality and spammy content by down-ranking them in the search results. When it launched in 2011, Panda was only a filter applied on top of the main algorithm; in 2016, it became a part of Google’s core algorithm.

What to Watch out for

  • Duplicate content (internal and external)
  • Keyword stuffing
  • Thin content
  • User-generated spam
  • Irrelevant content

What to do

1. Get the full list of your webpages

The easiest way to identify internally duplicated or thin content and keyword stuffing is to run site audits regularly. An important note is that since the Panda score is assigned to the whole site and not to separate pages, one low-quality page can devalue an entire website. Given that, it’s important to go through every single page of a site to identify low-quality content that is dragging the site’s overall quality rank down.
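For illustration, below is a minimal Python sketch that pulls every URL from a standard XML sitemap. It assumes the site publishes a sitemap at /sitemap.xml (the example.com address is a placeholder); dedicated crawl-based audit tools accomplish the same thing at scale.

```python
# A minimal sketch, assuming the site exposes a standard XML sitemap.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def list_pages(sitemap_url):
    """Return every <loc> URL listed in a standard XML sitemap."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

if __name__ == "__main__":
    for url in list_pages(SITEMAP_URL):
        print(url)
```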

2. Find Any Pages with Duplicate Content

Once there is a clear breakdown of all the webpages in the site, it’s time to find any that have internally duplicated content. Once identified, the options are to either fix any technical issues that caused duplication or edit pages to include new content. Although often overlooked, it’s also important to diversify page titles and meta descriptions to ensure the biggest boost. Furthermore, similar-looking pieces of content can be interpreted as a sign of content automation, so it is crucial to vary the structure of pages as well.
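As a rough sketch of how such a check can work, the snippet below compares the extracted text of page pairs with Python's difflib and flags pairs above a similarity threshold. Production audit tools use faster shingling and hashing, but the idea is the same; the pages mapping here is invented for illustration.

```python
# Flag near-duplicate pages by comparing their visible text.
from difflib import SequenceMatcher
from itertools import combinations

def find_near_duplicates(pages, threshold=0.9):
    """Yield URL pairs whose text similarity exceeds the threshold."""
    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, text_a, text_b).ratio()
        if ratio >= threshold:
            yield url_a, url_b, ratio

pages = {  # placeholder URL -> extracted body text
    "/red-widgets": "Our red widgets are durable, affordable, and ship free.",
    "/blue-widgets": "Our blue widgets are durable, affordable, and ship free.",
}
for a, b, score in find_near_duplicates(pages, threshold=0.8):
    print(f"{a} and {b} are {score:.0%} similar")
```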

3. Identify Pages with Thin Content

Thin or light content is another Panda no-no. With the introduction of semantic search, Google needs to understand what a webpage is about. When there is very little content on a page, it makes it very hard for the search engine to identify the “why” behind a page. As a result, Google views pages with little content as irrelevant and doesn’t rank them high.

Regrettably, Google does not provide guidelines on how many words a piece of content needs in order not to be considered thin. Furthermore, pages that are light on content sometimes perform surprisingly well and even get featured in the rich snippets section. That being said, having too many pages with thin content runs the risk of flagging a site under Panda. A good starting point is to focus on pages with fewer than 250 words and add to them when possible; a quick way to surface those pages is sketched below.
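This snippet extracts the visible text of each page with BeautifulSoup and flags anything under 250 words. The cutoff is the heuristic suggested above, not an official Google threshold, and the URL list is a placeholder.

```python
# Flag potentially thin pages by visible word count (250 is a heuristic).
import requests
from bs4 import BeautifulSoup

def word_count(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()  # strip non-content elements before counting
    return len(soup.get_text(separator=" ").split())

urls = ["https://www.example.com/about"]  # placeholder list of site pages
for url in urls:
    count = word_count(url)
    if count < 250:
        print(f"Possibly thin ({count} words): {url}")
```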

4. Find pages with externally duplicated content

Google believes that every page on the internet should add value, and this can’t be achieved with duplicate or plagiarized content. Given the time and effort that goes into producing high-quality content, it’s a good practice to check Copyscape for any external duplication.

Some industries, such as online retail, won’t always have 100% unique content given the nature of multiple businesses all selling the same products. There are two approaches to this: either make the product pages stand out with additional product data, or use product reviews, customer testimonials, and comments to expand the content.

In the event another business is stealing valuable content, it’s a good idea to contact the webmasters to ask for the plagiarized content to be taken down, or to submit Google’s content removal form.

5. Make Sure there is no Keyword Stuffing

Keyword stuffing is an old practice that has been effectively outlawed since the release of Panda. Essentially, keyword stuffing is when a webpage is overloaded with keywords in an attempt to manipulate the search results and gain a higher ranking. To ensure a webpage is not guilty of this, go through the keywords in titles and meta descriptions, meta keyword tags (which are outdated as a whole), and keywords in the body. If the keywords flow unnaturally or the content isn’t readable by humans, it’s time to adjust.
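One quick sanity check is to compute the density of the most common words in a page's body copy. There is no official “safe” density, so the 3% cutoff in this sketch is an illustrative heuristic only, and page.txt stands in for previously extracted page copy.

```python
# A quick keyword-density check over extracted body text.
import re
from collections import Counter

def keyword_density(text, top_n=10):
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    return [(w, c / len(words)) for w, c in counts.most_common(top_n)]

body_text = open("page.txt").read()  # placeholder: extracted page copy
for word, density in keyword_density(body_text):
    flag = "  <-- unusually dense" if density > 0.03 else ""
    print(f"{word}: {density:.1%}{flag}")
```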

Penguin

Following the release of Panda, Google introduced another update called Penguin that clamped down on sites with manipulative, spammy links. Historically, link building has always been one of the most powerful ranking factors for search engines. There was a time when Google’s algorithm couldn’t determine the quality of those links, and so for many years, SEO “experts” created low-quality backlinks in droves. By buying links or using unethical techniques to manipulate the search results, businesses were able to circumvent the quality requirements and still rank on page one. Google realized this behavior needed to be stopped and introduced Penguin to put an end to it. Penguin works in real-time, checking every internal and external link connected to every website. Like Panda, Penguin is now a part of Google’s core algorithm.

What to Watch out for

  • Links from spammy sites
  • Links from topically irrelevant sites
  • Paid links
  • Links with overly optimized anchor text

What to do

1. Get the full list of backlinks

The very first thing to do when preparing to address spammy links is to extract the full list of backlinks pointing to the site.

2. Identify spammy links

Once the backlink list has been compiled, the next step is to identify which links are toxic. The most convenient way to do this is to evaluate each link against parameters such as profile diversity, the age of the linking domain, and the amount of link authority each link shares. Once the low-quality links are identified, they can be addressed; a rough triage pass is sketched below.
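The snippet below runs an illustrative triage over an exported backlink list. The CSV columns and cutoff values are hypothetical; in practice, the data would come from a Search Console export enriched with third-party link metrics.

```python
# Illustrative triage of a backlink export; column names are hypothetical.
import csv

def looks_toxic(row):
    """Flag links that fail the rough quality checks described above."""
    return (
        int(row["domain_age_years"]) < 1       # very young linking domain
        or int(row["domain_authority"]) < 10   # little authority to share
        or row["topically_relevant"] == "no"   # unrelated subject matter
    )

with open("backlinks.csv", newline="") as f:   # hypothetical export file
    toxic = [row["url"] for row in csv.DictReader(f) if looks_toxic(row)]

print("\n".join(toxic))
```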

3. Get rid of toxic links

Now that the toxic backlinks have been identified and it’s been decided which of them need to be removed, it’s time to break the connection. One possible way to do that is by contacting the webmasters and asking them to remove spammy links pointing to the site.

4. Disavow spammy links

Unfortunately, not every website owner will acknowledge and act upon a request to remove a link. In those cases, it’s wise to disavow the harmful links with the help of the Google Disavow tool. When links are disavowed, it lets Google know to ignore them when a site is being ranked.

Occasionally it may be necessary to disavow an entire domain; doing so ensures that no other spammy links coming from it are missed. Deciding whether to disavow the backlinks to a particular page, disavow them and exclude them from all future updates, or disavow and blacklist them so they are removed permanently is an exercise in judgment and will vary by project.
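The Disavow tool accepts a plain-text file with one URL or domain entry per line and “#” for comments. The sketch below writes such a file from a list of flagged links; the entries shown are invented examples.

```python
# Write a disavow file in the plain-text format Google's Disavow tool accepts.
toxic_urls = ["http://spammy.example/buy-links.html"]  # example entries
toxic_domains = ["link-farm.example"]                  # disavow whole domains

with open("disavow.txt", "w") as f:
    f.write("# Links identified as toxic during the latest audit\n")
    for url in toxic_urls:
        f.write(url + "\n")
    for domain in toxic_domains:
        f.write("domain:" + domain + "\n")
```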

The Exact Match Domain update

Also released back in 2012 was the Exact Match Domain update. Enterprising individuals had found a way to make poor-quality sites rank well in search results by using exact-match search queries as domain names. Before this update, when Google saw a specific query and a domain that matched it 100%, the site was automatically deemed the most relevant and received the first ranking. Given the very obvious flaw in that, Google introduced the Exact Match Domain update to remove the preference given to sites with exact match domains, and in a lot of cases, it removed them from Google’s top positions.

What to Watch out for

  • Exact match domains with thin content

What to do

There is no inherent issue with using an exact match domain if it’s paired with high quality, pertinent content. Given that, the best approach is to identify pages with thin content (as outlined above), and if any are found, expand them with relevant and original content.

There is little to no SEO benefit to having one of these domains in 2020, but it can still be beneficial from a marketing perspective. To raise a website’s authority, it’s better to invest the time and effort into quality link building instead of securing the “perfect” domain.

Piracy

Internet piracy represents one of the few areas where Google has consistently struggled to make headway, but that hasn’t stopped it from introducing major changes. A decade ago, the Internet was inundated with pirated movies, music, books, etc. Google saw that it had a responsibility to respect and protect copyright owners and rolled out a major update in the form of Pirate. The aim of the update was to penalize sites that violate copyrights.

What to Watch out for

  • Copyright violations

What to do

1. Publish Original Content

Understanding that Google values unique content above almost anything else, publishing it is a good way to boost a website’s ranking. When brainstorming and writing content, resist the temptation to steal pre-existing content; Google will notice it, and the penalties are steep. Furthermore, scraped content delivers very little benefit for the effort involved.

2. Request the Removal of Pirated Content

The Internet is an exceptionally crowded space from a content perspective, with humans creating an average of 5,760,000 blog posts per day. Given that, it’s hard for Google to identify and fight piracy on its own. If another site steals owned content, or if competitors use pirated content, help Google protect its integrity by submitting a request through the Removing Content From Google tool.

Hummingbird & RankBrain

Following up on earlier releases, which were designed to lay the framework for ranking on Google by giving priority to high-quality sites, Google started working to understand search intent. The goal of pairing results to intent prompted the rollout of Hummingbird in 2013 and then of RankBrain in 2015. While these updates both served to provide an improved understanding of the meaning behind a particular query, they perform different functions.

Hummingbird’s primary purpose is to interpret queries and provide the results that best match the search intent. Prior to Hummingbird, Google looked solely at the separate words within the query to figure out what the user wanted. Since the release of Hummingbird, Google looks at the combination of words and also considers the context.

RankBrain is a machine learning system that assists Google in processing uncommon or unique searches. When it was released, it was positioned as an addition to Hummingbird that draws on historical data and previous user behavior. RankBrain looks at all queries, the search results that match them, and the results that users view. From this, it attempts to understand the reasoning behind selecting the pages the users chose to better predict results for future unknown queries.

What to Watch out for

  • Exact-match keyword targeting
  • Unnatural language
  • Lack of query-specific relevance features

What to do

1. Diversify Site Content

The era of using a few short-tail keywords throughout your content to successfully rank a website on page one of Google is over. If the goal is a page one placement in 2020, diversity in content and using related terms and synonyms are crucial.

Furthermore, it’s highly worthwhile to use natural language, with a special focus on presenting content in the form of questions. This has the added benefit of raising the chances of being included as a featured snippet.

2. Carry out TF-IDF analysis

Optimizing for Hummingbird and RankBrain is primarily focused on improving a page’s relevance and completeness. Using a TF-IDF analysis tool is a great way to understand what can be improved. Essentially, this method surfaces relevant keywords by analyzing the top 10 competing pages for a query and collecting the terms they have in common.
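A bare-bones version of that analysis can be run with scikit-learn, as sketched below. The competitor_texts list is assumed to hold the extracted body copy of the top-ranking pages for the target query; high-weight terms the competitors share but your page lacks are candidates to add.

```python
# Minimal TF-IDF pass over competitor page copy with scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer

competitor_texts = [
    "ceramic coffee mug dishwasher safe handle",   # placeholder body copy
    "coffee mug ceramic large handle gift",
    # ... body text of the remaining top-ranking pages
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(competitor_texts)

# Average each term's TF-IDF weight across all competitor pages.
mean_weights = matrix.mean(axis=0).A1
terms = vectorizer.get_feature_names_out()
for term, weight in sorted(zip(terms, mean_weights), key=lambda t: -t[1])[:10]:
    print(f"{term}: {weight:.3f}")
```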

Pigeon/Possum

Back in 2014, Google recognized that ranking as a local business was exceedingly difficult, which presented a significant problem as local results tend to be the most relevant when a user is searching for something nearby. Understanding that, Google undertook a plan to improve the quality of local search results and rolled out Pigeon, followed by Possum two years later.

The primary purpose of the Pigeon algorithm update was to create a closer relationship between Google’s local search algorithm and its core algorithm. Following the update, the same SEO factors are considered when ranking both local and non-local search results. These changes resulted in a considerable boost for local directory sites. The Pigeon update also created much closer ties between Google Web search and the Google Maps search.

In 2016, Google proceeded to change the local SEO landscape once again by introducing its Possum update. Following the Possum update, Google began ranking search results based on the geographical location of the searcher. For the first time, if the searcher was physically closer to a company’s location, there was a greater chance it would be among the displayed results. Google also became more focused on the exact phrasing of a search. Following this update, even small changes in a query can result in completely different results.

What to Watch out for

  • Poor on-page optimization
  • Improper setup of a Google My Business page
  • NAP (name, address, phone number) inconsistency
  • Lack of citations in local directories
  • Sharing an address with a similar business

What to do

1. Optimize Pages

With the same SEO criteria being used for both local and standard search results, the owners of local businesses need to ensure their websites are correctly optimized. The best way to identify weak spots in your on-page optimization is by running an on-page analysis.
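As a lightweight starting point, the sketch below spot-checks a few of the most common on-page basics: a present, non-empty title tag, a meta description of sensible length, and a single h1. The URL is a placeholder, and a full on-page analysis covers far more than this.

```python
# Spot-check basic on-page elements with requests + BeautifulSoup.
import requests
from bs4 import BeautifulSoup

def audit(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    issues = []
    if not (soup.title and soup.title.string):
        issues.append("missing <title>")
    desc = soup.find("meta", attrs={"name": "description"})
    if not desc or not desc.get("content"):
        issues.append("missing meta description")
    elif len(desc["content"]) > 160:
        issues.append("meta description over 160 characters")
    if len(soup.find_all("h1")) != 1:
        issues.append("expected exactly one <h1>")
    return issues

for url in ["https://www.example.com/"]:  # placeholder page list
    print(url, audit(url) or "OK")
```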

2. Create a Google My Business Listing

Creating a Google My Business page is a must if the goal is to have a website included in Google’s local index. It’s important to pay attention to the category that is selected for the business so that it’s displayed for relevant searches. NAP consistency across listings is also important; a toy consistency check is sketched below.
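This snippet normalizes NAP strings before comparing them so that trivial punctuation differences don't trigger false alarms. The listing data is invented for illustration; in practice it would be collected from each directory by hand or with a citation tool.

```python
# Toy NAP consistency check across directory listings (data is invented).
import re

def normalize(nap):
    """Lowercase, strip punctuation, and collapse whitespace for comparison."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", nap.lower())).strip()

canonical = normalize("Acme Plumbing, 123 Main St., Springfield, (555) 010-2000")
listings = {
    "Google My Business": "Acme Plumbing, 123 Main St, Springfield, (555) 010-2000",
    "Yelp": "ACME Plumbing LLC, 123 Main Street, Springfield, 555-010-2000",
}
for site, nap in listings.items():
    status = "consistent" if normalize(nap) == canonical else "MISMATCH"
    print(f"{site}: {status}")
```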

3. Get featured in relevant local directories

It’s often easier for a local business’s directory listing to outrank its own webpages in normal search results, and local business directories experienced a significant increase in rankings following Pigeon. One strategy worth implementing is to find high-quality local directories and reach out to ask to be featured.

4. Carry out geo-specific rank tracking

Now that search results are location-dependent, it’s a good idea for local business owners to do geo-specific rank tracking.

Fred

Fred sought to curb the ranking of low-quality websites flooded with excessive ads, aggressive monetization, and low-value content. The Fred algorithm update was designed to penalize sites that provided little value for searchers and existed primarily to generate ad revenue.

What to Watch out for

  • Low-value, thin content
  • Excessive ads
  • Aggressive monetization
  • User experience barriers and issues

What to do

1. Look for Pages with Thin Content

While this may seem to be getting a little repetitive, the importance of quality content cannot be overstated. It’s ok to have ads on a website so long as the pages also contain information that is valuable to users. It’s a good idea to perform a site audit with a focus on any pages containing ads to ensure compliance with Fred.

2. Check the User Experience

There’s little in life more frustrating than ads that stop users from consuming the content they came for. It’s good practice to view a website from the perspective of a customer and self-evaluate the site with the aid of Google’s Search Quality Rater Guidelines. Doing so and making any required adjustments can also significantly reduce bounce rates, because aggressive ads are often the reason users leave a page.

3. Reconsider your ads

While including ads on a page is not inherently a bad practice, it’s imperative to look at the ads from a customer perspective and make sure they don’t come across as too pushy.

Mobile-Friendly Update/Mobile-first indexing

Google’s fixation on mobile-friendliness began back in 2015 with the Mobile-Friendly Update. This update was intended to provide a ranking boost in mobile searches to websites that were optimized for mobile devices. During this initial pivot to a mobile focus, the update did not impact search rankings for non-responsive websites when users were searching on a desktop.

Following this, though, Google started rolling out mobile-first indexing, which prioritized indexing pages with the smartphone “agent” before desktop. Mobile-first indexing means that the mobile version of a page will be used for indexing and ranking to help mobile users have the best possible experience.

What to Watch out for

  • Lack of a Mobile Version

What to do

1. Go Responsive

With Google focused on mobile-friendliness more than ever before, adapting a website for mobile devices is a necessity. While it might seem ok to have one version of a site for mobile and one for desktop, Google highly recommends utilizing a responsive design. The responsive approach has the added benefit of not introducing a second website design that has to be updated separately.
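A very basic smoke test for mobile-readiness is checking whether a page declares a viewport meta tag, since responsive pages almost always do. This sketch only catches the most basic omission; Google's own Mobile-Friendly Test remains the authoritative check, and the URL below is a placeholder.

```python
# Smoke test: does the page declare a responsive viewport meta tag?
import requests
from bs4 import BeautifulSoup

def has_viewport(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "viewport"})
    return bool(tag and "width" in tag.get("content", ""))

print(has_viewport("https://www.example.com/"))  # placeholder URL
```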

Page Speed Update

With the launch of Google’s Page Speed Update in July of 2018, page speed became a ranking factor for mobile searches. While it is not the largest factor in deciding the ranking of websites in mobile searches, it definitely plays a role.

What to Watch out for

  • Slow loading speed
  • Poor technical optimization

What to do

1. Test Page Speed

The logical starting point when optimizing for the Page Speed update is to understand the website’s current page speed. The best way to do that is with the help of the PageSpeed Insights tool (with certain limitations for Shopify sites). The tool will evaluate the site and return a rating on a scale of 0 to 100. The total score is calculated based on two parameters: FCP and FID.

The FCP (First Contentful Paint) metric measures how long it takes for the first visual element to appear, while FID (First Input Delay) measures the time between a user first interacting with the site and the browser responding to that interaction.
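The same report can be pulled programmatically through the public PageSpeed Insights v5 API, as sketched below. An API key is optional for light use; the response field shown reflects the Lighthouse-based performance score the tool reports, and the URL tested is a placeholder.

```python
# Fetch a mobile performance score from the PageSpeed Insights v5 API.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
resp = requests.get(API, params={
    "url": "https://www.example.com/",  # placeholder page to test
    "strategy": "mobile",               # the ranking factor applies to mobile
}, timeout=60).json()

score = resp["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Performance score: {score * 100:.0f}/100")
```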

2. Improve your page speed score

Each time Page Speed Insights is used to analyze a page’s speed, it provides optimization advice that can be followed to impact the site’s score significantly. There will also be an Opportunities section which offers additional areas of improvement.

BERT

Announced on October 25, 2019, BERT is the latest update to Google’s natural language processing toolkit. The BERT update is said to further improve Google Search’s ability to understand human intention in both spoken and written search queries.

Wrapping It Up

While Google continues to make small, almost daily updates, this was a comprehensive breakdown of the major algorithm updates of the last decade. These changes have had a lasting impact on the SEO landscape, a landscape that will continue to change in the coming decade. Optimizing a website is an ongoing undertaking, not a one-and-done activity, and it can be quite time-consuming. The team at Shop Style Design can help your business reach its first-page goals.
