Fluctuating Results: Major Google Algorithm Updates and The Evolution of SEO Strategies
Digital marketing and search engine optimization have evolved rapidly over the past decade. Algorithm updates have reshaped the search landscape and heavily modified SEO strategies, while entirely new algorithms have created opportunities for an unprecedented user experience. This article takes a look at Google’s most notable algorithm updates and then discusses how the strategies used to rank in organic search results have changed.
Google Algorithms and Updates
On its way to becoming the leader in search engine usage, Google has consistently refined its algorithms, either updating them or creating new ones entirely, to ensure that users receive results that closely relate to their queries. This section covers the Panda, Penguin and Pigeon updates, as well as the Hummingbird algorithm.
The Panda update was first rolled out in February 2011. A groundbreaking change to how search results are ranked, this algorithm stripped many websites – especially content farms – of their worth by devaluing both thin content and duplicate content.
Generally speaking, duplicate content is content that appears on more than one page on the web.
Thin content refers to a page on a site with a considerably low amount of copy and little depth.
More often than not, the Panda algorithm affects websites on a site-wide basis (rather than just single pages) and is dedicated to serving results from websites that have high-quality, rich content. Because of this, many news websites saw their rankings increase drastically, whereas publishers of thin content dropped significantly. Google has provided webmasters with a blog post to help them understand what type of content it looks for.
Introduced in April 2012, the Penguin update targets websites that use unnatural backlinks to artificially boost page rankings. Because links can significantly influence rankings, questionable link-building tactics were widely used to gain as many backlinks as possible, regardless of relevance and quality. To recover, websites can identify and disavow spammy links to avoid being demoted in rankings.
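In practice, disavowing is done by uploading a plain text file through Google Search Console. As a minimal sketch (the domains below are hypothetical examples, not real spam sources), a disavow file looks like this:

```text
# Hypothetical disavow file (plain .txt, uploaded via Google Search Console).
# Lines beginning with # are comments and are ignored by Google.

# Disavow a single spammy page:
http://spam-directory.example.com/links/page1.html

# Disavow every link from an entire domain:
domain:low-quality-links.example.net
```

Disavowing should be a last resort after attempting to have the links removed directly, since it tells Google to ignore those links when assessing the site.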
Hummingbird launched in September 2013 and mainly focuses on delivering results that are better suited to each user’s intent. Unlike Panda and Penguin, Hummingbird is its own algorithm. It allows Google to break away from serving only pages with exact-match keywords; the engine can now understand synonyms, related phrases and the context surrounding search queries. By thoroughly answering questions and providing insightful content that relates to queries, websites put themselves in a better position to rank than they would by creating content that revolves purely around a keyword. This helps users find more relevant content and obtain the answers they seek.
Geared purely toward local search, the Pigeon update was released in July 2014 to provide users with accurate results for nearby businesses. In addition to regular search results, the update also influences the results that appear in Google Maps by focusing on the user’s geographic location. Furthermore, the update weighs traditional organic search factors more heavily than previously coveted local search signals.
Strategies to Successful Organic Search Results – Then and Now
Now that we’ve highlighted the significance of algorithm changes, we can discuss the shift in best practices for earning successful organic search results over time. In the discussion that follows, you will notice the influence that Google algorithm updates have had on today’s search landscape.
Once considered the main ingredient in the ranking recipe, keywords still have a place on the shelf, but the strategy has noticeably shifted. No longer can publishers focus their content on one specific word and irresponsibly repeat that exact phrase numerous times to gain recognition in SERPs. Because search engines have evolved to understand groupings and variations of terms and their accompanying language, it’s imperative that multiple variations of targeted phrases be used in copy. Furthermore, the current approach is more tailored to user intent. The example below provides more context.
By understanding that a person who searches for beginner’s ski equipment is also likely to use different variations of that term, such as “ski equipment for amateurs,” Google serves the same results despite different wording.
On another note, keyword research has also changed because Google can now directly serve results for basic informational queries that have high search volume. For example, if an individual searches a general query such as “what is a computer scientist?”, Google will automatically pull information and put that at the very top of the results so users don’t need to scroll down and click through to a page. This is to bolster user experience and provide users with quick information that relates to their search. These types of keywords were at one point a desirable focal point of thinner content due to the traffic that they could offer, but now search engines are able to directly provide answers.
In the past, content was mainly focused on ranking for single keywords, with little emphasis on the user. Word counts hovered in the low hundreds and did very little to thoroughly educate users on a specific topic, since the pages likely existed only to target a single term. Thin content continues to be pushed out of SERP rankings in favor of rich content with multiple layers of depth.
Furthermore, meta tags used to be a major ranking factor, causing individuals to stuff keywords (both branded and unbranded) into the copy even when they weren’t related to the page’s content. However, meta descriptions now have next to no effect on rankings and essentially serve as call-to-action copy that briefly summarizes a page for users on the results page. Meta data is now geared toward adding clarification to enhance the user experience.
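A meta description is simply a tag in a page’s HTML head. As an illustrative sketch (the copy below is a hypothetical example), the tag that produces the snippet beneath a result’s title looks like this:

```html
<head>
  <!-- Hypothetical meta description: it no longer influences rankings,
       but the copy often appears as the snippet beneath the page title
       in search results, acting as a brief call to action. -->
  <meta name="description"
        content="Compare beginner ski equipment and find the right gear for your first season on the slopes.">
</head>
```

Writing it as a concise, accurate summary of the page serves the user rather than the algorithm.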
Search directories were often the go-to tactic used for SEO mainly because it was a quick and easy way to build links. And due to the primitive nature of search engines — and the prominence of black hat SEO tactics — the results on the search pages were often unreliable and unrelated to user intent.
Because of this, directories were a formidable way to find websites, since they organized information under categories and subcategories rather than keywords. The two biggest were the Yahoo directory and DMOZ, but there were also thousands of smaller options. When search algorithms started to get more sophisticated, listings in directories became part of the ranking equation: if a site had links from multiple directories, search engines viewed the page as authoritative and deserving of ranking. It may seem like a foregone conclusion, but directories eventually got exploited, and search engines inevitably diminished the influence they carried in rankings.
Today’s link building tactics differ considerably. By creating valuable content that provides users with detailed information, publishers are able to earn editorial links from websites. Oftentimes, relationships need to be built and nurtured with website bloggers, writers and editors in order to successfully earn placements. Furthermore, generating referral traffic from backlinks is important to show that your content is meaningful to readers and provides the necessary context.
A decade ago, it was next to impossible to envision the influence that mobile devices would have on society. Nobody was planning on frequently searching for their favorite restaurant or reading the news during their commute on their Razr V3. But as of October 2014 (the latest research available when this article was published), 64 percent of American adults owned a smartphone, and 51 percent of time spent on the Internet came from mobile. Unlike the previously mentioned strategies, mobile is newer and will likely become the focal point of search.
Want A Custom Strategy?
If you’re looking to apply today’s best practices to help your business maximize its marketing budget, feel free to schedule a free strategy session with us today.
One part founder, two parts bourbon. Dan, our resident SEO genius, works tirelessly to solve all of your online marketing needs. With 10+ years of experience in the field (and even more on the golf course), he translates all the technical mumbo-jumbo into effective strategies that will help your Louisville, KY business succeed.