Google didn't become the world's number one search engine by resting on its laurels. Realizing that its algorithm was being manipulated, Google began updating it regularly, and the algorithm has grown much smarter over the years. It can now identify relevant, high-quality web content far more effectively, and it can even determine whether a website is using blackhat tactics.
Google also introduced the "nofollow" attribute in 2005. There are now two types of links: nofollow and dofollow. Google's PageRank algorithm ignores nofollow links; the attribute was created to counter blog comment link spamming, which had become a real problem. In general, outbound links posted in blog comments, social media posts, forum posts, press releases, and widgets now carry the nofollow attribute.
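The distinction lives in a link's rel attribute in the HTML itself. Below is a minimal Python sketch, using only the standard library's html.parser, of how a crawler might separate the two link types; the markup and URLs are made up for the example.

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collects anchor tags and sorts them into nofollow and dofollow."""

    def __init__(self):
        super().__init__()
        self.nofollow = []
        self.dofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated values, e.g. "nofollow noopener"
        rel_values = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel_values:
            self.nofollow.append(href)
        else:
            self.dofollow.append(href)

# Hypothetical markup: an editorial link versus a blog-comment link.
auditor = LinkAuditor()
auditor.feed('<a href="https://example.com/partner">Editorial link</a>'
             '<a href="https://example.com/spam" rel="nofollow">Comment link</a>')
print("dofollow:", auditor.dofollow)   # eligible to pass ranking credit
print("nofollow:", auditor.nofollow)   # ignored by PageRank
```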
Google also established a webspam team whose sole purpose was to seek out websites manipulating rankings with blackhat tactics. The team was headed by Matt Cutts, who co-authored a Google patent on webspam. That patent describes methods for identifying sites that unfairly manipulate Google's ranking system and for penalizing them. While Google can penalize sites it suspects of rank manipulation, Cutts has stated that just because Google owns patents on certain capabilities does not mean those capabilities are part of its algorithm.
The following are some of the significant algorithm updates that Google has made over the years to deal with search ranking manipulation and to improve their overall search engine results.
Reducing The Reliance On Links For Search Rankings
Initially, Google's search engine relied almost exclusively on backlinks and keywords to rank webpages. As search behavior has evolved, however, so has the way Google ranks pages. Google now uses a wide range of factors in addition to backlinks and keyword optimization to identify and rank content, naturally reducing the importance of links. And the introduction of nofollow links has completely devalued certain types of backlinks.
Google now considers the authority of the websites linking to your page so that backlinks purchased from low-quality sites go unrewarded. Links from low-authority sites won't necessarily trigger a penalty, but they won't count toward your ranking either. However, if Google notices that a significant share of your backlinks comes from low-authority sites, it may conclude that you're deliberately trying to manipulate your ranking and lower it as a result.
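As a rough illustration of this kind of weighting (the cutoff, scores, and "suspicious" threshold below are invented for the example; Google has never published its actual weights), a link scorer might ignore low-authority sources and flag profiles dominated by them:

```python
def link_score(backlinks, low_cutoff=20, spam_ratio=0.6):
    """Toy backlink scorer: high-authority links add value, low-authority
    links add nothing, and a profile dominated by low-authority sources
    is flagged as suspicious.

    backlinks: list of (source_domain, authority) with authority in 0-100.
    All numbers here are illustrative, not Google's actual weights.
    """
    score = 0.0
    low_count = 0
    for _, authority in backlinks:
        if authority < low_cutoff:
            low_count += 1          # ignored, and possibly suspicious in bulk
        else:
            score += authority / 100.0
    suspicious = bool(backlinks) and low_count / len(backlinks) >= spam_ratio
    return score, suspicious

links = [("news-site.example", 85), ("blog.example", 40),
         ("spammy1.example", 5), ("spammy2.example", 3),
         ("spammy3.example", 2)]
print(link_score(links))  # (1.25, True): mostly low-authority, so flagged
```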
The Use Of Social Media Signals
One of the reasons Google no longer relies solely on backlinks and keywords is that there are other ways to gauge the quality of a webpage. The advent of social media introduced a whole new way for users to interact with online content through comments, likes, and shares, and by the late 2000s it had grown to the point where it changed the online landscape for good. Google recognized that these social media signals acted as citations from users themselves, serving as proof of quality. In December 2010, Google confirmed that it was paying attention to social media engagement as a factor in ranking web content.
Updating Algorithms To Detect Quality More Effectively
Google makes minor adjustments to its algorithm every day, but every once in a while it introduces a significant change. One example is the Panda update, introduced in 2011 to improve the quality of Google's search engine results pages (SERPs) by strengthening its ability to identify poor-quality or shallow content. The update addressed everything from content farming to duplicate content, and its release affected roughly 12 percent of all of Google's search results. The change was so significant that it altered the way content would be created from then on.
Launching The Freshness Update To Identify More Relevant Results
Panda wasn't the only significant change Google rolled out in 2011. At the end of that year, Google also introduced its Freshness update, designed to make it easier to identify "fresher," more recently published content. This allowed Google to present users with results that were more relevant and less outdated. The update touched roughly 35 percent of total searches, with upwards of 10 percent of search results noticeably affected.
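To see the intuition, freshness can be modeled as a score multiplier that decays with a page's age. The sketch below is purely illustrative; the 30-day half-life is an assumption, not anything Google has disclosed, and a real system would also weigh how time-sensitive the query itself is.

```python
import math
from datetime import datetime, timedelta, timezone

def freshness_boost(published, half_life_days=30.0, now=None):
    """Toy freshness multiplier that decays exponentially with age.
    The half-life is invented for illustration only."""
    now = now or datetime.now(timezone.utc)
    age_days = (now - published).total_seconds() / 86400.0
    return math.exp(-math.log(2) * age_days / half_life_days)

now = datetime.now(timezone.utc)
print(freshness_boost(now - timedelta(days=1), now=now))    # ~0.98, near full boost
print(freshness_boost(now - timedelta(days=365), now=now))  # ~0.0002, stale
```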
The Launch Of Google's Penguin Update
In April 2012, Google launched its Penguin update to combat blackhat link-building tactics. Before the update, link volume was a huge factor in Google's ranking algorithm. Penguin allowed Google to assess the quality and trustworthiness of a webpage's backlinks, effectively countering common link-building schemes such as link farming, link purchasing, and the use of private blog networks. Penguin also made it easier for Google's algorithm to identify keyword stuffing.
The rollout of Penguin also resulted in many websites receiving penalties for unnatural link building. Penguin itself didn't execute these penalties automatically, although it did negatively affect the rankings of sites using such techniques; most penalties that followed the update were applied manually.
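To make the keyword-stuffing signal mentioned above concrete, here is a crude density check of the kind an early spam filter might have used. The 5 percent threshold and sample text are invented for the illustration; real detection is far more sophisticated.

```python
from collections import Counter

def keyword_density(text, keyword):
    """Fraction of words in text equal to keyword: a crude stuffing proxy."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

sample = ("Buy cheap shoes. Our cheap shoes are the most cheap shoes. "
          "Cheap shoes, cheap shoes, cheap shoes!")
density = keyword_density(sample, "cheap")
print(f"{density:.0%}")                 # ~35%, far above a natural rate
print("suspicious:", density > 0.05)    # illustrative threshold only
```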
The Launch Of Google's Hummingbird Update
Before the Hummingbird update, Google's primary way of improving the user experience was to lower the rankings of poor-quality content, and addressing blackhat SEO tactics was a significant part of that strategy. With Hummingbird, Google began focusing on better understanding user queries in order to deliver more relevant results, rather than primarily matching keywords.
The update used natural language processing, including synonym handling and semantic indexing, to better understand the context of a query. It also meant Google could understand a query even when a user misspelled a word or used slang, which effectively eliminated the blackhat tactic of deliberately misspelling keywords to capture users who mistyped their searches.
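A toy version of that idea: correct likely misspellings against a known vocabulary, then expand the corrected terms with synonyms. The vocabulary and synonym map below are invented for the example; a real system learns these relationships from data at enormous scale.

```python
import difflib

# Tiny, hypothetical vocabulary and synonym map for the example.
VOCABULARY = ["running", "shoes", "sneakers", "cheap", "affordable"]
SYNONYMS = {"sneakers": {"shoes"}, "cheap": {"affordable"}}

def normalize_query(query):
    """Correct likely misspellings, then expand terms with synonyms."""
    terms = set()
    for word in query.lower().split():
        # Map a misspelled word onto the closest known vocabulary term.
        match = difflib.get_close_matches(word, VOCABULARY, n=1, cutoff=0.75)
        corrected = match[0] if match else word
        terms.add(corrected)
        terms.update(SYNONYMS.get(corrected, set()))
    return terms

# "sneekers" is matched to "sneakers", which also pulls in "shoes".
print(normalize_query("cheap runing sneekers"))
# {'cheap', 'affordable', 'running', 'shoes', 'sneakers'}
```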
The Google Mobile Update
In April 2015, Google released its mobile update, nicknamed "Mobilegeddon." Critics feared the update would have drastic effects on websites that weren't optimized to be mobile-friendly. Google understood that mobile use was on the rise and that, on many sites, the mobile experience was far inferior to the desktop one.
The mobile update rectified this by prioritizing mobile-friendly websites in Google's separate mobile SERPs, demoting sites that weren't mobile-friendly in mobile results. Since then, mobile users have eclipsed desktop users, and Google eventually moved to mobile-first indexing, so a website's mobile-friendliness now affects how it ranks for every query.
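One crude, commonly cited marker of mobile-friendliness is a responsive viewport meta tag. The sketch below checks only for that single signal, so treat it as an illustration, not a substitute for Google's actual mobile-friendly criteria; the sample markup is hypothetical.

```python
from html.parser import HTMLParser

class ViewportCheck(HTMLParser):
    """Looks for a responsive viewport meta tag: one crude,
    necessary-but-not-sufficient signal of mobile-friendliness."""
    has_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "viewport":
            if "width=device-width" in (attrs.get("content") or ""):
                self.has_viewport = True

page = ('<html><head><meta name="viewport" '
        'content="width=device-width, initial-scale=1"></head></html>')
checker = ViewportCheck()
checker.feed(page)
print("viewport meta present:", checker.has_viewport)  # True
```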
The Rise Of RankBrain
RankBrain introduced a new way to analyze and rank web content using artificial intelligence and machine learning. Rolled out in 2015 as part of the broader Hummingbird algorithm, RankBrain lets Google deliver relevant results even for queries it has never seen before. According to Google, RankBrain is the third most important ranking signal, behind content and links.
Not only can RankBrain use machine learning to handle unfamiliar queries, it can also gauge whether the results it serves are relevant. It does this by monitoring how users engage with the search engine results pages: if engagement is low, the page's ranking drops; if it's high, the ranking rises. There had long been speculation that user engagement influenced rankings in one way or another, and the release of RankBrain essentially confirmed it.
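As a purely hypothetical sketch of such a feedback loop (Google has never published how, or even whether, engagement feeds back into rankings this directly), a score adjustment might look like this; every parameter and threshold below is invented for the example.

```python
def adjust_rank_score(score, impressions, clicks, long_clicks,
                      learning_rate=0.1, expected_ctr=0.3):
    """Toy engagement feedback: nudge a page's score up when users click
    and stay (long clicks), and down when they ignore the result.

    All parameters are illustrative assumptions, not Google's formula.
    """
    if impressions == 0:
        return score
    ctr = clicks / impressions
    satisfaction = long_clicks / clicks if clicks else 0.0
    # Positive signal: users click more than expected AND stick around.
    signal = (ctr - expected_ctr) + (satisfaction - 0.5)
    return score * (1 + learning_rate * signal)

# A result users click and dwell on drifts upward...
print(adjust_rank_score(1.0, impressions=1000, clicks=450, long_clicks=400))
# ...while one they mostly skip drifts downward.
print(adjust_rank_score(1.0, impressions=1000, clicks=50, long_clicks=10))
```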