


What’s up with Google’s algorithm?



Early linking strategies
Early search engines relied on what we now call “on-page” factors – essentially your page’s metadata and content. These factors were relatively easy to game, which led to a cat-and-mouse contest between search engines and SEOs. When Larry Page and Sergey Brin developed PageRank, they added a new world of “off-page” factors based on the links of the World Wide Web itself. Links are more challenging to manipulate because they sit beyond the control of any single website. Still, it was only a matter of time before people began buying links, trading links, and even building complex link networks, so Google had to adapt. In 2011, Eric Schmidt revealed that Google had made 516 changes to the algorithm based on over 8,000 experiments; by 2020, that had grown to 4,050 changes based on 600,000 experiments. How can we rely on an algorithm that changes more than 11 times per day?
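To see why links became such a powerful signal, here is a minimal sketch of the classic PageRank idea from the original paper – a “random surfer” who follows links, with each page passing its score along its outbound links. The graph, function name, and parameters below are illustrative assumptions, not Google’s production system:

```typescript
// Toy sketch of the original PageRank idea (illustrative, not Google's
// production algorithm): a page's score is the probability a random
// surfer lands on it, and each page passes its score along its links.

type LinkGraph = Record<string, string[]>; // page -> pages it links to

function pageRank(
  graph: LinkGraph,
  damping = 0.85,   // chance the surfer follows a link vs. jumping anywhere
  iterations = 50
): Record<string, number> {
  const pages = Object.keys(graph);
  const n = pages.length;

  // Start every page with an equal share of rank.
  let rank: Record<string, number> = {};
  for (const p of pages) rank[p] = 1 / n;

  for (let i = 0; i < iterations; i++) {
    const next: Record<string, number> = {};
    for (const p of pages) next[p] = (1 - damping) / n; // "random jump" share

    for (const page of pages) {
      const outLinks = graph[page];
      if (outLinks.length === 0) continue; // dangling pages skipped for brevity
      const share = (damping * rank[page]) / outLinks.length;
      for (const target of outLinks) {
        if (target in next) next[target] += share; // pass rank along each link
      }
    }
    rank = next;
  }
  return rank;
}

// Example: "a" is linked from both "b" and "c", so it ends up ranked highest.
console.log(pageRank({ a: ['b'], b: ['a'], c: ['a'] }));
```

Pages with more (and better-ranked) inbound links accumulate more score – which is exactly what made buying and trading links so tempting, and why Google has spent two decades refining the algorithm against it.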
Algorithm updates

Updates, big and small, occur all the time, and being aware of them is vital to understanding our SEO impact. Even following all the current rules and guidelines isn’t enough if you don’t keep up with the latest changes. Google has targeted content quality with Panda, link manipulation with Penguin, local and mobile intent with Pigeon, machine learning with RankBrain, and natural language search with Hummingbird. Panda initially focused on content farms, later expanding to duplicate content, low-quality user-generated content, and “thin” content created merely as a vehicle for advertising. Panda was baked into the core algorithm around 2015, bringing with it an ongoing philosophy that more content is not necessarily better content: content should serve a purpose. Penguin’s successive updates targeted manipulative link building and link schemes, penalizing some sites so severely that recovery was very difficult. In 2016, Google lifted many of Penguin’s penalties and shifted to an approach that simply devalues most low-quality links instead. Today, any attempt to build low-quality links or visible link networks is a waste of time and money.


E-A-T – Expertise, Authority, and Trust
While Google may not specifically use E-A-T as part of its algorithm criteria, understanding what it is and how to apply it will certainly help with the algorithm. Each term is largely self-explanatory, but from a search perspective: expertise means understanding what searchers are looking for – the search intent – and providing them with precisely that. There are plenty of experts, though, so who is the authority on a subject? That’s where the next step comes in: you want to be the recognized authority in your industry. Finally, you need to be a trustworthy source, one people can go to knowing that what you provide is on the up and up. Address any negative sentiment and be the gold standard.
Core Web Vitals and Website Speed

The amount of time people have to wait to interact with a website affects user experience, and it is something Google measures. In June 2021, Google finally rolled out the Page Experience update and gave us the Core Web Vitals, which currently consist of LCP, FID, and CLS. Largest Contentful Paint (LCP) measures how long it takes the largest piece of content in the viewport to render (to “paint”); this drives a visitor’s perception of speed. First Input Delay (FID) looks at a user’s first interaction – clicking a link, button, image, or video – and measures how long your site takes to respond. Cumulative Layout Shift (CLS) reviews your site’s visual stability: if content keeps moving around as the page loads, for instance because of aggressively injected advertising, CLS goes up.
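If you want to see these numbers for your own pages, browsers expose them through the standard PerformanceObserver API. The sketch below is a bare-bones illustration; production measurement is better left to tooling such as Google’s web-vitals library, which handles edge cases (backgrounded tabs, back/forward cache, CLS session windows) that this ignores:

```typescript
// Minimal in-browser sketch of the three Core Web Vitals using the
// standard PerformanceObserver API. Illustrative only.

// Layout-shift entries are not in the default TypeScript DOM typings,
// so we declare the fields we need.
interface LayoutShiftEntry extends PerformanceEntry {
  value: number;           // how large the shift was
  hadRecentInput: boolean; // shifts right after user input don't count
}

// LCP: render time of the largest content element seen so far.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const latest = entries[entries.length - 1]; // the last candidate wins
  console.log('LCP (ms):', latest.startTime);
}).observe({ type: 'largest-contentful-paint', buffered: true });

// FID: delay between the user's first interaction and the moment the
// browser could begin handling it.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as PerformanceEventTiming[]) {
    console.log('FID (ms):', entry.processingStart - entry.startTime);
  }
}).observe({ type: 'first-input', buffered: true });

// CLS: running sum of unexpected layout-shift scores.
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as LayoutShiftEntry[]) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  console.log('CLS so far:', cls);
}).observe({ type: 'layout-shift', buffered: true });
```

As a rule of thumb, Google’s published thresholds for a “good” experience are an LCP under 2.5 seconds, an FID under 100 ms, and a CLS under 0.1.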
Google’s Intent

Written By: Doyle Clemence