Net66: Understanding Google Algorithm Updates and Refreshes

Google’s algorithms are often the subject of much discussion, chagrin, angst and other more colourful words. This is because Google themselves rarely announce when they release or are about to release an algorithm update.

Sure, once the fallout from an algorithm reaches fever pitch, Google will sometimes say “Oh yeah, we did redefine everything in SEO over the weekend, thanks for noticing”.

They also give vague, nondescript warnings along the lines of “We’ll be releasing a new algorithm called [Insert_Animal_Name_Here] at some point in the future”.

There have been several rumours recently about Google launching a new refresh of Penguin or a large update to Panda. First off, there were suggestions that Google was experimenting with a refresh of Penguin, which got a lot of people very excited, as many believe their sites are still “trapped” by Penguin and can’t recover until the algorithm is refreshed.

Then last week there was a widespread suggestion that Google had updated their Panda algorithm. But with Google having already stated that they’ve stopped confirming Panda updates, we’re unlikely ever to know for sure.

Now, these algorithms are COMPLEX. Seriously complex. Thankfully, though, Google’s John Mueller has taken the time to write a helpful post in the Google Webmaster forums explaining how updates and refreshes actually work:

In theory: If a site is affected by any specific algorithm or its data, and it fixes the issue that led to that situation, then the algorithm and/or its data must be refreshed in order to see those changes. Sometimes those changes aren’t immediately visible even after a refresh; that’s normal too.

In practice, a site is never in a void alone with just a single algorithm. We use over 200 factors in crawling, indexing, and ranking. While there are some cases where a site is strongly affected by a single algorithm, that doesn’t mean that it won’t see any changes until that algorithm or its data is refreshed. For example, if a site is strongly affected by a web-spam algorithm, and you resolve all of those web-spam issues and work to make your site fantastic, you’re likely to see changes in search even before that algorithm or its data is refreshed. Some of those effects might be directly related to the changes you made (other algorithms finding that your site is really much better), some of them might be more indirect (users loving your updated site and recommending it to others).

So yes, in a theoretical void of just your site and a single algorithm (and of course such a void doesn’t really exist!), you’d need to wait for the algorithm and/or its data to refresh to see any changes based on the new situation. In practice, however, things are much more involved, and improvements that you make (especially significant ones) are likely to have visible effects even outside of that single algorithm. One thing that helps to keep in mind here is that you shouldn’t be focusing on individual factors of individual algorithms; it makes much more sense to focus on your site overall – cleaning up individual issues, but not assuming that these are the only aspects worth working on.

All that said, we do realize that it would be great if we could speed the refresh-cycle of some of these algorithms up a bit, and I know the team is working on that. I know it can be frustrating to not see changes after spending a lot of time to improve things. In the meantime, I’d really recommend – as above – not focusing on any specific aspect of an algorithm, and instead making sure that your site is (or becomes) the absolute best of its kind by far.

Blog post by: Greg McVey