Last Updated: 14 November 2019
One of life’s great certainties is that Google will always be working to improve its search engine, along with its results and features for users. In 2018 alone, they made 3,234 changes to their algorithms! Obviously, keeping up with such a rapid rate of change is difficult. Fortunately, however, there are ways of doing so, and of recovering your website from any negative effects that such updates may cause.
Should you keep track of algorithm updates?
With so many changes released each year (many of them undocumented), it’s fair to say that only one or two major changes a year are worth keeping track of and reading up on. Examples include the ‘medic’ update last year and the BERT update this year. Trying to research the many minor updates, however, is likely to be a losing battle, partly due to the inevitable lack of data and partly because there are simply so many of them!
Staying up to date
There are a number of resources we can recommend to keep updated with the latest developments in the search engines:
Google itself also provides a number of useful channels to notify webmasters of changes. The Searchliaison Twitter account is regularly updated, posts about major changes can often be found on the official blog, and the YouTube channel often has helpful tutorials and advice.
Algorithm change history by Moz
Moz, one of the giants of the search industry, maintains an algorithm update timeline that’s worth reading through once in a while to catch up on which major changes have affected the search results.
Barry Schwartz reports on virtually everything that happens in the world of search, so some filtering of content may be required. Even so, reading Seroundtable is a great way to stay updated.
Focus on the fundamentals
For any given search, there are a number of fundamental elements that search engines will always look for in high-ranking pages, regardless of what updates are put forward in the future:
- A page that fully meets the needs of the searcher
- A page that is well-optimised in terms of its content and on-page SEO, with high-quality content that is easy for both users and search engines to read and digest
- A page that is technically excellent (secure, fast-loading, mobile-friendly and so on)
- A page that provides a good user experience and is well-designed, formatted and laid out
These and other fundamental SEO aspects (such as having pages that are fully crawlable and indexable) are very unlikely to ever change, regardless of the ongoing tweaks made to search engine systems. That said, we think natural language processing will be big over the next few years, as the ongoing goal of fully understanding both text and voice-based search queries (particularly ‘long-tail’ ones) gathers pace.
Let's work together
If you want to talk about how we can help you with your digital marketing, get in touch with our specialist team today.
What to do if you are affected by a change to the algorithm
In general terms, websites can be negatively affected by Google in three main ways (in terms of direct effects, not counting increased levels of competition in the search results):
- Algorithmic penalties
- Manual penalties
- General ranking drops as a result of core ranking updates
If a major drop in organic traffic occurs, it’s often a good idea to start by identifying what kind of drop it is.
Types of organic traffic drop, in terms of visitors lost
Very sharp (for example, one day there are 10,000 visitors/day, then the next day there are 70)
What to do:
- Check that Google Tag Manager/Analytics is set up correctly – is the code on all pages? Is it the right UA code/tag container? Has it been added twice by accident?
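One quick way to run that check at scale is to look for the tracking ID in each page’s HTML and count its occurrences. The sketch below is a minimal illustration of this idea; the `UA-12345-1` ID and sample HTML are made-up placeholders, and in practice you would point `fetch_html` at your own page URLs.

```python
# Minimal sketch: verify an analytics/GTM tracking ID appears exactly once
# per page. The tag ID and sample HTML below are placeholder assumptions.
from urllib.request import urlopen


def fetch_html(url: str) -> str:
    """Download a page's raw HTML (use on your own URLs)."""
    return urlopen(url).read().decode("utf-8", errors="replace")


def tag_count(html: str, tag_id: str) -> int:
    """How many times does the tracking/container ID appear in the HTML?"""
    return html.count(tag_id)


def audit(urls, tag_id):
    """Report pages where the tag is missing or duplicated."""
    for url in urls:
        n = tag_count(fetch_html(url), tag_id)
        status = "OK" if n == 1 else ("MISSING" if n == 0 else f"DUPLICATE x{n}")
        print(f"{status}: {url}")


# Demo with inline HTML rather than a live fetch:
sample = "<head><script>/* gtag */ 'UA-12345-1'</script></head>"
print(tag_count(sample, "UA-12345-1"))  # 1 -> tag present exactly once
```

A count of 0 suggests the snippet is missing from that template; a count above 1 often means the code was accidentally added twice (which can inflate or distort your visitor figures).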
Steep decline over several weeks/1-3 months
This is typically the result of a specific penalty (manual or algorithmic) relating to the site’s content or its link profile. Alternatively, it could be a major technical error affecting crawling and indexing, or a user experience or trust issue: for example, a very large number of ads placed above the fold, or the site having been hacked.
What to do:
- Dig into Google Search Console, keyword data and Google Analytics. Have any particular terms or pages dropped more than others? Are there any patterns to those pages that might explain why they’ve gone down (thin or duplicate content, spam, etc.)? Has the number of indexed pages dropped off sharply in GSC, or does anything else look odd? Are there any manual penalty notices or security breaches?
- Check with IT and development teams that there haven’t been any major changes to critical parts of the site such as the robots.txt file, .htaccess or the sitemap.xml. For example, Google may be blocked from crawling the site due to development changes.
- Run a detailed crawl of the site (using something like Deepcrawl, Sitebulb or Screaming Frog) and look for anomalies, e.g. a major and unexpected URL structure change, large-scale changes to canonical tags, a large number of pages being accidentally noindexed, and so on.
- Check backlink profiles. Has there been a large influx of spammy backlinks to the site recently?
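Two of the checks above (an over-broad robots.txt rule and an accidental noindex) can be scripted. The sketch below is illustrative: the robots.txt rules and paths are invented examples, and the noindex regex is a simple heuristic that assumes the common attribute order in the meta tag, not a full HTML parser.

```python
# Sketch of two crawlability checks: is Googlebot blocked by robots.txt,
# and does a page carry an accidental noindex? Rules/paths are examples.
import re
from urllib.robotparser import RobotFileParser


def googlebot_allowed(robots_txt: str, path: str) -> bool:
    """Check a robots.txt body against a path for the Googlebot user-agent."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", path)


# Heuristic: matches <meta name="robots" content="...noindex...">
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
    re.IGNORECASE,
)


def has_noindex(html: str) -> bool:
    """Detect a robots meta tag containing 'noindex'."""
    return bool(NOINDEX_RE.search(html))


# A catastrophic deploy-time mistake: a staging robots.txt pushed live.
robots = "User-agent: *\nDisallow: /"
print(googlebot_allowed(robots, "/products/"))  # False - whole site blocked
```

Running checks like these after every deployment can catch a staging configuration (blocked crawling, blanket noindex) before it does weeks of damage to organic traffic.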
Usually, with sharp drops, there is one major reason why they’ve happened, sometimes alongside several other moderate-to-minor issues. Once the major issue is resolved, traffic should begin to return.
Gradual decline over a number of months
This generally means that the site in question is gradually falling in the search results, due to a combination of ongoing smaller updates and other pages outclassing it.
What to do:
Unfortunately, the best solution to a problem like this is simply to make the site better! That usually means carrying out a detailed audit to establish a good number of areas for improvement, prioritising them, and then working through them:
- Identify key content on the site and improve/update it.
- By contrast, clearing out useless content (low-quality news articles with no traffic, for example) and making the site leaner can also help.
- Is the site on the slow side? Improving page speed can give a small ranking boost as well as improve conversion rates and crawlability.
- What about metadata? If metadata isn’t updated regularly, it can lead to lower clickthrough rates (for example, a Black Friday page whose title/description still says 2017).
- Is the site’s authority on the low side? Acquiring high-quality, natural links can often help to give it a boost.
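The stale-metadata check above is easy to automate: scan each title for four-digit years older than the current one. This is a minimal sketch; the example titles are made up, and in practice you would feed in titles exported from a site crawl.

```python
# Sketch: flag title tags that still mention an outdated year
# (e.g. a Black Friday page stuck on 2017). Titles below are examples.
import datetime
import re


def stale_years_in_title(title: str, current_year: int) -> list:
    """Return any four-digit years in the title older than current_year."""
    years = [int(y) for y in re.findall(r"\b(20\d{2})\b", title)]
    return [y for y in years if y < current_year]


titles = [
    "Black Friday Deals 2017 | Example Shop",
    "Contact Us | Example Shop",
]
this_year = datetime.date.today().year
for t in titles:
    stale = stale_years_in_title(t, this_year)
    if stale:
        print(f"Stale year(s) {stale} in: {t}")
```

The same pattern works for meta descriptions, and is a cheap way to catch dated seasonal pages before clickthrough rates suffer.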
There are plenty of other ways to improve a site. Our SEO audit template has over 180 factors that we measure a site against!