
Search engine optimisation has long been shrouded in secrecy, which over time has given rise to a number of myths, lies and inaccurate information. Recently, some ‘new’ myths, so to speak, have appeared on the block.

Here are some misconceptions that are growing in popularity when it comes to on-page optimisation:

1. Google will deal with your duplicate content itself

There is some suggestion that as Google advances ever further into the realms of computer science, duplicate content will become less of a problem. After all, if Google is intelligent enough to crawl, separate out and analyse the individual blocks of text that constitute your site, then surely it will be able to tell the difference between an occasional piece of innocent boilerplate text and outright content theft?

That much is certainly true (although it’s still a good idea to use unique product descriptions where possible), but where a page is duplicated in full across multiple URLs (‘ghost’ products in e-commerce stores, for example), this can cause serious problems. From the perspective of Googlebot, having to recrawl an exact duplicate of the same product page six times is a waste of crawl budget, and it probably won’t be too pleased!

Below is an example of an Agency51 client that had this exact issue. Although the site was already relatively well optimised, there were hundreds of these ‘dead weight’ pages. Screenshots from Webmaster Tools and Analytics showed the results of simply redirecting duplicate pages to their canonical equivalents and changing the existing 302 redirects to 301s, with no other substantive on-page changes.
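If you want to run the same kind of check on your own site, a minimal sketch along these lines (using entirely hypothetical URLs) uses Python’s requests library to confirm that each duplicate URL returns a permanent 301 to its canonical equivalent rather than a temporary 302:

```python
import requests

# Hypothetical duplicate URLs mapped to their canonical equivalents;
# substitute your own site's pages here.
REDIRECT_MAP = {
    "http://example.com/product-123?session=abc": "http://example.com/product-123",
    "http://example.com/old-category/product-123": "http://example.com/product-123",
}

for duplicate, canonical in REDIRECT_MAP.items():
    # Don't follow the redirect: we want to inspect the first hop itself.
    response = requests.get(duplicate, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    if status == 301 and location == canonical:
        print(f"OK: {duplicate} -> {canonical}")
    elif status == 302:
        print(f"WARN: {duplicate} uses a temporary 302 redirect; change it to a 301")
    else:
        print(f"FAIL: {duplicate} returned {status} (Location: {location!r})")
```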

We were pleased with the results, especially given that it was a simple change to make. This ties in with what Brian Dean and Bill Sebald have found with their own and client sites, which is to say that ‘pruning’ of pages which have either little or negative SEO value can be a very effective process.

2. Technical SEO is now unnecessary due to CMS advancements

Several individuals have said (specifically here and here) that technical SEO is no longer as important for the average webmaster to understand, partly because of the spread of user-friendly CMS systems that simplify the process. With WordPress everywhere across the net and WYSIWYG editors commonplace, it’s now possible for anyone to have a website up and running in minutes. However, as Mike King makes very clear here, the ongoing development of ever more complex web technologies actually means the opposite: ensuring technical compliance with Google’s spiders is becoming an increasingly specialised business. Gone are the days of simply coding together a basic HTML page, now that we’re in the age of technologies such as the JavaScript-based Angular. Whilst Google and the other search engines will do their best to ensure as much of the web as possible is spiderable, search marketers will need to keep a close eye on exactly how ‘SEO-friendly’ these new technologies are.
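By way of illustration, a minimal sketch (with a made-up URL and phrase) of one quick sanity check: fetch a page’s raw HTML without executing any JavaScript, roughly approximating a non-rendering crawler’s first pass, and see whether your key content is actually there:

```python
import requests

# Hypothetical URL and content snippet; substitute a phrase that should
# appear in the fully rendered page.
URL = "https://example.com/angular-app/"
EXPECTED_TEXT = "Our digital marketing services"

# Fetch the raw HTML only, with no JavaScript execution -- a rough proxy
# for what a non-rendering crawler sees on its first pass.
html = requests.get(URL, timeout=10).text

if EXPECTED_TEXT in html:
    print("Content is present in the initial HTML response.")
else:
    print("Content missing from raw HTML -- it is probably injected "
          "client-side by JavaScript and may not be reliably indexed.")
```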

WORDPRESS SEO OPTIMISATION

With regards to the CMS giant WordPress, the effort Automattic have put into user-friendliness does mean that technical knowledge is not essential to make a website function. However, learning to read basic source code and knowing your way around spider emulation tools is still helpful, as diagnosing problems is difficult otherwise (as is the case with other CMS systems). Additionally, although there are plenty of free SEO audit tools on the web to help the average webmaster get an overview of their domain’s technical setup, interpreting the output still requires a level of familiarity.

Out of the box, WordPress does come with some unfortunate SEO problems, which is why so many people have installed the Yoast plugin (more than 35 million downloads at last count). Even then, some of these issues require manual setup and specialist knowledge to deal with: the standard category, post and tag taxonomies invariably need customising, for example, as do the permalinks (which are not set to post name by default).
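As a rough illustration (hypothetical URLs, and a regex that assumes the usual attribute order), you can spot-check which of your archive pages carry a noindex robots meta tag, the kind Yoast writes when an archive is switched off:

```python
import re
import requests

# Hypothetical WordPress archive URLs to audit; substitute your own.
ARCHIVE_URLS = [
    "https://example.com/category/news/",
    "https://example.com/tag/seo/",
    "https://example.com/author/admin/",
]

for url in ARCHIVE_URLS:
    html = requests.get(url, timeout=10).text
    # Look for a robots meta tag; assumes name comes before content,
    # which is how Yoast and WordPress core emit it.
    match = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']+)["\']',
        html, re.IGNORECASE,
    )
    robots = match.group(1) if match else "(no robots meta tag)"
    print(f"{url}: {robots}")
```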


3. Using exact match keywords in your title tag is a bad idea

The argument relating to this particular concept is simple: as Google develops its abilities in semantic search, the usage of specific keywords in strategic areas (such as the meta title and the page URL) will diminish in importance, as latent semantic indexing, topic recognition and Hummingbird all do their thing. However, all is not quite as it seems.

Screenshots from the large-scale experiments carried out by Ahrefs and Backlinko both show a distinct correlation between keyword usage in the title tag and ranking position.

The following ‘clean’ result (incognito window, signed out of Google/viewed through a proxy) in google.co.uk for the keyword ‘digital marketing agency’ illustrates the importance of keywords in title tags very well:

Given that every single result, including those beyond the visible screenshot, has the exact match ‘digital marketing agency’ keyword in the title tag, we’d say that’s a pretty strong correlation.

The takeaway? As long as you’re not engaging in keyword stuffing practices, keep putting keywords in your title tags!
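If you want to audit your own pages for this, a rough sketch like the following (with hypothetical URLs and keywords) pulls each page’s title tag and checks for the target phrase:

```python
import re
import requests

# Hypothetical pages mapped to their target keywords; substitute your own.
PAGES = {
    "https://example.com/": "digital marketing agency",
    "https://example.com/seo/": "seo services york",
}

for url, keyword in PAGES.items():
    html = requests.get(url, timeout=10).text
    # Extract the contents of the <title> element.
    match = re.search(r"<title[^>]*>(.*?)</title>", html,
                      re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else ""
    if keyword.lower() in title.lower():
        print(f"OK: '{keyword}' appears in title: {title!r}")
    else:
        print(f"MISSING: '{keyword}' not in title: {title!r}")
```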

4. User experience errors arising from 404s can be solved with spectacular design

There seems to be an awful lot of information out there about how to design ‘creative’ 404 pages:

Last time we checked, 404 pages are only served to the user when something goes wrong – the purpose of a 404 page from a user’s perspective should be, first and foremost, to do the following:

1. Inform them that something has gone wrong

2. Provide them with a solution to the problem, or if that is not possible, direct them to the most appropriate page (the homepage, a search function, etc.)

On that basis, it’s hard to say whether the pages below achieve these objectives:

For example, the above page does not provide a clear CTA to direct the user elsewhere.

Whilst visually attractive, the use of industry-specific terminology on Southwest Trains’ 404 page arguably does not adequately explain to the user what has happened.

The lack of clickable or searchable elements on Worrydream’s 404 page makes it difficult for the user to navigate elsewhere (unless their main method of navigation is to edit URLs in the address bar.)

There is an argument here, too, that if a page is too well-designed it may easily confuse visitors. An elderly searcher, for example, may not realise that a snappily-designed, cleverly worded 404 page is actually an error page. The above examples rely in part on the user being familiar enough with the rest of the website’s architecture, design, colours and styling to recognise that something has gone wrong; and where a 404 page relies on humour, which is inherently subjective, users may disagree about its suitability.

There isn’t anything inherently wrong with spending time designing a nice-looking 404 page, of course, but it is important that functionality is integrated into the design and that CTAs are prominent.
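As a minimal sketch of what ‘functional first’ can mean in practice (using Flask purely as an arbitrary example framework; the ‘/search’ URL is a hypothetical placeholder), a custom 404 handler only needs to state the problem and offer obvious routes onwards:

```python
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    # State plainly that something went wrong, then give the user
    # clear, clickable ways out -- the two jobs listed above.
    body = """
    <h1>Sorry, we can't find that page</h1>
    <p>The address may be mistyped, or the page may have moved.</p>
    <p><a href="/">Return to the homepage</a> or
       <a href="/search">search the site</a>.</p>
    """
    return body, 404
```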


5. A fast, secure site will help you rank better

Whilst we would be the first to say that site speed and security are important elements in the setup of a domain (particularly for e-commerce sites), it’s hard to see why so much has been written about their organic search benefits specifically:

Google have stated that both security (HTTPS) and the speed of your site are ranking factors, but to focus on that would be missing the point: between them they only account for around 2-3% of the algorithm (the rest, as most of us already know, is links and content). With both of these, user experience is the key; no-one likes an insecure, slow site. Instead of drawing a causal relationship between these factors and ranking better, we should be concentrating on the benefits they bring to users, and on the knock-on effects of that. Google will soon mark pages as not secure within the search results (and do the same in Chrome), so it’s certainly worth getting ahead of the game now from a conversion and clickthrough perspective, if not an organic positioning one.
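On the HTTPS point, it’s worth at least confirming that the insecure version of your site permanently redirects to the secure one; a minimal sketch (with a placeholder domain) might look like this:

```python
import requests

# Hypothetical domain; substitute your own.
URL = "http://example.com/"

# Don't follow redirects: we want to see the first hop from HTTP.
response = requests.get(URL, allow_redirects=False, timeout=10)
location = response.headers.get("Location", "")

if response.status_code == 301 and location.startswith("https://"):
    print(f"OK: HTTP permanently redirects to {location}")
else:
    print(f"Check needed: status {response.status_code}, Location {location!r}")
```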

AMP AND SEO

With regards to AMP, there is a fair amount of chatter suggesting that it may not be as crucial as first thought, and may in fact lead to you losing some of your traffic to Google itself.

We’re inclined to wait it out for now and see what happens. The push towards blazingly fast pages can develop into an unhealthy obsession and end up being little better than a red herring as far as organic rankings are concerned. And the claim that pages ideally load in under one second is somewhat mad, given that around 600ms of that is completely beyond the domain’s control (requests have to be processed before they even hit the hosting server, and 3G/4G phone networks have natural speed limits compared to home broadband, for example).

Furthermore, AMP has been confirmed as intended for article-based pages and sites, so from a commercial angle there is not necessarily a need to rush to get ‘AMPed up’ unless doing so relates to the company blog.

Having said all of that, speed remains an important consideration for the modern website owner, and well-established changes (such as minifying JS and CSS, and enabling browser caching) can deliver substantial improvements. Indeed, it remains sensible to do this baseline speed optimisation before reaching for AMP as a ‘magic bullet’ to solve loading issues.
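To see where you stand on those basics, a quick look at the response headers of your static assets (hypothetical URLs below) will tell you whether compression and browser caching are actually switched on:

```python
import requests

# Hypothetical asset URLs; substitute your own site's CSS/JS files.
ASSETS = [
    "https://example.com/css/main.css",
    "https://example.com/js/app.js",
]

for url in ASSETS:
    response = requests.get(url, timeout=10)
    # Cache-Control governs browser caching; Content-Encoding shows
    # whether the server compressed the response (e.g. gzip).
    print(url)
    print("  Cache-Control:   ", response.headers.get("Cache-Control", "(not set)"))
    print("  Content-Encoding:", response.headers.get("Content-Encoding", "(not set)"))
```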

 

Takeaways:

It always pays to be mindful of what constitutes ‘good’ information in the world of internet marketing. As far as this article is concerned, these are the points to note:

1. Prioritise fixing full-page duplicates spread across multiple URLs over other types of (internal) duplicate content

2. Learning basic technical SEO will be beneficial for any website owner

3. Within reason, using keywords in your title tags remains a good idea

4. 404 pages should be functional and helpful; visual attractiveness should not come before usefulness

5. AMP does work, but rushing headlong into adopting it may not be the best of strategies.

 

If you find all of this is a little overwhelming, why not get in touch with a member of the Agency51 team and we’ll be happy to talk you through how we can help you!

Ben Henderson

Ben Henderson is an SEO specialist at Agency51, and enjoys working on and writing about all aspects of technical SEO for a wide variety of websites and industries.
