
SEO 101: The Technical Foundations of Search

Tackling the broad & ever-changing range of activities that add up to Search Engine Optimisation (SEO) can seem daunting. So, before we get started, please...

Don't Panic!

The goal of this series is to give you a solid grounding in the core concepts of SEO: concepts built by our team of experts over years at the cutting edge of SEO, and which remain true in the face of Google, Yahoo! & Bing’s ever-changing algorithms.

You will learn what aspects of Technical SEO actually deliver sustained ranking improvements, and how you can build and execute an SEO campaign that delivers improved sales for your website.

These concepts will be built upon in later instalments of the SEO series, and will allow you to confidently execute strategies that make an immediate difference to your website’s performance.

Finally, in addition to developing a thorough knowledge of the foundations of expert SEO, you will also be able to critically assess the usefulness or otherwise of future innovations to promote your website’s performance in the search engines.

Oh... and before we get started, we should be clear on one definition...

What Is SEO?

Search Engine Optimisation (SEO) is the process of improving a website’s traffic from search engines’ organic (i.e. non-advert or non-sponsored) search results.

It's important to note that this definition deliberately focusses on traffic rather than just rankings, putting the emphasis on business success rather than SERP visibility for its own sake.

In particular, we are most interested in targeting relevant traffic which is likely to convert in some way on the target website.

For example, in the below screengrab, we can see Amazon's dominance for the highly relevant search term 'Buy Amazon Kindle'. A high proportion of clicks from this term will convert to a sale, as the searcher's intent is clear from the 'buy' part of the keyphrase.

SERP dominance

As you can also see from the previous image, while improved rankings are key to increasing clickthroughs for a target search term, improving a listing's SERP visibility or relevance to that term will also improve clickthrough rates (an important ranking factor in itself these days, and therefore doubly important to the expert SEO).

For those still getting used to the various acronyms prevalent in SEO (and PPC & CRO, of course), SERP stands for Search Engine Results Page.

What Do Search Engines Want?

It’s amazing how few SEOs ask themselves this question when confronted by speculation on a new rumoured penalty, or whether a technique is ‘Black Hat’ or not. Which is a shame, as answering the question invariably clears up any mystery. To understand why, we need only look at the very earliest days of search.

While an extensive knowledge of the history of search is not strictly necessary, understanding that Google came to prominence in the early 2000s on the back of two key innovations is.

Those innovations were:

  • Google.com was faster than the other portals (i.e. it was more convenient)
  • Google’s results tended to be more relevant

Arguably, these two innovations were still the key to Google’s $18.9 Billion annual profits in 2011. More important for the SEO, however, is the source of those profits: Google’s dominant search market share.

This is a direct result of Google's convenience and relevance.

In the UK, we can use public data from Hitwise to establish that (at the time of writing) Google.co.uk has 84.46% of all searches made in the UK, and Google.com a further 5.42%. That’s serious market dominance, and it shows why most SEOs in the UK are Google-centric.

UK Search Engine Market Share

We can also look at known global search market shares (accurate as of Jan 2012) to see how Google’s search proposition has managed to transcend cultures and languages.

Notable countries where Google isn’t dominant include China (Baidu.com), Russia (Yandex.ru), and South Korea (Naver.com).

Google Market Share Worldwide

Increasing its market share is Google’s route to increased profitability and growth. Without growth, Google will find it harder to hold onto the key engineering talent that has driven the company’s success to date, and in such a scenario its business is predicted to follow the slow path to stagnation trodden by Microsoft and IBM before it.

Global Search Engine Market Share

Indeed, even as one of the most successful businesses on the planet, Google is already battling major brain-drain issues thanks to the emergence of Twitter & Facebook as fast-growth technology companies in recent years.

The upshot of all this for an SEO is that you can be sure that any algorithm change which impacts negatively on Google’s market share will not make it past testing.

That means changes (or rumoured changes) which make Google less convenient, or its results less relevant can be safely ignored in favour of changes which improve either of those metrics.

We will discuss a great example of how this value system delivered a cutting edge for the expert SEO in the next section.

Technical SEO That Shifts The Needle

Let’s get into some practical nuts & bolts on the core of SEO (yes, this involves HTML: but please remember not to panic!).

At the current count, here at QueryClick we run through more than 110 potential technical issues when auditing a site. Many of those issues also encompass multiple ‘items’: for example, all the onpage elements discussed here fall under a single ‘issue’.

However, many of the issues are only likely to be found in larger brand websites with major web presences (accidental cloaking, forced session handling, etc).

For this series, we’ll be focussing on only the key issues that can make a real difference to your website’s ranking.

Key Tip:
When listing technical issues, always assign a priority to them based on SEO impact vs time to implement.

Over the last decade, a number of technical changes webmasters can make to their website have floated to the top of Google’s algorithm because they have delivered the best relevancy in its search results.

They are (in no particular order):

  • OnPage Code Optimisation
  • Page Speed Optimisation
  • SERP Conversion Rate
  • Page Canonicalisation
  • Your Backlink Profile

Yes, that’s right: there’s no mention of social media in there. How refreshing!

Well OK, there is an element of social impact discussed in the backlink analysis! But we’ve deliberately hived off any in-depth discussion of social media to another guide to better focus on the nuts and bolts in this one.

We also want to ensure that in the early days of your education you don’t mistake correlation for causation, a mistake which is rife in discussions around social media and which is brilliantly dissected in this article by Matt Peters.

Onpage Code Optimisation

This is a classic area of SEO, and offers the fewest ‘competitive’ edge opportunities as it’s such a well understood area. The key onpage factor to consider is getting your <title> and <h1> tags working together effectively.

<title> and <h1> tag synchronicity

This is the core of onpage and relies upon a fundamental SEO concept, that of targeting one page per keyphrase.

The basic idea is that you should treat every page on your website as a landing page focussed around performing for a single SEO search term and converting the resulting traffic.

One Page Per Keyphrase

This grows out of the fact that the more keyphrases you try to target on a page, the more you dilute the keyphrase focus of the page and the less well you perform for any of them.

In addition, your SERP call to action will be weaker, and you're also likely to suffer from higher bounce rates when targeting multiple terms.

The <title> tag (the tag that provides content displayed across the top of a browser window and in the page tab) is the most important tag on a page.

Most Important Tag On The Page

It’s closely followed by the <h1> tag (meaning ‘header one’: literally the most important headline content on the page).

H1 And H2 Tags

If both these tags are focussed around a single keyphrase, then the page’s relevance to that search term is strongly signalled to the search engines.
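
As a minimal sketch of that synchronicity (the keyphrase, brand name and markup here are purely illustrative):

    <head>
      <title>Red Leather Satchels | Example Brand</title>
    </head>
    <body>
      <h1>Red Leather Satchels</h1>
      <!-- the rest of the page content supports the same keyphrase -->
    </body>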

It’s also important to note that tests have shown that the order of the words is important. So the first word in the tag is given more value than the second, and so on.

Finally, you should consider that a tag which matches the phrase exactly performs better than one where the words making up the phrase are all present but in a different order or separated by other words.

This is a strong indicator of relevancy for Google, and should not be underestimated.

Key Tip:
This principle suggests that the closer to 100% keyphrase richness you achieve, the better relevancy and performance will be.

However, you do then sacrifice the chance to target synonyms and long-tail terms, so think carefully before pruning!

Other page code items in which it is beneficial to mention your target keyphrase are listed below (a short markup sketch follows the list):

  • HIGH: Same-domain links pointing to the page
  • HIGH: <h2> & <h3> tags. Less important than <h1> but useful for targeting synonyms
  • HIGH: In the meta description (to target SERP conversion)
  • MEDIUM: In the first <p> tags encountered in the page code order
  • MEDIUM: In <img> filenames and alt attributes
  • MEDIUM: In rich media (video) filenames and transcripts
  • LOW: <strong> & <em> tags (respectively: bold & italics)
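
To make those items concrete, here is a hedged sketch showing how the supporting elements might mention the same illustrative 'red leather satchels' keyphrase (the wording is an assumption, not a template to copy verbatim):

    <!-- in the <head> -->
    <meta name="description" content="Browse our range of red leather satchels and order online today.">

    <!-- in the <body> -->
    <h2>Why choose a red leather satchel?</h2>  <!-- <h2>/<h3>: useful for synonyms -->
    <p>Every <strong>red leather satchel</strong> we stock is hand finished.</p>  <!-- first <p> in code order -->
    <img src="red-leather-satchel.jpg" alt="Red leather satchel">  <!-- keyphrase in filename and alt -->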

A note on <p> tags: Google has stated it uses ‘Page Templating’ to determine unique page content from general page ‘architecture’.

In practice, this means that Google is aware which parts of a page are simply ‘Main Navigation’ and treats that content differently from the unique page content - which is, of course, presumably hand written and so is the truly relevant (that word again!) part of the page.

This means that a link in the first paragraph of the main body of content on a page should carry much greater weight for Google, and in fact our own testing confirms that it does.

Page Speed Optimisation

Over the last three years, Google has ramped up the importance of page load times as a factor in their algorithm.

We can return to our touchstones of convenience and relevance to understand why that is.

When Google was starting out, internet access was via dial-up modems. One of the reasons people were drawn to Google was the speed with which its homepage resolved (because of its famous simplicity) and the speed with which it returned results.

Now that the (first) world has largely caught up with fast download speeds, Google has moved the focus along to the pages linked to from its results pages.

Web usability experts and conversion rate optimisers are well aware of users’ sensitivity to application response times, web pages included.

A good rule of thumb nowadays is that if a visitor cannot judge the suitability of a web page within three seconds, that visitor will leave.

The same law and logic applies to load times for listed pages in Google SERPs.

If a searcher uses Google and consistently gets results which take a long time to load, they may start to use a different search engine which favours fast loading websites. We know Google doesn’t want that so they’ve taken a carrot and stick approach with webmasters.

On the one hand, Google will go to great lengths to display your site's load times in their Webmaster Tools and highlight how you perform against the rest of the internet.

Webmaster Tools

This data has also recently been incorporated into Google Analytics reports.

Google have also released free tools for webmasters to diagnose the cause of their slow load times and make the necessary improvements.

Page Speed
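
As a flavour of the kind of markup-level changes such tools typically flag (a hedged sketch only; server-side fixes such as compression and caching sit outside the page code):

    <!-- let the HTML parse before non-critical scripts download and execute -->
    <script src="analytics.js" defer></script>

    <!-- declare image dimensions so the browser can lay the page out before the image arrives -->
    <img src="hero.jpg" width="600" height="400" alt="Product hero shot">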

On the other hand, there is clear evidence that if you ignore Google’s help, they will pull your site’s web pages back from the top positions to improve their searchers’ experience. They have also gone on record to state that page load time is now a factor in their algorithm, reversing their original stance on the subject.

It’s also worth noting that Google have stated they will penalise slow-loading sites running PPC campaigns with them, by incorporating the same change into their Quality Score.

One of the key competitive advantages available today for smaller businesses to outperform their larger, more established competitors is ensuring they beat the three-second rule for page load times, particularly when Googlebot is visiting.

SERP Conversion Rate

The most recent addition to the 101 guide, Search Engine Result Page (aka: SERP) conversion has been a highly valuable competitive edge for expert SEOs for a few years now.

And although it’s now more widely understood, the opportunity to make dramatic, immediate improvements by focussing on this area of SEO remains real. The underlying concept is simple: Google prioritises SERP listings which achieve successful clicks.

Google’s patent for this aspect of the algorithm explains that a successful click is one which satisfies the searcher’s reason for searching. That can be measured by observing SERP click behaviour.

Let’s use an example to illustrate how Google does this: say you search for ‘Car Insurance’ and choose the top organic listing (moneysupermarket.com in our sample search, shown below). Google is aware of the search term you used, the result you clicked on, and when, storing that information in a cookie in your browser.

Car Insurance

Should you later return to the page, Google is aware of the page refresh and understands that the page may not have been helpful to you: especially if you return quickly to the SERP. Subsequent behaviours are also tracked.

So if, for example, you then decided to try the number two result and again returned to the SERP before ultimately choosing the third result, Google will note that the third result was more relevant to your search term than the number one result and promote the listing up the rankings.

If you scale up this awareness to the huge volume of searches performed on Google every second, it’s easy to understand the power such human behaviour can offer in refining search result page effectiveness.

Equally, if after performing a particular search most searchers choose the second result over the first, Google can be confident in the signal and promote the second result to the top position.

The ‘Click History’ a listing generates for a search term aggregates over time, allowing otherwise poorly optimised pages to perform strongly for terms where they exhibit high ‘satisfaction’ metrics for searchers.

It also follows that dislodging a number one listing which has a strong click retention rate (i.e. a result which causes few searchers to return to the SERP once clicked) is much harder than dislodging one which fails to retain visitors.

It’s instructive to note that the converse behaviour (i.e. a listing which is ranked number one but which rarely retains visitors) results in a URL which will slip down the SERPs regardless of other SEO metrics: an excellent anti-spam outcome of this factor of the algorithm and one of the main metrics updated during the Panda / Farmer Google Update.

When an SEO is looking to improve their SERP conversion rate, they should consider a number of questions:

  • How relevant is the SERP snippet to the target search term?
  • How immediately relevant is the landing page content to the target search term?
  • How much more relevant is my listing than a competing listing?
  • Does a searcher need to leave the page to ‘convert’ for their search intent?

Understanding these relationships allows for the holy grail of modern SEO: SEO improvements which also improve site conversion rates. This is a virtuous circle, and can supercharge the effectiveness of your website.

To address these questions, it’s often useful to look at the PPC listings for the target search term to identify the common sales calls to action used to encourage a searcher’s click.

Car Insurance CTAs

By using the page target keyphrase term in the meta description tag, as well as the NOODP and NOYDIR meta robots controls, an SEO can fully control the snippet used for an organic listing in the same way as a PPC advert. This should be used to craft a call to action that appeals to the searcher.
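
In markup terms, that control might look like the following (the description copy is an invented example in the spirit of the car insurance CTAs above; NOODP and NOYDIR simply stop Google substituting DMOZ or Yahoo! Directory text for your own description):

    <meta name="robots" content="noodp, noydir">
    <meta name="description" content="Compare car insurance quotes in minutes and buy online today.">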

In the more advanced guides, we’ll also investigate how using SEO & PPC in tandem can be used to target two different customer types in a single SERP, increasing overall traffic without cannibalising either channel.

Following up on the promise of your carefully crafted SERP snippet is a more involved process, which requires a degree of usability experience to truly move the dial. But using survey feedback data from 4Q and Kampyle, along with basic A/B testing using Google Website Optimiser will allow you to start immediately improving your page bounce rates, and therefore lower the SERP Return Rate to the search engines.

Page Canonicalisation

Canonicalisation is simply the work of removing duplicate (or seemingly duplicate) content from search engine access to leave a single (called ‘canonical’) URL for the content.

Page (& Site) duplication is easily the number one technical SEO issue encountered when a new client is audited here at QueryClick. And usually there are multiple forms of duplication on a single site, multiplying the damage this issue causes.

A good example of this is commonly found on eCommerce websites which offer multiple product search and navigation options that alter the URL.

Ecommerce

At the time of writing, for each of the options above, the allsaints.com site generated a new URL with a distinct URL parameter despite the bulk of the textual content remaining the same: effectively creating multiple duplicate pages for the item.

  • Disclaimer: QueryClick were brought in to work on resolving all the issues covered in this document, but the original concepts illustrated remain true.

It’s easy to detect when duplication is in play, as Google filters out duplicates by default. If you perform a site: operator search with results set to 100, you can prompt a dialogue from Google offering to repeat the search with the duplicate content included.

Search Without Omitted Results

This feature in fact simply appends the ‘&filter=0’ parameter to Google’s query string, which Google’s Matt Cutts established as the way to access the now little-discussed ‘Supplemental Index’.

Although the Supplemental Index is often treated as if it no longer exists, we can see that Google clearly still operates this ‘Two Tier’ index. And given that Supplemental Index results were only used when no results in the main index were relevant to a search query, diagnosing which of your site’s pages are held behind the ‘&filter=0’ filter remains important.

We can use chained site operators to drill into the duplicate pages within a large domain to capture all instances of duplication and schedule them for a solution. For example, using the allsaints.com example from earlier, we can look at just a folder named ‘uncategorised’.

Uncategorised Folder

Or search only for pages with “All Saints Rossetti Beanie” in the title.

All Saints Rossetti Beanie

Or only those product title pages in the ‘uncategorised’ folder.

All Saints Rossetti Beanie In Uncategorised

By inserting a minus (-) symbol in front of any of these site operators we can use them to exclude rather than include results. For a full list of site operators, check out Google’s help page.
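
To make this concrete, the chained queries for the allsaints.com example might look something like the following (hedged reconstructions of the searches shown in the screengrabs, using Google's site:, intitle: and minus operators):

    site:allsaints.com/uncategorised
    site:allsaints.com intitle:"all saints rossetti beanie"
    site:allsaints.com/uncategorised intitle:"all saints rossetti beanie"
    site:allsaints.com -intitle:"all saints rossetti beanie"

The final query excludes, rather than includes, the matching product pages.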

With site-wide duplication, the ideal solution is to consolidate duplicates into the canonical URL, rather than simply use robot indexing controls to exclude access to all the duplicate versions.

This is because if large numbers of URLs are excluded from Google’s index, then the backlinks pointing to those URLs will be discounted from the overall domain backlink profile, with serious consequences for the overall SEO strength of the domain.

Ideally then, we consolidate, and we have two options available for doing so. Either should be deployed to remove any duplicate URL detected in the search indices.
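
Assuming the two options referred to are the usual consolidation pair, a 301 (permanent) redirect to the canonical URL or the rel="canonical" link element, the markup route looks like the sketch below (the parameterised URL is hypothetical, and the 301 alternative is configured server-side rather than in the page code):

    <!-- on the duplicate URL, e.g. http://www.example.com/satchels?colour=red&sessionid=123 -->
    <head>
      <link rel="canonical" href="http://www.example.com/satchels">
    </head>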

301 Redirection

Monitoring of the de-duping process should be done via Google’s Webmaster Tools, which will also pro-actively alert you to perceived duplicate content (which may simply be a case of identical <title> tags rather than actual duplicate content, so make sure you investigate the cause of the alert first).

Your Backlink Profile

The final technical point discussed in this document is arguably the most significant: your domain’s backlink profile.

A backlink is a link to your domain, from a different domain. For clarity, here is how a domain breaks down:

Anatomy of a subdomain

  • http://[sub-domain].[domain].[tld]/[directory]
  • http://www.example.com/jokes/

Links from the same domain to other subdomains or directories are internal links, and don’t pass the same value as an external link (a backlink), though they are important signals to a search engine about the content of a page and therefore its relevance to a search term.

So a link from www.example.com/jokes/ to uk.example.com would be considered an internal link, while a link from the same page to www.different-domain.com would be an external link, and would be referred to as a backlink of different-domain.com.
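
In markup terms, and using the example domains above, the distinction looks like this (illustrative only):

    <!-- on a page at http://www.example.com/jokes/ -->
    <a href="http://uk.example.com/">an internal link (same domain, different subdomain)</a>
    <a href="http://www.different-domain.com/">an external link, i.e. a backlink of different-domain.com</a>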

So hopefully that’s made the nomenclature a little easier to understand! What about your backlink profile’s impact on your SEO?

The origins of Google’s algorithm lie in its much-discussed PageRank metric. It was inspired by the scientific principle of citation: papers which are cited frequently are considered more valuable than papers with fewer citations.

If you think of websites as papers, and citations as links from other websites, you can understand the principle. It’s also important to note that highly cited papers (or websites, in our analogy) pass more value to their cited papers than poorly cited papers.

Therefore the most highly cited - or backlinked - websites on the internet will perform well in Google, and when they link to other websites, they pass on some of their value (or authority as we will now call it) to the linked site.

So, at a very fundamental level, the value of a domain is determined by its backlink profile, and increasing a domain’s value comes from increasing the number of links pointing to it.

We can refine that statement somewhat to highlight two particular types of link that are particularly valuable:

  • Links from high authority domains (sites like the bbc.co.uk or apple.com, etc)
  • Links from relevant websites (sites with similar content or themes to your own, often referred to as contextual links)

Relevancy is important because, over the years, unethical SEOs have sought to improve their websites by generating mass backlinks from domains they’ve set up themselves, or by buying links on high-value domains purely for the PageRank they pass. In response, Google’s qualification of which links can be considered part of your backlink profile has become more sophisticated.

For example, if Google thinks you are buying links for SEO, they will remove the ability for those links to pass any value. If they think you’re buying a lot of links, they’ll classify you as part of a ‘Bad Neighbourhood’ and penalise your ability to perform in the search results with +60 or +90 penalties (a penalty which prevents ranking above position 60, or position 90 respectively).

In either case, Google will tell you that it’s detected such activity via the Webmaster Tools, making it important that you monitor this for such messages.

No SEO should consider buying links; the only long-term strategy that will add value to your website, and ultimately your brand, is to play according to the rules set down by each search engine.

Fortunately, there are plenty of ethical ways to improve your backlink profile. One of the most effective is using competitive intelligence to identify what your competition has done, giving you a step up in executing your linkbuilding strategy. We’ll discuss this in more detail shortly.

Some Other Key Axioms Of Linkbuilding To Consider

Axiom: backlinks to a page add value to the page, but also improve the authority of the domain the page sits on.

This means that running multiple domains should be avoided unless you are willing to run multiple linkbuilding campaigns to get them all performing strongly for competitive search terms.

It also has larger implications for Multinational SEO strategies.

Axiom: The depth of a page from the homepage determines how much domain authority it receives.

Page Authority

The depth is not determined by directory structure or other URL components, but simply by the minimum number of clicks required to get from the homepage to the page.

This means that you should prioritise your pages, so the highest value landing pages are linked from your homepage, while longer tail terms sit deeper.

Beware of making your sitemap too ‘shallow’ or you’ll be unable to deploy this valuable technique.

Axiom: The total backlink value held by a domain is spread across all indexed pages on the site.

301 Redirection

This means that you should use as few pages as necessary to conserve domain authority, which of course, sits in tension with ‘One Page Per Keyphrase’.

Therefore you should consider carefully your target keyphrase terms, and always ask if a page you are building is necessary or not, merging the content with another page if not.

Axiom: The value passed by a link is divided by the total number of links on a page.

This means that link parsimony should always be practised on your website.

Link Value

Only link when it’s relevant & valuable to do so.

Try to keep the ‘structural’ linking on a page to a minimum: listing the whole site in your main navigation, for example, is a bad idea because it creates a large number of links on every page, as well as making the site ‘shallow’ (see the axiom on the depth of a page from the homepage).

Axiom: Allow natural deviation through synonyms when linking or encouraging links from external sources.

A page which is linked to sitewide with exactly the same anchor text looks as though its links were not placed by a human hand, and those links are therefore not given as much weight as naturally varied ones.

Allow natural variation in your internal linking and don’t only go for keyphrase rich anchor text.

Linking to authoritative sites also promotes your association with authority. Conversely, linking to low-value or ‘Bad Neighbourhood’ sites associates you with the bad neighbourhood. Therefore:

Axiom: always seek to link to the most authoritative sites, or original news sources.

Linking To Authoritative Domains

Identifying Backlink Opportunities

The most important development over the last two years has been the increased importance of domain diversity in a backlink profile.

At QueryClick we’ve developed various Linkbuilding Gap Analysis Tools to help supercharge our backlink development and promote domain diversity.

Report Analysis

By targeting high-value domains linking to your competitors, you are also targeting relevant domains. By drilling down into their backlink profiles, you can find exactly where each link is coming from and execute an action to gain a similar link to your own domain.

Likely actions include:

  • Generating interesting / unique data around products / services / your industry & presenting it via infographics with analysis or as downloadable stats.
  • Engaging with social networkers active in your area to encourage backlinking.
  • Engaging with the industry bloggers and forums highlighted by the analysis.
  • Distributing press releases via highly distributed PR wires.
  • Developing news content that qualifies for Google News inclusion.
  • Providing press packs to encourage better backlinking from news websites.
  • Developing your social media presence in high value networks.
  • Developing value-add content likely to capture links when promoted.

There will be many more options and opportunities suggested by the gap analysis. Your objective should be to gain natural, occasionally keyphrase rich links from relevant sites which are themselves considered authoritative.

Measurement of the performance of your linkbuilding can be achieved by exporting backlinks from your Google Webmaster Tools account.

Backlink List

You should also monitor your SERP positions via the ‘Search Queries’ report.

Webmaster Analytics

A Final Point On Technical SEO Fundamentals

It should be apparent by now that SEO is a holistic process, one that should inform both your website development and the promotion of your site online.

In effect, SEO is simply harnessing those natural activities to better deliver performance in the search engines.

The question of what your brand should be chasing in the search engines will be covered in the next guide published by QueryClick:

SEO 102: Marketplace Strategies that Drive Bottom Line Profit.
