What is Technical SEO & Why It’s Important

Gary Illyes

Gary Illyes of Google offers advice on how to use Google's best practices to outrank your rivals. He'll give you pointers on SEO and all things Google Search Algorithm.


What Is Technical SEO?

Technical SEO involves making changes to a website’s “back end” to make it easier for search engines like Google to crawl and index the page.

When practicing technical SEO, you might focus on the following aspects (this is not a comprehensive list):

Streamline website code to make it more accessible to spiders so they can reach the most important content first.

Select a content management system that is optimized for search engine optimization.

Configure and keep an eye on the CMS to spot any SEO issues the system may have by default.

Improve website speed (how quickly a page loads), which is a factor in Google’s ranking algorithm and affects user experience.

What Is Technical SEO in Simple Words?

Technical SEO is about making sure your website is built correctly and optimized well for search engines. This involves things like how you set up your URLs, what keywords you use, whether or not you use HTTPS, and much more.

Technical SEO works alongside the other main areas of SEO – On-Page SEO, Off-Page SEO, and Link Building – and it’s important to know each one.

On-Page SEO covers all of the parts of a web page that affect how it appears in search results. These include things like your URLs, meta description tags, internal linking, image alt text, and more.

Off-Page SEO refers to actions taken outside of your website that help improve your rankings. Things like guest posting, link building, social media marketing, press releases, and more fall into this category.

Link Building is essentially getting links to your website from other sites. You can do this by writing blog posts, asking influencers to mention you, and more.

A good overall SEO strategy should involve all three elements, with technical SEO underpinning them.

Why Is Technical SEO Important?

Technical SEO is one of the most important aspects of optimizing your site for search engines. If your site isn’t technically sound, it won’t rank well for certain keywords. And if it doesn’t rank well, it won’t bring in traffic. This article explains why technical SEO matters, what it entails, and how to find out whether your site needs some work.

How Can You Improve Your Technical SEO?

Technical SEO is one of the most important aspects of improving your site’s performance in Google. If you want to rank well organically, it’s essential to make sure that your site loads quickly, follows best practices for code structure, uses structured data markup, etc.

There are many different things that can affect how fast your site loads, including the following:

• JavaScript

• CSS

• Images

• HTML

• Server speed

If you want to know what else you can do to improve your technical SEO, check out our list of the 10 Best Practices For Improving Your Technical SEO.

Technical SEO Checklist

The following checklist includes some of the most important things you need to do to optimize your website for search engines like Google. You don’t want to miss anything!

1. Create a sitemap.xml file

2. Optimize your URLs

3. Improve your internal linking

4. Optimize your images

5. Include Schema markup

6. Add alt text to your images

Create an XML sitemap.

An XML sitemap tells search engine crawlers which URLs you want them to visit. This helps ensure that they crawl your site properly. If you don’t provide one, they might miss some important pages.

A good sitemap includes all of your products, category listings, and subcategory listings. You should make sure that each listing contains enough information to allow the spider to find everything easily.
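
A minimal sitemap.xml might look like the following sketch – the domain and dates are placeholders, not values from this article:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
  </url>
</urlset>

Each <url> entry lists one canonical page; <lastmod> is optional but helps crawlers prioritise recently changed pages.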

When you’re ready to submit your sitemap, use our free tool.

Fix duplicate content issues.

Duplicate content isn’t just an annoyance for users; it can also make search engines believe there are many different pages about the same topic, which can dilute how well your site ranks in search results. To avoid problems like this, you’ll want to make sure search engines know which single version of each page is the preferred one. You can do this by declaring a “canonical URL” on each page (see the Canonical Tags section below).

There are several ways to fix duplicate content issues, including canonical tags and removing session IDs. If you’re looking for a quick way to identify duplicate content across multiple pages, we recommend checking out our free tool, Search Console Audit.

Register your site with Google Search Console and Bing Webmaster Tools.

Submit your XML sitemap and verify it with both Google Search Console and Bing’s Webmaster Tools. This helps you monitor how well your webpages perform in search results. You can use these tools to disavow links or to test your site’s mobile usability.

What are the characteristics of a technically optimized website?

Technical optimization includes all aspects of a website’s design and architecture, including how it loads, what information it provides, and how easy it is to crawl. A technical audit looks at each aspect of a site and identifies areas where improvements can be made. This allows you to focus on improving those specific elements while keeping everything else the same.

A technical audit begins with a thorough review of your existing site. You’ll want to identify any issues, such as broken links, slow loading times, outdated content, missing images, etc. Once you’ve identified problems, you (or your developer) can develop a plan of action. For example, you might decide to add server capacity to handle increased traffic, change your hosting provider, or add caching software to improve performance.

If you’re not familiar with web development, don’t worry – we can help. We offer free consultations to discuss your goals and needs, and provide recommendations based on our experience. Our goal is to make sure that your website performs well and meets your expectations.

Why should you optimize your site technically?

Technical issues are common and often overlooked. They can cause your site to perform poorly, especially if you don’t know how to fix them. But fixing technical issues doesn’t just help your rankings; it also makes your site easier for visitors to navigate and use. In fact, according to BuiltWith, nearly half of mobile searches start with someone typing directly into the browser address bar – and if your site isn’t optimized correctly, those visitors may give up before they find what they’re looking for.

Structured data helps search engines understand your webpages. When you add structured data markup to your site, search engines like Google and Bing can better understand the information presented on your pages. For example, adding schema.org markup tells search engines about the main topics covered on each webpage, such as products, movies, restaurants, etc. Schema.org allows search engines to display relevant product reviews and ratings next to listings on local maps. Adding rich cards lets searchers see additional information about businesses, including opening hours, phone numbers, directions, and photos. Rich Cards give you control over how much information appears on each card, allowing you to tailor the experience for your customers.

Google loves fast loading sites. A slow site can frustrate visitors and make them less likely to return. If you want to rank well in search results, you’ll need to ensure that your site loads quickly. Speed improvements come in many forms, including reducing image sizes, optimizing images, compressing CSS files, minifying JavaScript code, and combining multiple scripts into one file. You can also use tools like PageSpeed Insights to check whether your current setup meets Google’s recommendations.

It’s crawlable for search engines

Robots are like little spiders that help search engines index your site. They’re called robots because they “crawl” around your website looking for things like images, text, and other important data. If you want to make sure your website is indexed properly by search engines, it needs to be crawlable.

There are many different types of robots out there, each with their own purpose. Some robots look for broken links, while others just scan for keywords.

You can control how much of your website they see – for example, with a robots.txt file, as sketched below. This helps ensure that your website doesn’t show up in searches for terms that aren’t relevant to what you sell.
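
A minimal robots.txt sketch – the paths here are placeholders, so adapt them to your own site before use:

User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml

The Disallow lines ask well-behaved crawlers to skip those directories, and the Sitemap line points them at your XML sitemap. Note that robots.txt is a request, not access control: pages that must stay out of the index need a noindex robots meta tag or authentication.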

 Schema Markup | Structured Data

Structured data is a way to make sure search engines understand exactly what your site offers. This includes things like product names, descriptions, prices, ratings, reviews, categories, and even dates. In addition, there are several ways to use structured data to help improve your rankings.

The most important thing to keep in mind about structured data is that it helps search engines better understand your content. When you provide structured data, you’re telling search engines how to interpret and display your information. For example, marking up a product’s category helps search engines group it with similar items, and marking up a release date helps them surface it for “new release”-style queries.

Another benefit of adding structured data to your site is that it makes it easier for people to find your products. Search engines can now tell whether your product is relevant to a particular keyword. They can also determine the price range of your items and whether they’re out of stock.
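
As a sketch, here is what product markup can look like using JSON-LD (the format Google recommends) – every value below is a placeholder, not data from this article:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "description": "A short description of the product.",
  "offers": {
    "@type": "Offer",
    "price": "9.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
</script>

This block can sit anywhere in the page’s HTML and tells search engines the product’s name, price, stock status, and rating – which is what makes rich results possible.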

If you want to learn more about structured data, check out our guide here.

International SEO | Using Hreflang

Hreflang is a way to tell Google which language (and, optionally, which region) a page targets. If you are targeting multiple countries, you can specify which URL serves each locale. This helps ensure that the right version of your site appears in each locale’s search results.

You can even add different languages to different parts of your website. For example, you could have English text on your homepage, and French text on your About Us section. You don’t have to do this manually; there are tools out there that help you manage this process.
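
For example, a site with UK and French versions might place tags like these in the <head> of every version of the page (the domain and paths are placeholders):

<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />

Each version should carry the full set of tags, including one pointing to itself, and x-default names the fallback page for users who match none of the listed locales.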

HTML, JavaScript, and CSS

HTML provides the basic structure of any web page, CSS controls how it looks, and JavaScript adds functionality and interactivity. These three elements work together to build a web page.

Crawling, Indexing, and Rendering

A crawler is responsible for discovering pages across the web and gathering information about each one. This includes things like where a webpage is located, what it says, how many times it is linked to across the internet, etc. Once all of this information is gathered, it is sent to an indexer. The indexer stores all of this information in a database called a search index. When someone searches for something on Google, the search engine looks up the words in the index and displays the most relevant results.

An indexer is responsible for collecting all of the information discovered by the crawlers and putting it into the search index. There are literally billions of documents out there on the web, so the indexer needs some help. Enter the crawling process.

The crawler finds each document on the web and sends it to the indexer. The indexing process begins once the crawler finishes sending documents to the indexer. At this stage, the indexer starts looking for keywords within the text of the documents. These keywords are used to categorize the documents into different topics. For example, if I wanted to find all of the articles written about the latest iPhone release, my keyword list might look like this: “iPhone”, “Apple”, “release”. Once the indexer has collected enough data, it creates a search index.

Finally, we come to the last step: rendering. Before a page is indexed, the search engine renders it – that is, it executes the page’s JavaScript and applies its CSS, much as a browser would – so that the content a user actually sees is the content that gets indexed. In our previous example, the iPhone article would show up because the rendered page contains the word “iPhone” and the word “release” appears in the title.

Canonical Tags

A canonical tag lets search engines know which version of a page is the authoritative one. If you have multiple versions of one page, it makes sure that search engines consolidate their signals on the version you choose, rather than splitting them across the duplicates.

When you add canonical tags, don’t change anything else about the web page. This could cause problems for both visitors and search engines.

Use Page Improve to check if your canonical tags look good.
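
A canonical tag is a single line in the page’s <head>; for example (the URL is a placeholder):

<link rel="canonical" href="https://www.example.com/yellow-socks/" />

Every duplicate variant of the page – print versions, URLs with tracking or session parameters, and so on – should point to this one preferred URL.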

Client-side rendering versus server-side rendering

With client-side rendering, the server sends a minimal HTML shell and the page content is assembled in the visitor’s browser by JavaScript. Googlebot can execute JavaScript, but it typically renders pages in a second pass, so content that only appears after scripts run may be picked up late – or, if the scripts fail, not at all. That can cost you rankings and potential conversions.

Server-side rendering is different. The server builds the complete HTML before sending it, so Googlebot (and every visitor) receives the full content immediately, without having to execute JavaScript first. This makes it much easier to guarantee that what Googlebot sees matches what you want to rank.

You can choose one method over the other – or a hybrid of the two – depending on your needs.

SEO Server log files | Google’s Crawl Budget

Log files are important tools for analyzing what crawlers do on your site. You can use a tool such as Screaming Frog’s Log File Analyser to check your log files for crawl errors. If you notice anything unusual, it could mean that someone is trying to attack your server or that your server is down.

Indexing depends on crawlers being able to find and fetch the pages on your website. Crawlers waste effort on requests for things that don’t exist – broken images, missing files, malformed HTML tags. Fixing problems with these items will make your website easier to index.

Search engines also don’t weigh every single word or phrase on a page equally, so don’t rely on one exact-match keyword. Using synonyms and related terms for your target phrases will help ensure that your site gets indexed for the right topics.

There are many ways to improve the indexability of your website. For example, adding Schema markup helps search engines understand what information exists on your site. Adding structured data makes your site easy to crawl. And making sure that your URLs are unique will help prevent duplicates from being crawled.

Audit your redirects.

If you’ve been running paid traffic campaigns for some time now, it’s likely that you’ve already optimized your site for conversions. However, there are still many things that you might overlook. One such thing is your redirects. Redirects are used to send people to different URLs once they arrive on your site. This is especially important if you want to improve your conversion rates.

Before we start talking about how to audit your redirects, let’s take a look at what they do. A redirect is basically telling the browser to go somewhere else. For example, if someone requests www.example.com/old-page and that page has moved, the server can respond with a redirect that sends the browser on to www.example.com/new-page. Without the redirect, the visitor would eventually hit a 404 error.

The most common reason to use redirects is to move people from one page to another. In fact, you probably don’t even notice that you’re doing it. You just assume that the visitor will see the same experience regardless of where they land on your site. But that’s not always true – and as you can imagine, bouncing people from one URL to the next can cause problems if it’s done carelessly.
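
Under the hood, a permanent redirect is just an HTTP response. A sketch of what the server sends back (the URL is a placeholder):

HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/new-page

The browser (or crawler) then requests the URL in the Location header; a 301 also tells search engines to transfer the old URL’s signals to the new one.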

Fix HTTP errors.

HTTP status codes are used to communicate information about how a server handled a request. They’re important because they help you understand whether something went wrong during a request. If it did go wrong, you’ll want to know exactly what happened. You might even want to take action based on the error code.

The most common HTTP error codes are listed here, along with some examples of each one.

404 – Not Found

This indicates that the requested resource cannot be found. It happens when someone tries to access a file that doesn’t actually exist – for example, requesting myfile.txt when no such file is on the server. The server responds with a 404 Not Found status code instead of the page.


Redirect Chains

A chain of redirects can be problematic for search engines. If you are building out a website, it’s important to avoid creating a chain of redirects. Redirect chains slow pages down, waste crawl budget, dilute the signals passed through links, and can lower search rankings.

Search engines don’t want to crawl through multiple URLs to reach every single URL on your site. They want to find what is relevant at each individual URL. When you build out a website, make sure you aren’t accidentally creating a chain of redirects.

Avoiding redirect chains can help improve your search rankings. You can use a site crawler such as Screaming Frog (covered in the weekly tasks later in this document) to trace your redirects, and tools like Moz’s Open Site Explorer to see where links into your site are pointing.
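
To illustrate, a chain forces several requests before the content loads, while a flattened setup resolves in one hop (the URLs are placeholders):

Chain:     http://example.com/page -> https://example.com/page -> https://www.example.com/page/
Flattened: http://example.com/page -> https://www.example.com/page/

When you change a URL, update any existing redirects so every old address points straight at the final destination.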

Webmaster tools

To check if there are any problems with your robots.txt file or sitemap, you should use the Webmaster Tools. This is a free online resource that allows you to submit your site map directly to Google without having to upload it manually. Here, you will find instructions on how to add your site map.

When submitting a sitemap, note the difference between the two common formats: an XML sitemap is a machine-readable file intended for search engines, while an HTML sitemap is a normal webpage designed for human visitors. In addition, you can provide extra information about your sitemap, such as the URL where it is published.

You can also specify whether you want to make changes to your existing sitemap or start over. After you’ve finished adding your sitemap, simply hit “Submit.” Your sitemap will now be visible in the section “My Sitemaps,” under the heading “Sitemaps.”

In the next step, you’ll see a list of URLs that Google found in your sitemap. These URLs represent different parts of your site. For example, the homepage might contain a link to one of your articles. A category page might include a link to a specific product. And a blog post might mention another article. Clicking on each of these URLs takes you to the corresponding part of your site.

If you’re looking for something specific, you can filter the sitemap by domain name, date range, language, or even keywords. Simply enter some words into the box above the list of URLs. For example, if you wanted to know what pages contained the word “blog,” you could type “blog” into the search bar.

After you’ve selected the relevant URLs, you can view detailed statistics about the number of times each URL has been crawled and indexed. You can also download the sitemap itself, which contains all the URLs that you’ve added to your account.

The Webmaster Tools allow you to monitor crawl errors, track search performance, and manage redirects, with the main reports covering crawl statistics, search performance, and sitemaps.

Broken Links (404)

A broken link is one of the most common reasons why people lose interest in a particular website or blog. If you’re wondering how many broken links you have on your website, there is a simple way to check: run a crawl with a link checker such as Screaming Frog or Xenu (both covered later in this document), or review the crawl and indexing reports in Google Search Console. These will surface 404 errors, redirect problems, robots.txt issues, and invalid sitemaps.

Separately, Google’s Disavow tool lets you tell Google to ignore spammy inbound links pointing at your site so that they don’t count against you. It doesn’t fix broken links itself – use it only after you’ve compiled a list of bad links that you can’t get removed.

Server Errors (5xx)

A server error occurs when a web server fails to respond properly to a request. This could mean that there are problems with the software running the server, or that the server is overloaded or down entirely. A common example of a 5xx response code is 500 “Internal Server Error”. (Note that “Not Found” is 404, a client error, not a server error.)

HTTP status codes fall into classes: 2xx Success (such as 200 OK), 3xx Redirection, 4xx Client Error, and 5xx Server Error.

HTTP Status Code | Description

200 OK | The request succeeded and the response contains the result

201 Created | A new resource was successfully created

202 Accepted | The request was accepted by the origin server for processing, but processing is not yet complete

203 Non-Authoritative Information | The response was transformed by a proxy, so it may differ from the origin server’s original response

SEO CHECKLIST & REGULAR TASKS 

A working document of checklists and regular tasks for improving your website visibility over time. 


One-time SEO Activities 


These tasks should be completed once the site has gone live. They only need to be done once, ideally immediately after the site has been developed, and certainly before any search engine spiders (bots) have indexed your site. 

Activity Checklist

# | TASK | COMPLETED | DATE 

1 | Install Google Tag Manager | | 
2 | Create a Google My Business page | | 
3 | Perform a site audit (if already live) | | 
4 | Optimise for SEO | | 
5 | Switch to HTTPS | | 
6 | Ensure the site complies with GDPR | | 
7 | Create Privacy Policy, Copyright and T&Cs pages | | 
8 | Subscribe to SEO news publications | | 
9 | Claim your social media pages | | 
10 | Add heatmapping services | | 
11 | Add an XML sitemap and submit to Google Search Console | | 
12 | Ensure responsive design | | 
13 | Create audience personas | | 
14 | Become familiar with Google Webmaster Guidelines | | 

One-time SEO Activities – Details 

  1. Install Google Tag Manager 

Google Tag Manager 1 will allow you to quickly and easily update measurement codes and related code fragments, collectively known as tags, on your website. 

With many different code snippets from third parties needing to be installed on a web page, it can get overwhelming trying to keep track of the tags. This is where Google Tag Manager comes in useful. It’s a web-based interface designed to simplify the process of managing and adding new tags. You don’t need to edit the source code to edit, add or manage tags – you can do it all through Google Tag Manager. 

Once installed 2, it’s simple to deploy Google Ads, Analytics and other third-party tags such as Crazy Egg, Adroll, comScore etc. 
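
As a rough illustration, the container snippet that Tag Manager asks you to place high in the <head> looks like the sketch below. Copy the real snippet (with your own GTM-XXXXXXX container ID) from the Tag Manager interface rather than this simplified version:

<!-- Google Tag Manager (simplified sketch) -->
<script>
(function(w,d,s,l,i){
  w[l]=w[l]||[];w[l].push({'gtm.start':new Date().getTime(),event:'gtm.js'});
  var f=d.getElementsByTagName(s)[0],j=d.createElement(s);
  j.async=true;j.src='https://www.googletagmanager.com/gtm.js?id='+i;
  f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXXXXX');
</script>

It seeds the dataLayer and loads the container script asynchronously, after which all other tags are managed from the web interface rather than in your source code.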

  2. Create a Google My Business page 

Google My Business is a free tool for businesses to manage their online presence across Google, including Search and maps. It allows you to easily connect with  customers by letting you manage how your business appears on Google. 

Also referred to as GMB, it can be considered to be similar to online business directories such as Yelp, Yell or Thomson Local. Unlike these business directories,  GMB is far more comprehensive, allowing you to manage many different aspects of your business listing, including: 

  a. Google Maps location – shown in the Google Maps apps on various devices such as smartphones, tablets and desktop PCs 
  b. Google Knowledge Graph – the snippets of essential data shown in the Google search results when someone searches for your business 
  c. Google Local Pack – the block of matching local businesses shown on search engine results pages (SERPs) for a search phrase 
  3. Perform a site audit 

Use a tool such as Screaming Frog or Xenu (see the regular weekly SEO tasks for details), which enables you to check for common problems before going live (and being indexed). 

This is an important task, which will allow you to: 

  a. Scan your site for technical problems 
  b. Find duplicate content 
  c. Discover broken links / pages 
  d. Check page loading times 

  e. Discover pages that are blocked by robots.txt 
  f. Identify issues that may affect SEO 

1 https://www.google.com/intl/en_uk/business/ 

2 https://tagmanager.google.com/

  4. Optimise for SEO 

Once a site audit is complete, you can then optimise the site for SEO before launch. Once a site has gone live and has been indexed by the search engines, it  can be difficult to undo or repair bad pages quickly, especially if the URL is changed. 

Before submitting to Google Search Console: 

  a. Run link checkers (see 3. above for suggested tools) 
  b. Add XML sitemap and submit to Google Search Console and Bing Webmaster Tools 
  c. Use H1-H6 tags where appropriate (use only one H1 tag per page) 
  d. Complete preliminary keyword research: 
     i. Discover primary keywords 
     ii. List similar semantic keyword phrases 
     iii. List long-tail keyword variations 
     iv. Find out what search phrases people are actually using 
     v. Use ahrefs 3, KeywordTool 4 or similar service for keyword research 
  e. Obtain first relevant backlinks from related websites 
  f. Add structured data markup 
  g. Create and add a robots.txt file 5 
  h. Set up Google Search Console 6 
  i. Install Yoast 7 (if using WordPress platform) 

  5. Switch to HTTPS 

As well as securing your users’ data and protecting against malicious attackers, adding an SSL (Secure Sockets Layer) certificate to your site has an added benefit for SEO – it can help a site to rank higher in the search engines. 

As Google analyst Gary Illyes explains, “If all quality signals are equal for two results, then the one that is on HTTPS would get the extra boost”. Anything that can be done to create a better experience for users will help rankings. 

3 Ahrefs: ahrefs.com  

4 KeywordTool.io 

5 https://support.google.com/webmasters/answer/6062596 

6 https://search.google.com/search-console/ 

7 https://yoast.com/

  6. Ensure the site complies with GDPR 

It’s crucial that you comply with the GDPR (General Data Protection Regulation). Businesses that are non-compliant risk fines of up to £17m (or 4% of global turnover), as well as a loss of credibility and reputation. 8 

GDPR has been in place for some time now. It’s a misconception that very small businesses are exempt from complying with the regulations. Almost all businesses that store or process ANY personal information about citizens living in the UK (and the EU) need to comply. Even if you’re just a micro-business and are collecting and storing customers’ names or email addresses, you need to comply with GDPR. If you’re the owner of a business, charity, start-up or organisation, the responsibility lies with you to ensure that you take steps now. 

If you need to check whether your business needs to comply with GDPR, I built a Risk Calculator on my website, https://www.gdprquick.com/risk-analysis/

  7. Create essential pages that send trust signals: 

Low-quality and spammy websites often have one thing in common – they shy away from publishing information that suggests they are complying with local and national legislation. Often, these sites fail to rank in the search engines, so take the time to ensure you have the following essential pages on your website: 

  a. Privacy Policy (and GDPR statement) 
  b. Copyright Policy 
  c. Terms and Conditions of Use 
  d. Frequently Asked Questions 
  e. Delivery and Returns Policies 

  8. Subscribe to SEO news publications 

This will help you to keep up to date with fast-moving changes to search trends, algorithms and updates. Suggested publications include: 

  a. Search Engine Journal 
  b. Search Engine Land 
  c. Search Engine Watch 
  d. WebmasterWorld 
  e. Google Webmaster Central 

  9. Claim your social media pages 

There’s a huge benefit to claiming your business’ social media pages. Primarily they are an excellent way for potential customers to express an interest in your  business and to reach out and communicate with you. This in itself is reason enough to claim and optimise your social media pages. 

In addition, they provide extra inbound links to your website, which can help drive traffic. Although social shares do not have the same weight as an inbound link from an authoritative website, they can amplify awareness of your products and services, extending your reach and brand awareness. There’s no need to claim your page on EVERY social media platform; select the ones that are most appropriate for your target audience. 

8 https://www.gdprquick.com/

  a. Facebook (best for B2C) 
  b. Twitter (B2B and B2C) 
  c. LinkedIn (best for B2B) 
  d. Others – where appropriate 

  10. Add heatmapping services 

Although this is not an activity designed to boost SEO, heatmapping tools will help you find out what your first users actually do on your website, to determine  how effective your user experience is, and to understand more about how your website visitors navigate through your site. 

By recording the actions of visitors to your site, you can see how users interact with your site, enabling you to optimise it for a better experience, increased  interactions (which can help ranking), as well as higher conversion rates. 

Although there are a number of heatmapping tools, some of the best ones include Hotjar 9, Crazy Egg 10, Mouseflow 11, and Inspectlet 12

  11. Add an XML sitemap and submit to Google Search Console 

An XML sitemap acts as a roadmap of your website, and informs search engines about the different pages that exist on your website that are available for  crawling and indexing in the SERPs (search engine results pages).  

It’s important to submit an XML sitemap to Google Search Console so that Google is aware of the pages on your website that you consider worthy of being indexed. However, be aware that although you might include every page in your XML sitemap, Google’s algorithms decide which pages will actually be included in the index. An XML sitemap does not guarantee ranking positions. 

  12. Ensure responsive design 

Responsive design means that your website is rendered consistently across a variety of different devices and screen sizes, including smartphones, tablets,  laptops and desktop PCs. 

With the exponential rise of different devices, a significant proportion of your visitors will be viewing your site on their mobile phones. Ensuring that your  website is using responsive design will help your users by automatically scaling its content to match the screen size.  

9 https://www.hotjar.com/ 

10 https://www.crazyegg.com/ 

11 https://mouseflow.com/ 

12 https://www.inspectlet.com/


Responsive design is now a confirmed ranking factor for mobile search. The mobile version of your website is now used by Google to review and rank your site,  so it’s important to optimise your site for mobile users if you want to rank for any given keyword phrases. 

  13. Create audience personas 

Audience personas are fictional characters that represent your ideal target market. They are usually 1-page documents that detail a typical customer, and  include their name, gender, job title, business type, age, motivations, needs, pain points and challenges. 

They’re important when producing a new website as they help you to develop a site that meets their fundamental needs. By understanding who your audience  are, you’re able to create a site that is highly engaging and persuasive, which will increase your conversion rates. 

If you need help creating your audience personas, please email us and we will send you a template with directions. 

  14. Become familiar with Google Quality Guidelines 

It’s important to pay close attention to Google’s Quality Guidelines 13. Some bad practices and tactics may lead to a website being removed entirely from the Google index, or to it being negatively affected by an algorithm update or manual spam action. 

Read the guidelines at https://support.google.com/webmasters/answer/35769, which contain a list of basic and specific principles that will ensure the website  is not penalised: 

  a. Make pages primarily for users, not for search engines. 
  b. Don’t deceive your users. 
  c. Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you, or to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?” 
  d. Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field. 

13 https://support.google.com/webmasters/answer/35769

Six-Month Weekly SEO Tasks 

Try to perform the following SEO tasks each week, ticking them off when they are completed. 

Activity Checklist

# | TASK | TICK OFF EACH WEEK (WEEKS 1–27) 

1 | Perform site crawl | 
2 | Log into Google Search Console | 
3 | Create/update backlink report | 
4 | Measure channel performance | 
5 | Discover high-change pages | 
6 | Analyse search competitors | 
7 | Monitor top referrers | 
8 | Test page loading times | 
9 | Ask customers for site feedback | 
10 | Monitor traffic trends | 
11 | Social media posts | 
12 | Write down actions taken | 
13 | Monitor lost links | 
14 | Links from brand mentions | 
15 | Write at least 1 blog post | 
16 | Write a guest blog post | 
17 | Get 3 new directory backlinks | 
18 | Distribute a press release | 
19 | Connect with influencers | 

Six-Month Weekly SEO Activities – Details 

  1. Perform site crawl 

Understanding how a website is structured gives you an incredibly good insight into improvements that can be made to help boost SEO. As well as helping you to identify problem areas such as broken links, it also helps you to visualise how PageRank and Domain Authority trickle through your website, and how you can shape that flow to give extra weight to deeper pages. 

Use software such as Xenu Link Sleuth 14 or Screaming Frog 15 to give you a bigger picture of the site, and to: 

  a. Discover broken or missing URLs 
  b. Highlight problematic or duplicate meta tags and headings 
  c. Highlight duplicate content 
  d. Check canonical tags 
  e. Check robots tags (Screaming Frog only) 
  f. Find new pages that need to be added to the sitemap 
  g. Look for any page that isn’t showing a “200 OK” status 

  2. Log into Google Search Console 

Google Search Console (GSC) is a free service from Google that helps website owners to maintain, monitor and troubleshoot their website’s presence in the  Google Search results pages (SERPs). 

Previously named “Google Webmaster Tools”, GSC is a useful tool that allows you to learn a huge amount about how Google sees and ranks your website, as well as who is using it. Logging in will allow webmasters to check indexing status and optimise the visibility of their websites: 

  a. Confirm that Google can find and crawl new pages 
  b. Fix indexing problems and request (re)indexing of new or updated content 
  c. View Google Search traffic data for your site: 
     i. Is traffic in line with trends? 
     ii. Are there any spikes (negative or positive)? 
     iii. If so, what are the reasons for spikes? Is there anything you can replicate/fix? 
  d. Check for alerts when Google encounters indexing, spam, or other issues 
  e. Show which sites link to your website 
  f. Troubleshoot issues for AMP and mobile usability 

14 Xenu Link Sleuth: http://home.snafu.de/tilman/xenulink.html 

15 Screaming Frog: https://www.screamingfrog.co.uk/seo-spider/

  3. Create/update backlink report 

Backlinks are incredibly important for search engine optimisation because they are the ‘backbone’ of the internet. They’re important because each inbound  link (or backlink) is essentially a ‘vote of confidence’ for your website. Therefore, the more backlinks you have, the higher the chance of a good ranking in the  SERPs.  

Although in reality it’s not quite this simple, the fundamental principle of backlinks holds true. Sadly, not all links are equal. Backlinks originating from spammy, poor-quality or off-topic sites can carry no weight – or worse, they can penalise your website. So, conducting regular backlink reports is essential. 

A backlink report will enable you to report on the status of inbound links to your site to show the growth in links, and: 

  a. Identify low quality websites and/or pages that need to be removed or disavowed in Google Search Console 
  b. Download the backlink data for each week (or month) 16 

  4. Measure channel performance 

As well as reporting on backlinks, there are other key metrics that need to be measured regularly to find out whether your SEO activities are effective. Producing reports on major KPI trends will indicate whether the SEO activity is having an impact on the success of the site, including: 

  a. Site visits 
  b. Bounce rates 
  c. Time on site 
  d. Conversions from organic SEO and paid SEM activities 
  e. Keyword rankings (for selected keyword phrases) for Google and Bing 
  f. Top queries 

  5. Discover high-change pages 

Take a deep look into which pages saw the largest positive and negative traffic changes. These high-performing pages can give you clues about what factors  caused the page to become successful. They may have attracted a certain number of high-quality links from other websites, or cover a topic that is less  competitive, have a particularly useful infographic, or may be popular on social media. 

Look for common factors that can help you replicate (or avoid) other optimisations across other sections or pages on your website. 

16 Recommended tool – Majestic SEO (Site Explorer > Raw Export > Download all Backlinks)

  6. Analyse search competitors 

How are other competitors performing with their SEO activities? When you begin your SEO, list the top five competitors in your industry that you will be  competing with, and monitor the following metrics: 

  a. Ranking changes for major keyword phrases 
  b. New keywords 
  c. Site structural changes (monthly) 
  d. Discover new links to competitors (with ahrefs 17) and reach out to the website to ask if they can include your website 

  7. Monitor top referrers 

Keep track of the major websites that are sending traffic to your website, and monitor the following metrics: 

  a. Number of unique visitors 
  b. Conversion rates 
  c. Broken or removed links 

  8. Test page loading times 

Back in 2010, Google announced that page loading speed had become a factor in search rankings. This made sense, because speed has a significant impact on user experience (UX), something that Google has advocated for years. 

“A search result for a resource having a short load time relative to resources having longer load times can be promoted in a presentation order, and search  results for the resources having longer load times can be demoted.” – Google  

Fast pages subsequently saw a boost in search rankings (although not across the board). Slow loading pages may not rank as high as a page that loads under 3  seconds. Ideally, you’ll want every page to load in 3 seconds or less. Run regular loading time reports on the following key pages: 

  a. Home page 
  b. Product category page 
  c. Search page 
  d. Contact us page 
  e. About us page 

  9. Ask customers for site feedback 

This insight can be incredibly useful. As a website owner, it’s difficult to keep monitoring your website for issues that affect your audience. Most users are incredibly willing to offer their feedback – if you ask them for it. 

17 Ahrefs: https://ahrefs.com/

If you are in contact with your customers always try to ask for their feedback (either by email or using an on-page feedback tool such as GetFeedback,  SurveyMonkey, Get Satisfaction etc.), in particular their experience with: 

  a. Loading times 
  b. Usability 
  c. Errors encountered 
  d. Trust factors and transparency 
  e. Checkout process 

  10. Monitor traffic trends 

By keeping a regular report on daily, weekly and monthly trends, you are eventually able to produce a report that will help you to see how seasonality affects  traffic and conversions. 

If you don’t monitor traffic trends you’ll be confused when you see spikes – and may attribute the cause to something other than the real reason. For example,  traffic to one of my websites (BritEvents.com) spikes around the 28th October, and lasts about a week. From my investigations I realised it was due to people  searching for events for Halloween and Bonfire Night. This same spike occurs each year, and has done for more than a decade. Understanding these trends  helps you to create relevant content in a timely way to maximise traffic volume.  

  11. Social media posts 

Although social media doesn’t directly contribute to SEO, it helps build authority, making it an important SEO activity. Social signals may not help a website to rank higher in the search engines, but they do have a positive effect on brand exposure and recall, which (it has been suggested) may contribute towards a ranking signal. Many high-ranking websites have strong social signals, so it can be of benefit to post regular engaging content to Facebook, Twitter, LinkedIn etc. 

Tips: 

  a. Try to vary the backlinks from social media – link to various pages on your website, not just the home page. Link to deeper pages too. 
  b. Encourage interaction on your posts. More interaction means more eyeballs, so try to engage your audience by asking questions, sharing images, and creating polls. 

  12. Write down actions taken 

Whenever an SEO activity or change is complete, write down the activity undertaken along with the date that it happened. If there are any significant changes (either negative or positive), then you’re able to undo – or replicate – the activity. 

This is important, but easy to forget. Around 8 years ago I made a significant change to one of my websites (traffic-update.co.uk) and didn’t make a note of what changes I’d made. In total, there were around 30 huge changes made in less than a week. One of the changes I’d made resulted in the site losing its top-5 ranking for thousands of pages. However, because I didn’t make a note of the changes I’d made, I was unable to attribute it to a particular change. The site practically disappeared from Google for almost six months. 

  13. Monitor lost links 

Losing links happens to every website for numerous reasons, so it’s important to keep track of which links have been lost. You’ve spent so much time building  links to your website, which is why it’s incredibly frustrating when those links disappear without any warning. It happens. But why? 

Although it’s impossible to find out why some links are lost, there are some common reasons: 

The website owner has removed your link from their website; 

The linking page redirects to another page (301 redirect); 

The linking page no longer exists on the website (404 error); 

The linking page is no longer indexed in the search engine. 

Where possible, try to reclaim lost links so that there is not an impact on your SEO. You can find lost links by using tools such as “Site Explorer”, which will give  you a list of backlinks that have been lost over time. 

  14. Links from brand mentions 

When a third-party website mentions your brand, they may or may not link to your website. The goal here is to turn a brand mention on a website into a link – or at the least try to accompany the brand mention with a link somewhere close to the mention: 

  a. Set up a Google Alert for your brand name 
  b. If your brand is mentioned, Google will send you an alert by email 
  c. Check the page to ensure that they have included a link 
  d. If there is no link, get in touch with the webmaster or site owner to ask them for a link. 

  15. Write at least 1 blog post 

As well as providing fresh content for both your users and the search engines, blog posts are very useful for your internal backlinking profile. You can use blog  posts to boost the number of links to deeper pages which may not get much prominence or visibility in the search engines. 

  a. Where possible, try to include the link within the blog post itself 
  b. Use appropriate anchor text, but don’t try to overload the anchor text with too many keywords, as it may appear spammy 
  c. Surround the link with semantically related content 

  16. Write a guest blog post 

If there is a particular keyword phrase that you want to rank for, write a unique blog post that is designed to be published on a third-party website. The  purpose of this post is to gain a link from a related site. 

Tips: 

  a. Many webmasters won’t allow anchor text within the body of the text, so include an author bio with a link instead 
  b. Don’t send the guest blog post to more than 2 or 3 webmasters at once. The idea is to get that article published on one site only, not a number of websites. There should only be one version of that blog post on the web. Any more will look spammy. 

  17. Get 3 new directory backlinks 

The web is littered with online directories. Most of them are low-quality and won’t provide any SEO benefit. However, some free and paid directories are better quality, and can help your backlink profile. The goal is to get 3 new links from quality online directories. Consider Directory Maximizer 18, which I have been using for several years and which has helped raise my backlink profile – and subsequently my rankings – for various websites. 

Important note: 

  a. It’s important to keep away from any directory that appears to be low quality or spammy, which can have a negative impact on your rankings. 

  18. Distribute a press release (monthly) 

Press releases can be a very valuable source of high-quality backlinks, which can have a huge impact on rankings. However, they take time to produce and  distribute, and may need specialist advice. Aim to distribute one newsworthy press release per month.  

Always remember: 

  a. Include a link to your website in press releases. If the media outlet allows it, try to get the link within the body of the press release. If this is not allowed, include a link in the author bio. 
  b. Consider your target keywords carefully, and craft your press release to include them. However, don’t overdo it because it’ll look as though it’s been written for search engines, not humans. 
  c. Think about the audience. Create compelling content backed up with facts, figures and statistics. 
  d. Promote your press releases through social channels too. 

18 Directory Maximizer: https://www.directorymaximizer.com/


Whilst Creating a New Page 


These tasks should be completed each time a new page is created, ideally before any search engine spiders (bots) have indexed the page. 

Activity Checklist

# | TASK | COMPLETED | DATE 

1 | Optimise the page for a single search query | | 
2 | Optimise the URL | | 
3 | Front-load tags | | 
4 | Use a single H1 tag | | 
5 | Link to external resources | | 
6 | Optimise images | | 
7 | Use semantic phrases | | 
8 | Add schema markup | | 
9 | Write for humans, not robots | | 
10 | Optimise readability | | 
11 | Use the E.A.T. acronym | | 

Whilst Creating a New Page – Details 

  1. Optimise the page for a single search query 

When creating a new page, ensure that you only focus on one search query. For example, if you are creating a page about yellow socks, make the whole page  about yellow socks alone. Do not optimise for any other search query, e.g. “black socks” – make a new page instead.  

Always try to remember to include variations on the search query, for example: “men’s yellow socks”, “yellow hold-ups”, “Acme socks yellow” etc, and  incorporate them into your page copy. 

Also, research search intent – i.e. why users are searching for “yellow socks”. If you produce content that meets the user’s search intent, then you stand a better chance of ranking higher for their search query. 

There are three types of search intent; ensure that you weave in copy that reflects what the user is searching for: 

  a. Informational – when the user is looking for specific information, such as “why are yellow socks better than blue socks” 
  b. Navigational – when the user is searching for a particular type/brand of yellow socks, e.g. “best price Gucci yellow socks” 
  c. Transactional – when the user intends to buy a product, e.g. “buy yellow socks” 

  2. Optimise the URL 

The URL structure is important, so spend time thinking about how you will be optimising it. If you’re using a WordPress CMS, a particular URL will be suggested  for you, e.g. 

www.mysocks.com/socks/gucci/yellow/ 

This is a good URL structure to use. WordPress automatically generates “post slugs”, which are URL variations based on the title of each post. However, you will need to ensure that you turn on custom permalinks to use this feature. If you are using a custom-built CMS, try to use a similar URL structure to the above. 

Avoid using any type of query string that does not include the keyword phrase, such as: 

www.mysocks.com/page.php?id=12345 

Where possible, think about structuring the URL to include the primary keyword phrases that people are using to find what they are looking for. 

  3. Front-load tags 

For each page on your website, there will be certain “meta tags” that are used to tell the search engines more information about each page. Although Google and Bing are very good at ascertaining the content of a web page, it’s still advisable to spend time creating keyword-rich meta tags, especially the “Title” and “Description” meta tags. 

A typical meta title/description tag will look something like this: 

<title>Yellow socks to buy from the UK's fastest-delivery sock supplier</title> 

<meta name="description" content="Buy yellow socks from the top-rated underwear store. Free overnight delivery."> 

When we talk about front-loading the tags, we place the most important keywords towards the beginning of the meta tags. Notice how the above two meta tags mention “yellow socks” towards the front of the text. This is called “front-loading”. 

  4. Use a single H1 tag 

A typical web page is made up of many different elements. One of these elements is the “header tag”, which looks like this: 

<h1>Yellow socks</h1> 

The Header tags are elements used by search engines to understand the structure of the text on a page better. Each header tag acts as a title or subtitle for the  following piece of text, and may range from H1 > H2 > H3 > H4 > H5 > H6. 

Although it’s not completely necessary to use a single H1 tag on a page, using more than one becomes less manageable. My advice for non-SEOers is to use  just one H1 tag (as the page’s title) and to use the H2-H6 tags as subtitles further down the page structure. 

Note: always try to include the primary keyword in the H1 tag, as well as including it in the H2-H6 tags, perhaps making use of the different types of search  intent to create subsequent tags. 

  5. Link to external resources 

Most SEO experts will agree that one of the most important (but underused) aspects of search engine optimisation is linking to other websites. After all, it’s  how the web was originally designed to work. 

However, many marketers won’t do this for fear that it will damage their search rankings, or cause visitors to leave, or even harm their reputation. All these  fears are unfounded, and it’s been known for some time that linking out to other sites is a good thing. 

Search engines view links to external pages (i.e. webpages owned by third-parties) as being good for the end user, as long as you’re not linking to malware or  scammy websites. 

Of course, it would be unwise to link to a competitor’s website, but there is a lot of value in linking to reputable sources of information that is helpful to the  user. For example, a page about yellow socks could link to some research from Oxford University that claims yellow socks are better at keeping bunions at bay. 

By doing so, you’re demonstrating that you’re willing to help the end user with their purchase – and that can only be a good thing. 

  6. Optimise images 

When using images on your page, ensure that they meet the following requirements: 

  a. Size has been optimised – don’t upload 4MB images that take a long time to load on slow connections. Use one of the many online services and freeware tools to compress images to a more suitable size before uploading them. Most CMS systems automatically resize images as they are being uploaded, but if not, freeware such as JPEG Optimizer 19, Optimizilla 20, or Kraken.io 21 can batch-compress images before uploading. (See the markup sketch below for how the image itself is embedded.) 
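
Beyond file size, the way an image is embedded also affects SEO and loading behaviour. A minimal sketch (the filename and dimensions are placeholders):

<img src="yellow-socks-600.jpg" alt="Pair of yellow ankle socks" width="600" height="400" loading="lazy">

The alt text describes the image for search engines and screen readers, the explicit width and height prevent layout shifts while the page loads, and loading="lazy" defers off-screen images until they are needed.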
  7. Use semantic phrases 

Semantic SEO is the process where you build more meaning into the context of the words you use on your webpage. In the early years of the internet, search  engines used to evaluate a page based on the keywords found on a web page. For example, if a web page contained the phrase “yellow socks” repeatedly  throughout the copy, then Google used to rank you solely on the phrase “yellow socks”. 

Today, search engines are far more effective at determining a more general overview of what a page is all about. Algorithms were introduced so that the  search engines could ascertain the overall topic of a page, even if a primary keyword phrase is only used once – or in some cases, never mentioned. 

Google’s ‘Hummingbird’ algorithm was introduced to better determine the topic of any given page (or website). By using semantics, it’s able to see whether a  page should rank for a particular topic by analysing the relationships of words, their meanings, and the implication of those words. 

So, when writing copy for the web, to give the search engines a better understanding of the topic of the page (which can help with ranking), use related words  and phrases to build up a bigger picture. 

For example, a page about “yellow socks” could contain words and phrases such as: 

  a. Direct synonyms and alternative use names, such as “stockings”, “sox”, “pull-ups” etc. 
  b. Directly related phrases, such as “apparel”, “clothes” etc. 
  c. Indirect phrases, such as “feet”, “toes”, “warmth”, “comfort” etc. 

  8. Add schema markup 

Sometimes referred to as just “schema”, this is a range of tags placed within your site’s HTML that are used to help describe the information contained on a particular web page. 

19 http://jpeg-optimizer.com/ 

20 http://optimizilla.com/ 

21 https://kraken.io/web-interface

An example of schema markup may look like this: 

<div itemscope itemtype="https://schema.org/Product"> 
 <span itemprop="name">Yellow Socks</span> 
 <span itemprop="brand">Acme</span> 
</div> 

The major search engines (Google, Bing, Yahoo! and Yandex) collaborated to introduce the schema markup structure 22, allowing webmasters to describe the content of their webpages in a better way. Ultimately, this helps the search engines to better understand the context and content of the page. 

Many different types of content can use schema markup, including, people, places, businesses, events, and products. 

Although there is no conclusive evidence that it helps with a website’s rankings, there is plenty of evidence to show that search results containing  comprehensive schema markup enjoy a better clickthrough rate in the SERPs (search engine results pages). 
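
The same product can also be described with JSON-LD, the format Google recommends, which keeps the markup in one block instead of weaving it through your HTML. A sketch of the equivalent markup (values are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Yellow Socks",
  "brand": "Acme"
}
</script>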

  9. Write for humans, not robots 

There is a temptation for webmasters to write content that is designed for robots, rather than for humans. The idea is that if copy is written that influences the  search engines into ranking higher, then the trade-off is worth it; i.e. a website will enjoy more traffic. Although this is now old-hat, there are still plenty of  copywriters with SEO experience who will write copy that is stuffed with keyword phrases, such as: 

“Buy yellow socks from our yellow sock store today. Yellow socks are considered fashionable, even if you wear them with yellow socks hold-ups. Get your yellow  socks from YellowSock.com at a discounted price and get your yellow socks within 24 hours”. 

Today, search engine robots aren’t influenced by keyword density. They care about quality over quantity, so forget about writing for robots and create content that is designed to be understood by your real audience.

Once draft copy has been produced, it’s relatively easy to go back and add semantic phrases (see above) and related words that help search engines, whilst still  writing content designed to be read by your customers. 

5. Optimise readability

When we sit down to write content for our websites, it’s easy to overlook readability, especially if you’re used to thinking about the impact of SEO (keyword density, semantic indexing, meta tags etc.). Readability is often the last thing to be optimised (if at all).

Readability involves writing content at a level that is easy to understand and comprehend. But how do we know that a piece of content is easy to understand? There are several algorithms that have been designed to measure content readability. Interestingly, search engines also measure content readability using these algorithms (usually the Flesch-Kincaid readability score). Others include the SMOG Index, Coleman-Liau and Gunning-Fog. They use the number of syllables per word and the number of words per sentence to produce a readability score.

22 https://schema.org/docs/full.html
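For reference, the standard Flesch Reading Ease formula is:

Reading Ease = 206.835 − 1.015 × (total words ÷ total sentences) − 84.6 × (total syllables ÷ total words)

Higher scores mean easier reading; a score of 60–70 roughly corresponds to plain English that 13- to 15-year-olds can understand.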

Here are some tips to improve readability: 

1. Avoid writing content that contains buzzwords or jargon that may be difficult for the reader to understand. If you absolutely need to include jargon, abbreviations or technical words, always attempt to explain what they mean to the reader.
2. Write conversationally – as if you’re talking to someone over coffee. This will help comprehension too.
3. Use simple, readable typography (fonts) on your page. Choose a font that is easy to read, and avoid novelty fonts where possible.
4. Use short words – there’s no point in trying to impress audiences with your grasp of the English language if they don’t know what you’re trying to say.
5. Keep the number of words in a sentence to a minimum. Aim for between 9 and 12 words (or between 50 and 60 characters).
6. Use images, graphs or infographics. These can help break up the monotony of sections of heavy text.
7. Use a combination of short and long paragraphs for variation.
  6. Use the E.A.T. acronym 

In February 2019, Google confirmed that E.A.T. is an important part of their ranking algorithm. But what does it mean? 

E.A.T. stands for “Expertise, Authoritativeness, Trustworthiness”, and following it when producing new content gives your efforts a greater chance of resulting in a higher ranking. Google won’t rank your page on content alone, especially if you’re publishing information on a specialist subject such as health, finance or construction. If Google sends people to a page it considers low quality, it risks losing their trust too.

When producing new content / pages on your website, follow the E.A.T. acronym in the following way: 

1. Expertise – can this page be considered “expert content”? Understand the search intent of your users and you will know what types of content your visitors are looking for.
  2. Authoritativeness – does the person who wrote this copy appear to have the expertise in this subject? Is the person referenced as an expert elsewhere  on the web? 
  3. Trustworthiness – Are there any negative reviews about the person who wrote this copy (or business behind it) on other websites such as Trustpilot,  Google My Business, Facebook etc? Consider ways that you can convey trust on your website: have a privacy policy in place, adhere to GDPR, publish  your contact details, ensure your site has an SSL certificate etc. All these are relatively easy ways to boost trustworthiness.

After Creating a New Page 

These tasks should be completed once a new page (blog post, product page, help page etc.) has been published, ideally before the search engines have indexed the  content. 

Activity Checklist

TASK | COMPLETED | DATE
Check for spelling errors | |
Examine meta tags | |
Test page loading times | |
Ensure page is mobile friendly | |
Check for duplicate content | |
Ensure page is not blocked by robots.txt | |
Check canonical tag | |
Create inbound links to new page | |

After Creating a New Page – Details 

  1. Check for spelling errors 

If there’s one thing that can kill trust in a web page, it’s typos. The odd spelling mistake is perfectly natural, of course, but if your content is littered with errors,  then the visitor will lose trust – and purchase elsewhere. 

Charles Duncombe, CEO of JustSayPlease, found that a single spelling mistake affected his conversion rate – reducing it by an astonishing 80%. That’s an extreme example, but it highlights why you should always ensure that spelling mistakes are obliterated from your copy.

2. Examine meta tags

It’s a wise move to double-check your meta tags after your page has been published. There’s no need to run a special tool – just right-click on the webpage, select “View Source”, and inspect the meta tags for mistakes. Follow the guidelines for creating meta tags in this document.
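The key tags to verify are the title and the meta description. A typical well-formed example (the values here are illustrative) looks like:

<head>
  <title>Yellow Socks | MySocks.com</title>
  <meta name="description" content="Shop our range of comfortable yellow socks, with free 24-hour delivery.">
</head>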

3. Test page loading times

When you’ve just published a new page, run a speed test on it to see if there are any problems with loading times which can affect conversion rates. 

It’s common to forget to compress images before uploading, to add a script that causes a blockage, or – worse – to send the visitor on an endless 301 redirect loop. Run the page URL through Google’s PageSpeed Insights 23 to check for slow-loading elements.
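If you’d rather script this check than use the web interface, PageSpeed Insights also exposes an HTTP API (v5 at the time of writing); a minimal sketch, using a placeholder URL:

curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://www.mysocks.com/&strategy=mobile"

The JSON response includes the Lighthouse performance audits, from which slow-loading elements can be identified.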

4. Ensure page is mobile friendly

For reasons explained elsewhere in this document, it’s important that the page you’ve just published is mobile-friendly. Your visitors will be using a whole array of different technologies, devices and screen sizes to access your website, so ensure that it loads without issue on as many different devices as you can. Ensure all elements (especially important ones such as the ‘call-to-action’ button) are visible on the user’s screen across devices.
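One quick check in the page source is the viewport meta tag; without it, mobile browsers render the page at desktop width and shrink it down. The standard form is:

<meta name="viewport" content="width=device-width, initial-scale=1">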

5. Check for duplicate content

This isn’t going to be too much of a problem if you’ve sat down and written the content yourself, but if you’ve been given content to add to a webpage, it’s crucial to check that it isn’t at risk of being considered ‘duplicate content’. Duplicate content in itself isn’t a bad thing – contrary to the advice given by many SEOs, you’re not going to be penalised for it. However, search engines may decide not to index or display your page because the same content is considered to be available elsewhere on your site – or on another, third-party website.

Use CopyScape.com to see if the content you’ve published is completely original. 

6. Ensure page is not blocked by robots.txt

Once a website has gone live, your developer will typically add a small text file to the root of the domain to instruct web robots how – and what – to crawl and index. Although many SEOs will tell you that it is used to keep a web page out of Google’s index, that’s not really the case – robots can (and sometimes do) ignore the robots.txt file, even though most try to adhere to it. The primary reason for a robots.txt file (as outlined on Google’s support pages) is to avoid overloading your site with requests from robots.

However, note that robots.txt controls crawling rather than indexing: a ‘Disallow’ rule stops well-behaved robots from fetching a page, and a page that is never crawled will rarely rank. Over the years there have been many examples of frustrated webmasters spending weeks trying to work out why a page hasn’t been indexed by Google – only to discover that it had been blocked in the robots.txt file.
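As a minimal sketch (the paths are illustrative), a robots.txt file might look like this:

User-agent: *
Disallow: /admin/
Disallow: /checkout/

Sitemap: https://www.mysocks.com/sitemap.xml

To keep a page out of the index reliably, place a noindex directive in the page’s own HTML instead, e.g. <meta name="robots" content="noindex">.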

To check whether robots can crawl and index your new page, use Google’s robots.txt testing tool in Search Console 24.

23 https://developers.google.com/speed/pagespeed/insights/ 

24 https://www.google.com/webmasters/tools/robots-testing-tool

7. Check canonical tag

Just like the robots.txt file, the canonical tag used on a webpage can tell the search engines which version of a page to index, based on its ‘canonical location’. Webmasters use the canonical tag to specify their preferred version of a page, particularly when the same content can be accessed through multiple URLs.

For example, there may be two different URLs which both show the same content, such as: 

www.mysocks.com/product.php?item=yellow-socks 

www.mysocks.com/apparel/socks/yellow-socks/ 

To ensure that robots crawl and index the correct pages, always run the page through a canonical location checker (example tool in footnote 25). 
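In the example above, placing the following tag in the <head> of both URLs (reusing the document’s example address) tells search engines which version to treat as the primary one:

<link rel="canonical" href="https://www.mysocks.com/apparel/socks/yellow-socks/">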

8. Create inbound links to new page

The final task for a new page is to create at least one inbound link, so that it can be indexed quickly.  

As long as the page is linked from another page on the same website, no further action is needed. However, if your website is not crawled very often by search engine spiders, you can either submit the page through Google Search Console (and Bing Webmaster Tools) or – the preferred method – provide a link to it from another site that is crawled regularly.

This can be any website, but I recommend that the page containing the new link is on-topic, i.e. covers the same or similar subject matter. This could be a social media page such as a public Facebook post, a Tweet or a LinkedIn article, or another site with a high DA (domain authority).

25 https://www.seoreviewtools.com/canonical-url-location-checker/

Sales & Marketing skills that last a lifetime

Two respected experts in sales & marketing – Chris Haycock and Bruce King – have joined forces to share with you all the essential skills you need for your business to succeed.

Welcome to The Digital Market SG. Think of us as the Netflix™ of sales and marketing training. If you are running a business with revenue between £10,000 and £1 million, and are looking for inspiration, insights, lessons learnt, resources and growth strategies from down-to-earth, relatable businesspeople, then The Digital Market is for you.

Rather than hiring an expensive mentor, you can learn from TWO mentors for a lot less than the price of one!

Grow your skillset, grow your business

You can’t build a business without essential sales and marketing skills. If you’re a small business owner with a fantastic idea or product, the chances are that you don’t have the skills required to market your products effectively to the right audience. And you probably don’t have the best possible sales skills to turn enquiries into customers that keep coming back for more – time and time again.

That’s where Digital Market Singapore comes in. Our platform has been designed and coded from the ground up (no WordPress, Teachable or Udemy templates here) for the purpose of passing on our skills TO YOU. With these skills, ANYTHING is possible.

Technical SEO: What is it?

The goal of technical SEO is to improve a website’s technical standing so that search engines will rank its pages higher. The three pillars of technical optimisation are making a website quicker, easier to crawl, and easier for search engines to understand.

Technical SEO is crucial because it helps ensure that your website is user-friendly and free of technical problems that would prevent search engines from understanding and ranking it. Use technical SEO to attract organic traffic and convert that traffic into paying customers.