What is a Technical SEO Audit?
Anyone involved in digital marketing (or marketing in general) is familiar with some of the core concepts of search engine optimization. Keywords, links, blogs – all of these have moved from the realm of industry-specific buzzwords to common everyday language. This is rightly so, as they are important aspects of the overall SEO process.
Often overlooked, misunderstood, or flat-out ignored, though, is technical SEO. This is likely because it is the least like ‘classical’ marketing. Technical SEO shares a border with the worlds of programming and computer science, making it somewhat foreign to traditional marketers.
As a technical SEO agency, we know how complex this sector of search optimization can be. That’s why we’ve developed this in-depth guide to help you uncover what exactly technical SEO is, how to conduct a technical audit of your site, and why you should not be afraid of the word ‘technical’ in your marketing strategy.
What Are the Main Ranking Factors?
In general, you can break down the entirety of SEO, all of the hundreds of factors that the search algorithms take into account, into three brackets:
- On-Page – this includes the bulk of the classical marketing purview: your content, keywords, URLs, internal links, and the like
- Off-Page – the other pieces of the marketing mix: social media presence, external backlinks, e-mails, and the like
- Technical – the technology, programming, and coding side of your site
All three of these areas need to be monitored constantly to ensure that you are ranking as high as you can.
What Is Technical SEO?
Broadly speaking, technical SEO covers every aspect of your search engine ranking that is not user-facing. Search engines crawl your website periodically, identifying and measuring specific factors. These factors include everything from the way your page loads, to the ease with which a user can navigate around your website, to the use of keywords to highlight what you do and what you are trying to say.
Technical SEO is about making it as easy and as clear as possible for the search engines to conduct that crawl. For example, if the crawler cannot even access your site, then it does not matter how good the rest of your optimization practices are – you simply will not get ranked. If the crawler can access your site, but cannot interpret the information it finds there, then you will be similarly penalized in the rankings.
Think of your web presence like a car when it comes to your search engine optimization. The body and the paint are comparable to the keywords and backlinks: they are what attract you to it, initially. The user experience is your driving seat, pedals, steering wheel, etc. These are all very important requirements – but without the engine under the hood, the whole thing is without purpose.
Technical SEO is the engine that brings value to the rest of the work, so you need to check it frequently for any issues.
How Do You Know If Your Technical SEO Is Working for You?
There is no easy answer to this. It would be great if you could simply run a test and get a ‘yes’ or ‘no’ response, but the truth is that technical SEO can be incredibly complex. There are a lot of moving parts when it comes to your website running smoothly and in a manner that users find accessible, all while making sure that search engines like Google can easily access, crawl, index, and above all understand everything on your site.
You can start by conducting a technical audit. There are many different tools and plugins that you can use to help you, so there is no need to be nervous if you are not familiar with your website’s source code.
How Do You Conduct a Technical SEO Audit?
There are several crucial factors that make up the technical aspects of your site SEO. For a full audit, you will need to look at each and remedy any issues that you find. Our technical SEO audit checklist is a comprehensive list of the most valuable technical aspects. Work through this list and you should be in good shape.
- Is your content visible?
- How fast does your site load?
- Are you managing your crawl budget?
- Are you being marked as a duplicate site?
- Have you set up Google Search Console?
- Have manual actions been taken?
- Is your site mobile-optimized?
- Are there any coverage or indexing issues?
- Are you being indexed correctly?
- Do you have any broken links, dead pages, etc?
- Have you set up robot permissions?
- Is your sitemap accessible?
- Is your site secure?
- Does your site have malicious or negative backlinks?
- Is your structured data working properly?
That might seem like a lot, and the first pass can be fairly intense work. Once you have completed the technical SEO checklist once, however, future audits will mostly be a matter of confirming that everything is still working. Fixes should become few and far between.
Let’s look at what each step means, what you are looking for, and how to check that they’ve been done correctly.
Is Your Content Visible?
You can start the technical SEO audit by simply checking that your content is viewable. When you lay out your web pages and add content, you do so in a sort of silo – typically your content management system or web development environment. Get in the habit of checking things over from a visitor's perspective every time you hit publish.
The best practice is to visit your pages using a Chrome browser – Google uses a version of Chrome to perform its rendering checks. So, if your pages look right in Chrome, you should be good for Google (and it is Google you are optimizing for, for the most part, as it is by far the most popular search engine).
You can then go one step further – use the Chrome features to disable JavaScript and see how your pages are affected. If things go missing or are now incorrectly displayed, then you will need to adjust the back-end to account for it.
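If you would rather script part of this check, the sketch below is a minimal illustration: it fetches the raw, un-rendered HTML of a page and confirms that a key phrase appears in the source rather than being injected later by JavaScript. It assumes the Python requests library is installed, and the URL and phrase are placeholders.

```python
# Minimal sketch: confirm that important content exists in the raw HTML,
# i.e. before any JavaScript runs. URL and phrase are placeholders.
import requests

URL = "https://www.example.com/services"      # page to check (placeholder)
EXPECTED_PHRASE = "technical SEO audit"       # text visitors should see (placeholder)

response = requests.get(URL, timeout=10)
response.raise_for_status()

if EXPECTED_PHRASE.lower() in response.text.lower():
    print("Phrase found in the raw HTML - visible without JavaScript.")
else:
    print("Phrase missing from the raw HTML - it may depend on JavaScript rendering.")
```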
Checking for content visibility can also be done by accessing the cached version of the page. Simply go to Google and enter a search term that will bring up your website in the results. Then, click the three vertical dots next to the result, followed by the ‘cached’ button – this will bring up the version of your site that Google has stored. This is the page as it was the last time it was crawled. From there, you can visually check that everything is (or at least, was) in order.
If you have Google Search Console set up (and you should, more on that later), you can also use the URL inspection tool to assess the current status, check for any performance warnings and test your live page.
How Fast Does Your Site Load?
Site loading speeds are important both in terms of user experience (some reports suggest that a site that takes three seconds to load will have a 32% higher bounce rate than one that takes only 1 second), and your ranking.
Check your load speed with Google's PageSpeed Insights, and make sure that your scores for both mobile and desktop are "in the green". Measurements include factors like how long it takes for the largest element (often an image) to load, and how long before a visitor can interact with any features. Your report should show you any areas that need attention, so you can take appropriate measures to improve your speed, your ranking, and your visitor engagement.
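If you prefer to pull these numbers programmatically rather than through the web interface, the PageSpeed Insights API returns the same underlying Lighthouse data. The sketch below is a rough illustration only – the page URL is a placeholder, and the response field names are assumptions to verify against the API documentation.

```python
# Rough sketch: query the PageSpeed Insights API (v5) for mobile and desktop
# performance scores. Requires the `requests` library; the URL is a placeholder.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
PAGE = "https://www.example.com/"

for strategy in ("mobile", "desktop"):
    data = requests.get(API, params={"url": PAGE, "strategy": strategy}, timeout=60).json()
    # Lighthouse reports the performance category score on a 0-1 scale.
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"{strategy}: performance score {score * 100:.0f}/100")
```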
Are You Managing Your Crawl Budget?
Crawl budget (or crawl time) is essentially the number of pages on your site that a search engine will crawl through within a set period. This budget is in place for two key reasons. First, if a search engine is constantly crawling your site then it could affect performance for ‘regular’ visitors, as you only have so much bandwidth available at any given point. Second, the crawler robot itself has a limited amount of bandwidth. There are millions of websites to be identified, indexed, and ranked, so only a certain amount of crawl time can be allocated to each site.
Why is this important to you? If you are 'wasting' your budget on non-essential pages, then you may be missing out on having higher-value pages indexed. Pages like your terms and conditions, privacy policy, and '404 – page not found' errors eat into your budget, as do pages that take a long time to load or time out.
You can check your crawl activity very easily. Go to your Google Search Console and open the 'Crawl stats' report (found under 'Settings' in the current version of the console), and you will see the number of pages on your website that Google is crawling per day.
Are You Being Marked as a Duplicate Site?
If you have multiple URLs serving the same content that Google (or other search engines) does not interpret as the same page, then you might be competing against yourself for your ranking position. For example, https://www.digitalauthoritypartners.me and https://digitalauthoritypartners.me will both be competing for the same position if they are not correctly flagged to the web crawler as being the same page.
The fix is called canonicalization – you are telling the search engine algorithm which version of the page is the 'canonical' one, the master version if you will. This makes sure that your canonical URL is the one that is crawled, indexed, and crucially, ranked. There is a handy Google tool to help you with this if you do not want to go through the laborious process of analyzing your source code manually – it's called the Lighthouse plugin.
Using Lighthouse is easy – once you have installed the plugin (a free, open-source tool), simply go to 'Options', check the SEO box, then generate a report. When that has been completed, make sure that under 'Content Best Practices' the report reads 'Document has a valid rel=canonical'.
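For reference, the canonical declaration itself is just a single tag in the page's head. A minimal example, using the same domain discussed above, might look like this:

```html
<!-- Placed in the <head> of every version of the page (www, non-www, with or
     without tracking parameters), all pointing at the one master URL. -->
<link rel="canonical" href="https://www.digitalauthoritypartners.me/" />
```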
Have You Set Up Google Search Console?
Google Search Console, formerly Google Webmaster Tools, is an excellent, free SEO tool that anyone who owns a domain can use. Simply go to the console, log in with a Google account, and verify ownership of your website. You will then have access to some powerful optimization tools, including being able to confirm that Google can find and crawl your pages, request that pages are re-crawled (after you have published new content, for example), and view any highlighted issues.
Once Google Search Console is connected to your site, you can move on to the next step.
Have Manual Actions Been Taken?
Manual actions are those where an actual human being at Google has stepped in, reviewed your site, and highlighted issues. On the navigation panel of the search console, there is a header called ‘security and manual actions’.
Click this and see if any have been recorded against your site. There will rarely be any – the algorithm is relied on the vast majority of the time – but if there are, then you should get them resolved as soon as possible, as they will have a big negative impact on your search engine ranking position.
Is Your Site Mobile-Optimized?
Over the last couple of years, Google has moved to a mobile-first crawl/index model. This means that your site should have a mobile version that is well-designed and easy to use across all screen sizes.
This includes a whole range of technical factors, but the primary ones you should concern yourself with are page loading times, screen size parameters, and image/video rendering.
You can check how your mobile website performs in a few different ways. First, you can simply navigate your way around your site on your mobile device and see how it performs. Second, you can use Google’s test for mobile-friendliness, which will give you a score for how mobile-user friendly your site is, and highlight any specific issues that need your attention.
Another quick way to check for sizing issues is to load up your site on your desktop and resize the window. If the displayed content adapts to the new size of the window, then you are more than likely displaying well on mobile (although, this check will not tell you about image rendering or load speeds).
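Much of that adaptive behavior comes from responsive CSS combined with a viewport meta tag in the page's head. Most modern themes and page builders add this automatically, but if your site squeezes a full desktop layout onto a phone screen, it is worth checking that something like the following is present:

```html
<!-- Tells mobile browsers to match the device width instead of rendering a
     fixed-width desktop page and scaling it down. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```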
Two of the Google tools we have already mentioned can also be used here: Search Console and Lighthouse both have mobile checking features.
On the Google Search Console, you can go to ‘enhancements’ and then ‘mobile usability’ to find out if there are any issues that the crawler has identified.
Using Lighthouse, select 'Options', check the box for SEO, and run a report. This tool looks at the technical aspects of your mobile site more from a user's viewpoint. It measures things like how long it takes before the first content appears on screen, and how much time passes before navigation buttons and drop-down menus can be interacted with, for example.
Are There Any Coverage or Indexing Issues?
Indexing issues mean that your site or pages are not being crawled correctly (or at all), meaning you will not achieve any ranking whatsoever. You can use the Google Search Console again here (we did tell you it was an essential and powerful tool) by analyzing the number of valid pages, the noted errors, any warnings, and comparing them to previous audits.
If you have fewer valid pages than you did at your last audit – and it is not because you have removed content – then you have an issue. Likewise, if there are now more errors or warnings on your console than there were before (the ideal number being zero) then you have some work to do.
The console dashboard will tell you what the problems are – server errors, pages not found, etc – so you can make changes straight away. If you want to remove a page from being crawled/indexed, update your robots.txt (more on that later).
You can also go back to your Lighthouse report. Run it the same way as before and check that the sections for crawling and indexing read as ‘page isn’t blocked from indexing’ and ‘robots.txt is valid’.
If both of those checks pass, then the page is available for crawling and should be indexed correctly by Google.
Are You Being Indexed Correctly?
You can conduct a very quick check to see if your pages are being indexed correctly. Go to Google, and in the search bar, type ‘site:’ (without the quotation marks), followed by the domain name you want to check. When you hit enter, the results page will populate with each page that is indexed underneath that domain.
For instance, if you were to enter 'site:example.com', you might expect the results to show:
- https://example.com
- https://example.com/aboutus
- https://example.com/services
- https://example.com/contact
If there are too few results, then you have a problem with the robot not being able to find and index your pages. If you have too many results, then it is likely that you need to go back and audit your site for canonicalization.
Do You Have Any Broken Links, Orphan Pages, Dead Pages, Etc?
Broken links, orphan pages, and dead pages have two effects – they will negatively impact your search engine ranking position, and they will irritate and eventually force away your visitors. Each link on your page should go where it is supposed to, and when the new page is loaded, it should be populated with the content that your user would expect to see. There is nothing quite as annoying for a site visitor as trying to find information, load an article, or even make a purchase, only to be greeted with ‘404 – page not found’.
An orphan page is not linked to from anywhere else on your site, meaning that visitors can only reach it if they have the address or follow an external link. Web crawlers will struggle to discover and access these pages, and if they do, they will not be able to easily associate them with your other content.
You can manually check for broken links, orphan pages, and dead pages by clicking every navigation link, and loading every page on your site. This is a lot easier to do on a six-page website than a multi-product ecommerce site which could feasibly have thousands of pages. To automate this process using Google Search Console, go to the coverage report, and filter to highlight errors.
At the bottom of the screen, you should be able to see any non-indexed pages that have been flagged due to the 404 error code. Then you can choose to fix the problem by adding in the appropriate content, removing the page entirely, or removing it from the crawl.
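If you want to automate a quick spot check outside Search Console, a simple script can collect the links on a page and flag any that return an error status. The sketch below is illustrative only: it checks a single page (a placeholder URL), assumes the Python requests and beautifulsoup4 libraries are installed, and a real crawler would also follow links site-wide and respect robots.txt.

```python
# Rough sketch: list the internal links on one page and flag any that return
# an error status (4xx/5xx). The start URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

start_url = "https://www.example.com/"
html = requests.get(start_url, timeout=10).text
links = {urljoin(start_url, a["href"])
         for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)}

for link in sorted(links):
    if not link.startswith(start_url):
        continue  # only check internal links
    status = requests.head(link, allow_redirects=True, timeout=10).status_code
    if status >= 400:
        print(f"{status}  {link}")
```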
While you are looking at your internal site links, it is good practice to make sure that any single page is as close to the home page as possible. This allows your visitors to easily get to any page on your site from the home page in just a couple of clicks.
The more clicks it takes, the more complicated your site is perceived to be by the algorithm. This will affect your ranking as it portrays your site as not particularly easy to navigate, and therefore not very user-friendly.
Have You Set Up Robot Permissions?
Robot permissions are a way for you to manage the access to your pages that the search engine crawlers (robots) have. You can tell the search engine that you do not want a particular page to be crawled and indexed, and thus it will have no effect (negative or positive) on your ranking.
You can do this fairly easily using the robots.txt file, a plain text file that sits at the root of your website. You can find it by attaching the suffix /robots.txt to your root URL (the main home page for most sites – so www.digitalauthoritypartners.me, and not www.digitalauthoritypartners.me/services/seo, for example).
If there is no file there, then you will have to create one. You'll want to do so in Notepad or a similar plain-text editor rather than a word processor. You can then upload it through your domain provider or hosting service.
You can then use several simple terms to allow or disallow access to specific robot crawlers (or all of them). Bear in mind that the default position until told otherwise is that all robots can access all of your pages.
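As an illustration only (the paths below are placeholders and will differ for your site), a simple robots.txt might look like this:

```
# Rules apply to all crawlers ("*"); Disallow blocks a path from being crawled.
User-agent: *
Disallow: /admin/
Disallow: /cart/

# Pointing crawlers at your sitemap here is also good practice.
Sitemap: https://www.example.com/sitemap.xml
```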
A couple of things to note here:
- Robot permissions are not enforceable. Just because you have stated that you do not want any robots to index a particular page, does not mean that they cannot. You can rely on reputable search engines (Google, Bing, etc) to adhere to your request, but other robot crawlers may not.
- Using robots.txt simply means that a web crawler should not crawl and index the page. If you have other pages or content that points to that page then the search engine may still find and register the URL, just without any content attached to it. To keep a page entirely hidden, you should use password protection, or mark it as ‘noindex’ in the source code.
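That noindex directive is a single meta tag in the page's head, for example:

```html
<!-- Keeps the page out of the index. Note that the page must remain crawlable
     (not blocked in robots.txt) or the crawler will never see this tag. -->
<meta name="robots" content="noindex">
```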
Is Your Sitemap Accessible?
A sitemap is an XML file, usually found at yourdomain.com/sitemap.xml, that lists each page of your site along with details such as when it was last updated. It can be a fairly complex piece of work to write yourself. Thankfully, many web page-building programs will generate one automatically, but it is worth checking that it has been submitted to Google and/or other search engines.
Your site will be identified, crawled, and indexed with or without a sitemap, but having one helps search engines find and index new and updated pages more quickly, which supports your overall ranking efforts.
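For illustration, a minimal sitemap listing two pages (placeholder URLs) looks like the snippet below; in practice your CMS or an SEO plugin will usually generate and update it for you.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-15</lastmod> <!-- optional: when the page last changed -->
  </url>
  <url>
    <loc>https://www.example.com/services</loc>
  </url>
</urlset>
```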
You can check this on your Google Search Console, as well. On the navigation bar go to ‘Sitemaps’, and check if it has been submitted. You should also be able to see the last time your sitemap was crawled, which is handy if you are concerned that new pages have not been added to your analytics.
Is Your Site Secure?
The very first step in checking if your site is secure begins with the address bar. Does the address begin with HTTP or HTTPS? The presence (or lack thereof) of that ‘S’ will affect your ranking.
Search engines give greater weight to sites that have basic security measures in place, as these can be considered more trustworthy and user friendly.
You may have both HTTP and HTTPS sites, in which case you need to circle back to canonicalization to make sure that your secure site is considered the master. You can also set up a redirect from one to the other, to make sure that both crawlers and visitors reach the correct pages. When using a redirect, try to always use a permanent (301) redirect rather than a non-permanent (302), as it will give your site more authority with the algorithm.
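How you set up that redirect depends on your server or hosting platform. As one hedged example, on an Apache server with mod_rewrite enabled, a rule like the following in your .htaccess file sends every HTTP request to its HTTPS equivalent with a permanent 301; nginx and most managed hosts have their own equivalent setting.

```apache
# Permanently (301) redirect all HTTP traffic to HTTPS.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```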
You can check this manually by visiting all of your pages through organic navigation – i.e. clicking from the homepage through to each linked page – or by using the trusty Lighthouse plugin.
Go to ‘options’, and instead of ‘SEO’, make sure that the ‘best practices’ box is checked. Run your report, and on the results page, check that under ‘passed audits’ the report has ‘uses HTTPS’.
Does Your Site Have Malicious or Negative Backlinks?
If your site is being linked to by dangerous, untrustworthy, or generally dubious sites, then you could be penalized by the perceived association. The Google algorithm is designed to recognize these types of links and take them out of the equation. Yet, on the off chance that one or more have been missed, you will need to manually tell Google that you are not looking to be affiliated with that type of site.
You can check for these in Search Console by opening the 'Links' report from the navigation panel, then clicking 'More' under 'Top linking sites'. This shows the sites currently linking to your pages, and you can scan the list for anything out of the ordinary. If you find links you want nothing to do with, use Google's separate Disavow Links tool to upload a plain text file listing the offending domains or URLs, making it clear that you do not want them counted.
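The disavow file itself is plain text, with one domain or URL per line. The entries below are placeholders, purely to show the format:

```
# Lines starting with # are comments.
domain:spammy-directory.example
https://low-quality-blog.example/page-linking-to-you
```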
Is Your Structured Data Working Properly?
Structured data, also known as schema markup, is an advanced part of the digital marketer’s arsenal. At present, it does not affect your search engine ranking position, but it does make it easier for the search engines to recognize different content on your site and improves the way that your site is presented in the results.
When you conduct a search on Google, you get results in a list form. Standard results show the title in blue, with a brief description beneath in black. This is what we all picture when we think of a search results page. Rich results, however, give you more – so if you search for a recently released film, for example, you might get an image or a video clip of the trailer, your local cinema showing times, and reviews. The same applies to products. Right there on the result page, you will see star reviews of either the competing products that you searched for or local retailers that carry them.
Getting those rich results all comes down to structured data. While your standard ranking will not be changed, the way your site is displayed in the ranking will be much improved.
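Structured data is most commonly added as a JSON-LD script in the page's HTML. As a hedged illustration (the product name and figures are invented placeholders), markup for a product with review stars might look like this; you can then confirm it with the validation tools described next.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "214"
  }
}
</script>
```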
You can use the (free) tools from Google to check that your structured data is working as it should. The Rich Results Test is great for seeing what structured data on your site can be used for the generation of rich results on the Google search page. The Schema Markup Validator will let you assess all of the schema on your site and check that it can be interpreted correctly by the web crawlers.
Conducting these checks should be part of your technical audit purely so that you know that all of your schema is working as it should and that your content is being correctly recognized. You will also know that you are giving searchers the best possible introduction to your site pre-click.
How Often Should You Audit Your Site?
Google and other search engines update their search engine algorithms frequently which has an immediate effect on all aspects of SEO, including the technicals. You should complete a full technical SEO audit at least twice a year, ideally once a quarter. Smaller, incremental, or more basic audits can be conducted ad hoc, with a quick check completed once every month or so.
This will ensure that not only is your site still performing as it should but that you are adapting your technical foundation in response to search engine changes.
You should be aware, however, that while our checklist is a good starting point at the time of writing, any substantial changes to the way that web crawler robots and the underlying algorithms work could easily make the list incomplete.
This means that it is important that you stay abreast of the updates and evolutions of the search engines. Failing to do so could mean that your competitors get a foothold higher up in the results than you, and you lose visitors and growth traction.
Crawl Your Own Site
There is also a range of tools available to help you crawl your own website, giving you the kind of information that the search engine crawlers are getting. This can give you a good overview of the main issues with your site, as well as (in some cases) provide you with advice on how to resolve them.
Searching for web crawlers will give you a wealth of options, both paid and free. You should be able to find a tool that suits your technical abilities fairly easily. Some are overtly code-heavy, but there are many more user-friendly options if you are not super tech-savvy.
When you have crawled your site, and the results have been delivered, you can then use that information to start checking off items from the list. Then you can re-crawl and make sure that your fixes have been effective.
Final Words
At Digital Authority Partners we have the in-house skills, experience, and knowledge to help you fully understand and optimize the technical side of your site. Our team of experts will conduct a thorough technical SEO audit for you, before creating a plan that gets your site running smoothly and ranking highly. This includes everything from load speeds and mobile performance to making sure that search engines can easily access and correctly index every element of your site.
Want To Meet Our Expert Team?
Book a meeting directly here