Technical SEO provides the backbone of any high-performing website. Just as you would lay the foundations of a house before painting the walls, having the correct foundation for your site to be built on is key to sustaining and growing your organic performance.
Just like a house, it also requires regular maintenance and upkeep to stay in the best condition. Technical auditing is a large part of this, but that doesn’t have to mean putting your site through a rigorous technical auditing process every week. Instead, build a plan that allows you to stay on top of the technical health of your website while still finding time to write that perfect blog post and pick the perfect shade of white for your walls.
This plan is a three-pronged approach that allows you to stay on top of the daily technical issues that may crop up on your site whilst also planning for the long term. By breaking up the ways we look at the technical health of a website over a 6-month period, and even automating some of them, it becomes much easier for us to implement truly great technical SEO.
Step 1: Weekly Technical Checks
The easiest way to get into regular technical auditing of the sites that you manage is to carry out weekly technical checks. These are less about in-depth analysis and more about making sure nothing has broken, while continuing to implement the changes that came out of your last full technical audit. This will allow you to keep on top of the technical health of your or your client’s website without having to perform an in-depth technical audit every week. There are three key ways to stay aware of what is happening on your site:
1. Analytics Reports
Create a technical SEO analytics report that gets sent to you on a Monday morning. This will allow you to see what has happened on the site over the past week and investigate anything that sticks out as alarming. Some of the key things to look for include:
- Average load time based on a range of factors:
- Domain lookup time
- Server connection time
- Content load time
- Bounce rate
- Differences in performance metrics across browsers – this can indicate that your website is not rendering properly in some browsers.
A good pre-built report that looks at many of these variables is this report from the analytics solutions gallery.
2. Custom Alerts in Analytics
Custom alerts in Google Analytics are one of the most powerful ways to keep track of what is happening with your site(s). These are incredibly easy to set up and allow you to get a near instant alert if any of your key metrics change significantly. Set these up to look at drops both over a week and a day which will allow you to keep on top of any major issues on your site, without having to spend hours a week digging for them. These are set up within the customisation tab in Google Analytics.
Here are some of the best:
- Traffic drops and increases: If your organic traffic drops or increases suddenly, it may mean that tracking is broken or that you have lost or gained keyword rankings.
- Conversion rate drops and increases: This alert must be set up with goals in mind, but if the conversion rate of one of your key goals is dropping, it most likely correlates with issues on your site.
- Changes in bounce rate: A sudden change implies that something has happened to the page itself and there is a risk it may lose its rankings.
- No transactions or a significant decrease in transactions: This implies that your e-commerce tracking has broken or that something has had a big impact on visits to your site or your conversion rate.
- Change in visits to your 404 page: If there has been a massive increase in visits to your 404 page, someone or something has caused a page to be removed.
NB: make sure that you use the correct title for your 404 page when setting up this alert.
3. Automated Email Alerts
Many of the large SEO solution providers can (or do) send you breakdowns of what is happening on your site on a weekly basis. This allows you to stay on top of any specifically SEO-related issues that may be affecting your site at that moment in time and prioritize fixing them. We use SEMrush, so I can only speak for the kinds of alerts they provide, but the most useful ones are:
- Site Audit Updates: These reports provide a basic technical analysis of your website, helping you to identify any major or minor problems that have cropped up in the last week. This allows you to prioritize the work coming out of your 6-monthly technical audits.
- Backlink Audit Updates: This report keeps an eye on your backlink profile, alerting you to any new links, good or bad, that you may have gained in the last week. This allows you to assess what is coming in on a weekly basis and decide what to do about it. For a good link, this may mean reaching out to thank the site owner and seeing if they would be interested in working with you in the future. For a truly bad link, this may require a look at the disavow file.
Step 2: Monthly Health Checks
On a monthly basis, it’s a good idea to run a site crawl using a tool like Screaming Frog. This will pick up on key issues that may have slipped through the cracks during your weekly checks. Some of the key things to look out for include:
XML sitemap
Your XML sitemap exists as documentation of what you care about on your website. Whilst Google will still crawl pages that are not in your sitemap, making sure that it is always up to date, contains the URLs that you want indexed and is error free sends a strong signal to Google.
Crawl the URLs in your sitemap once a month to check that they are error free, then check that any new pages or sections added to your site in the last month are included. Then resubmit the sitemap in Search Console. By being consistent in your approach to your sitemap, you can reclaim some control over how Google sees your site.
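If you'd rather script this check than run it by hand, the sitemap parsing part is straightforward. Here's a minimal sketch using only Python's standard library (the sample sitemap and URLs are invented for illustration):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, in ElementTree's {uri}tag notation.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> URL from an XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Invented sample sitemap for illustration.
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""

# Each URL can then be requested (e.g. with urllib.request) and any
# non-200 response flagged before resubmitting the sitemap in Search Console.
urls = sitemap_urls(sample)
```

From here, comparing the extracted list against last month's crawl will also surface new pages that still need adding to the sitemap.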
Broken internal and external links
During the month, new things are added to a website, and older pages are often moved or removed. This can result in broken internal links, which are a nightmare for both users and bots. A simple way to stay on top of this is to export all the internal links from your website and sort them by status code (Screaming Frog is a perfect tool for this). This will show you every internal link on your site that currently results in an error and should make them easy to fix. The same can be done with external links.
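The sorting step is easy to reproduce outside of a crawler too. Here's a small sketch (the example crawl data is invented) that groups a link export by status code so the errors surface together:

```python
from collections import defaultdict

def links_by_status(link_checks):
    """Group (source_page, link_url, status_code) tuples by status code,
    so error responses (4xx/5xx) are easy to review in one place."""
    grouped = defaultdict(list)
    for source, url, status in link_checks:
        grouped[status].append((source, url))
    return dict(grouped)

# Invented example export, as you might get from a crawler:
checks = [
    ("/blog/", "/old-page/", 404),
    ("/", "/about/", 200),
    ("/blog/", "https://partner.example.com/", 500),
]

# Keep only the error responses worth fixing this month.
broken = {s: links for s, links in links_by_status(checks).items() if s >= 400}
```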
Response code errors
Whilst you should be able to stay on top of this through your weekly checks, Screaming Frog will often pick up response code errors that the weekly checks won’t. Pages that return a response other than a 200 (or a deliberate 301) can be bad for SEO, so it’s important that you stay on top of any errors that crop up here.
Robots.txt
It’s also a good idea to assess what is being blocked by your robots.txt file. Knowing what is in this file and keeping it up to date gives you some control over what is crawled by search engine bots. Disallowing pages that are never meant to be organic landing pages is a great start.
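Python's standard library can help here too: `urllib.robotparser` reads a robots.txt file and tells you whether a given URL is blocked. A minimal sketch, using an invented robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Invented robots.txt for illustration.
robots_txt = """User-agent: *
Disallow: /cart/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# can_fetch() returns True when the URL is allowed for that user agent.
cart_allowed = rp.can_fetch("Googlebot", "https://example.com/cart/checkout")
blog_allowed = rp.can_fetch("Googlebot", "https://example.com/blog/post")
```

Running your key landing pages through a check like this each month quickly confirms that nothing important has been accidentally disallowed.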
Images
Images are another area that is easily neglected and can become a much larger issue over time. Checking that the correct protocol has been followed for images on the sites you are looking after should happen on at least a monthly basis. This can be done through the Images tab in Screaming Frog.
Potential issues with your images include:
- Too large – if the image files on your site are not optimised, this can really impact the load speed of the pages containing them
- Missing alt text – alt text is important for both SEO and the end user. It tells Google and those using a screen reader what an image contains, and can help your images appear in image search results
- Missing descriptions – whilst alt text is important from an accessibility perspective, the image description displayed on the page is another important element and an opportunity to optimise for target keywords
Unless you’re managing a site that adds hundreds of images a week, then this should be relatively easy to manage and maintain.
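If you want to spot-check a page outside of a crawler, a few lines of Python can flag images with missing or empty alt text. A minimal sketch using the standard library's `html.parser` (the sample HTML is invented):

```python
from html.parser import HTMLParser

class ImageAltChecker(HTMLParser):
    """Collect the src of every <img> whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))

# Invented sample page for illustration.
page = """
<img src="/img/hero.jpg" alt="A newly painted living room">
<img src="/img/logo.png">
<img src="/img/banner.jpg" alt="">
"""
checker = ImageAltChecker()
checker.feed(page)
# checker.missing_alt now lists the images that need alt text written.
```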
Noindex and nofollow tags
There should always be a reason these tags are added, but they are often added by accident by someone who doesn’t understand their purpose. This can result in key pages of your site not being indexed, or link equity not being passed through your site as intended. This has a direct impact on your rankings across the board. Checking that none of these tags have been added accidentally will allow you to rank for the keywords you want to rank for.
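A quick way to check a page for these tags is to look for the robots meta tag in its HTML. A minimal Python sketch (the sample HTML is invented for illustration):

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Record the directives found in any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            self.directives |= {d.strip().lower() for d in content.split(",")}

# Invented example page that has been accidentally noindexed.
page = '<head><meta name="robots" content="noindex, nofollow"></head>'
checker = RobotsMetaChecker()
checker.feed(page)

# Any overlap here on a page you want ranked is worth investigating.
flagged = checker.directives & {"noindex", "nofollow"}
```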
Canonical tags
Incorrect implementation of canonicals can have a massive impact on your site, with the potential for incorrect content or duplicate content appearing in the search results. Generally, this can be avoided with a strict use of self-referring canonicals, changed to non-self-referring only when necessary. Sometimes, however, this can slip through the cracks, so it’s worth checking that everything is as it should be once a month. This can be done easily in Screaming Frog by navigating to the canonicals section in the Directives tab.
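The self-referring check itself is simple to script: fetch the page, read its canonical tag, and compare it to the page's own URL. A minimal sketch using Python's standard library (the example URLs and HTML are invented):

```python
from html.parser import HTMLParser

class CanonicalChecker(HTMLParser):
    """Capture the href of the first <link rel="canonical"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

def is_self_referring(page_url, html_text):
    """True when the page's canonical points back at the page itself."""
    checker = CanonicalChecker()
    checker.feed(html_text)
    return checker.canonical == page_url

# Invented example page.
html_text = '<head><link rel="canonical" href="https://example.com/blog/"></head>'
ok = is_self_referring("https://example.com/blog/", html_text)
not_ok = is_self_referring("https://example.com/blog/?page=2", html_text)
```

Note that a non-self-referring canonical (like the parameterised URL above) is not automatically wrong; it just deserves a deliberate decision rather than an accidental one.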
Step 3: Broad Technical Auditing Every 6 Months
The final step is the biggest and can be done right at the start of the technical auditing process to help inform your weekly and monthly checks.
Conducting a broad technical audit every 6 months ensures that you pick up any hidden issues on the site that may have been overlooked since you last checked in. This type of technical audit also enables you to start looking at new technical optimisations you want to implement in the coming 6 months.
As the SEO world is always growing and evolving, it is important to be continuously on the lookout for new ways to innovate and differentiate. The key areas to investigate during your biannual technical audit are the bigger issues that may be affecting your website. You can then implement a plan over the next 6 months to fix these.
With the importance of mobile only growing, making sure that your website is up to date and running to its full potential should be one of the first areas you look at during your technical audit. This can range from using the Fetch and Render tool in Search Console to check that your site displays correctly on mobile, to considering whether the site should implement AMP.
Speed is another key area of technical SEO health. Making sure that your site loads faster than your direct competitors’ can have a tangible effect on your organic rankings and also improve your site’s conversion rate.
When looking to understand how you or your client is performing on site speed, it’s good to look at a couple of different areas. First, run the site through a few free tools, such as Google’s PageSpeed Insights, GTmetrix or Pingdom. These will give you a breakdown of how long your site takes to load and what is causing any delays. You can use just one, but running all three gives you a clearer picture of the main issues affecting the site. You can then prioritize these issues based on which will have the most impact and which will be easiest to fix, and work through them on a weekly and monthly basis.
Auditing your schema markup is a key way to ensure that your site remains technically relevant and achieves as much visibility as possible. Changes to your web pages can leave the markup containing errors, causing rich results to disappear from the search results. No one wants that, so checking that your implementations are still correct is key. Google also recognizes new schema types from time to time, so it’s worth making sure everything is up to date.
Google’s Structured Data Testing Tool is a great way of checking the markup on individual pages. You can also paste in a snippet of code that has not yet been deployed to make sure it is accurate before it is pushed live.
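If you use JSON-LD for your markup, you can also script a basic sanity check before pages go live. Here's a minimal sketch that extracts the JSON-LD blocks from a page and confirms they at least parse as valid JSON (the sample markup is invented; a full validation would still go through the testing tool):

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collect and parse <script type="application/ld+json"> blocks."""
    def __init__(self):
        super().__init__()
        self._in_ld = False
        self._buffer = []
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_ld = True

    def handle_data(self, data):
        if self._in_ld:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if tag == "script" and self._in_ld:
            # json.loads raises an error here if the markup is malformed.
            self.blocks.append(json.loads("".join(self._buffer)))
            self._buffer = []
            self._in_ld = False

# Invented example markup.
page = """<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Article", "headline": "Technical SEO"}
</script>"""
extractor = JsonLdExtractor()
extractor.feed(page)
```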
Log file analysis can take a lot of time, but it is worth doing every few months, mainly because log files are the only fully accurate record of how search engines are crawling a site. Log file analysis will help you understand whether your crawl budget is being spent where you want it to be. It will also highlight areas of your site that aren’t being crawled thoroughly and help you identify why. By doing this as part of a biannual technical audit, you should be able to influence how and when bots crawl your site.
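As a starting point, even a short script can answer the basic crawl-budget question of which paths Googlebot requests most. A minimal sketch assuming combined-format access logs (the regex and sample lines are illustrative; adjust both to your server's actual log format, and note that serious analysis should also verify Googlebot by IP):

```python
import re
from collections import Counter

# Rough pattern for Apache/Nginx combined-format log lines (an assumption --
# adapt it to your server's configured format).
LOG_LINE = re.compile(
    r'"(?P<method>\w+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(log_lines):
    """Count how often a Googlebot user agent requested each path."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("path")] += 1
    return hits

# Invented sample log lines for illustration.
sample = [
    '66.249.66.1 - - [01/Jan/2019:10:00:00 +0000] "GET /blog/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [01/Jan/2019:10:00:05 +0000] "GET /cart/ HTTP/1.1" 200 900 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [01/Jan/2019:10:00:07 +0000] "GET /blog/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
crawl_counts = googlebot_hits(sample)
```

Comparing these counts against the pages you actually want crawled is the quickest way to spot crawl budget being wasted on low-value sections like the cart.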
A technically sound site can make or break your SEO efforts. At the end of the day, Google will reward you handsomely for staying on top of this key SEO area. By remaining consistent, you can expect consistent rewards without having to devote your whole working life to technical SEO.