SEO Audit Guide: Performing Better With Technical Search Optimization
SEO Level: Intermediate. A community post submission.
It’s well understood that search engine optimization is important. The idea of “technical SEO” gets thrown around a lot in discussions about optimization and its benefits, but few people take enough time on the technical end of things.
Even if you’ve spent considerable time researching and conducting optimization, you might not know that much about what technical SEO is. It seems like the term “technical” is a barrier to entry for a lot of people, but it’s important to understand that anyone can grow their understanding and improve their results.
We’re going to run through all of the things that you should have on your technical SEO audit today. We’ll explore important technical terms, why they matter, and how to implement them into your strategy.
Hopefully, the information below will get you on track toward the results you’d like to see. Let’s get started.
What Makes a Technical SEO Audit?
Let’s specify the type of search engine optimization that’s involved in technical SEO.
Technical SEO tends to exclude most of the factors that one focuses on when doing on-page optimization. For example, things like your writing style, word count, images, and social media campaigns aren’t factored in directly.
There’s overlap between specific parts of those factors and technical SEO factors, though. Most aspects of on-page SEO have corresponding off-page factors included in your technical SEO audit.
The things we’ll discuss today exist mostly off-screen and have more to do with the way your site functions: your website code, HTML elements, user experience, site speed, and more.
It’s helpful to create a document that lists these site elements. If you’re working through today’s list and managing your site as you read, try making a spreadsheet of the ranking factors below and jotting down notes as you go along.
Without a way to organize your thoughts and ideas for your site, it’s difficult to keep everything in view and actionable.
Your technical site audit is a look at the factors below to ensure that everything is working as well as it should be. After you go through everything, you’ll have a clear idea of where you can improve and why things aren’t working. If things are working, you’ll have a good idea of what’s making that happen as well.
Once you’ve got your spreadsheet ready for notes, let’s look at the essential points of a technical SEO audit. It can help to use additional tools like Sitechecker to give you some further insights and tools on how to make the required changes.
1. Crawling and Visibility
The central idea of optimization is to rank your website and all of its various pages in the appropriate searches. The foundational point of that process is the knowledge that your site is visible to the search engine.
Without search engine visibility, you have no chance of visibility to the customer.
While it may seem like Google’s AI has infinite wisdom, it might still have trouble crawling and indexing your pages. In other words, it’s possible for Google to index your website in ways that are disadvantageous to you.
This is more true if you have a lot of content or your important content is spread apart and linked poorly. Further, if you’ve got a new site without many external links, Google might not crawl you right away.
The good thing is, there are ways to improve your visibility. Start the process by looking at the Google Search Console.
Google Search Console
Search console is a feature provided by Google that helps you understand how Google sees your site. It’s also a tool that gives you direct insights when Google picks up errors or weak spots on your site.
It’s a free tool that gives you some of the most direct and important insights into your performance on Google. Note that you don’t need Search Console to rank well or find any position in the search results. It’s just a useful tool that gives you eyes on things that you would otherwise need a third-party service to see.
It’s a lot easier to work through your technical SEO audit on your own with the help of the Search Console. The same is true for the long-term management of your site.
Search Console also offers you some insights into keyword research, engagement rates on your site, mixed content issues, and more. Arguably, the most important thing that Search Console offers is the ability to see exactly how Google crawls your pages.
One important tool here is the ability to see your indexation status. This shows you how many of your pages Google is indexing over time. It’s a way to spot indexation issues and notice trends.
For example, if you’re producing a lot of content but not getting indexed, there’s an issue to troubleshoot.
Note that those targeting other search engines like Bing have access to similar resources. For example, Bing Webmaster Tools is an excellent platform that provides comparable features.
A Note on “Crawling”
If you’re unfamiliar with the term “crawling,” it refers to the way in which Google finds and indexes your website’s pages. Google’s bots crawl websites through the network of links among different websites.
Think of a massive house with billions of rooms that are all connected by doors. Each room is a different page on the internet, and each door is a link to that page. The rooms that have more doors are easier to find.
The angle at which you enter the room affects the impression you get as well. The same is true for Google bots. If you have a complex site, Google could rank it differently depending on which pages have the most links, how the site is organized, and more.
Through the link, bots land on your site and document what they find there. This is when all of the myriad ranking factors get documented and indexed for the search engine algorithm to use.
The process seems pretty simple, but it’s important to note that there are different ways your site might be interpreted. Numerous factors come into play, and managing those factors through Search Console is extremely useful.
Other Search Console Features
So, Google Search Console helps you confirm that your site is visible and crawlable. It also gives you the ability to request a re-indexing of your website or particular pages.
You need to request a reindexing if something drastic happens on your page. For example, maybe you got flagged as spam. Maybe you have a time-sensitive post that has to get listed as soon as possible or else it will be worthless.
Search Console gives you updates when these things happen as well. You don’t get those direct updates if you don’t have the Search Console. You just notice that your results have dropped and don’t know why.
You could also get penalized for various aspects of your site that aren’t up to Google’s best practices. In any case, reindexing allows you to get back into your rightful place in the search results a lot faster than you would if you let Google crawl normally.
The crawling cycle takes somewhere around two weeks to go full-circle. Having a way to say “Hey Google, I’m updated” is very helpful when you don’t have two weeks to spare.
Further, the platform gives you insight into your mobile optimization and AMP metrics. You’re also able to look at things like a rich results test.
A rich results test lets you know which content on your page might be used in the Google results and go beyond a normal link. For example, think about search results that produce images and other data that aren’t text. These factors do a lot to improve user engagement, especially in local results.
Robots.txt Files
Robots.txt files are a way to let different crawlers know which pages you’d like to have crawled. In a lot of cases, you’re letting a search engine know that you wouldn’t like a particular page to get crawled.
It might also be a way to streamline the pages you want indexed to manage a particular SEO goal that you have. For example, maybe there are a few pages with duplicate titles on your site, and you want to block one of them to free up space for the other.
It’s important to note that while these files work a lot of the time, there are instances where they don’t. For example, different search engines read the syntax differently. Further, some search engine crawlers won’t even read the robots.txt file.
Your page could still get indexed if it’s linked to by other pages on the web as well.
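As a sketch, a robots.txt file sits at the root of your domain and might look like this (the paths here are hypothetical examples):

```text
# Applies to all crawlers
User-agent: *
# Keep crawlers out of two example sections
Disallow: /admin/
Disallow: /drafts/

# Point crawlers at the sitemap as well
Sitemap: https://example.com/sitemap.xml
```

Note that Disallow only asks crawlers not to fetch a page; as mentioned above, it doesn’t guarantee the page stays out of the index.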
2. Using Sitemaps
A sitemap, often called an “XML sitemap,” is more or less a streamlined way for Google to understand and index your pages. The map gives a tidy list of your website URLs to get crawled and indexed.
To understand why this is important, think about your website and all of its pages as a flow chart. There’s a source point that breaks off into two separate categories, both of those turning into three more categories, and the process continues.
By the end of the chart, you have fifty different items belonging to fifteen different categories. Your website might look a lot like that, especially if you’ve been using it for a long time. It’s hard for Google to get to all of those pages, especially if they’re new or they have no backlinks.
The sitemap lists all of your pages in one place, so crawlers don’t have to work through dozens of links. When you submit a sitemap, Google indexes those pages a lot quicker than it would otherwise.
There’s also more certainty that your pages get seen.
Another way to supplement crawlers is through schema markup. This is structured data code that helps crawlers understand your content more quickly and can feed rich results.
Websites That Benefit Most from Sitemaps
Sites with a few pages get indexed pretty quickly regardless of whether they have a sitemap or not. If your pages stay the same and you only have eight or nine pages in total, you’re likely already indexed in Google.
In those instances, you don’t have to submit sitemaps unless something significant changes.
Websites with a lot of moving parts do need to submit maps on a regular basis, though. If your website has a deep structure and consists of thousands of pages, a map is essential if you want to get indexed. The same goes for pages that have updates and change on a regular basis.
News websites, for example, should submit new sitemaps when they modify pieces of content. The same goes for businesses and sites that post on a daily basis. Google won’t rank you the same day on its own, and a lot of that content is time-sensitive.
You may benefit from site maps if you have a new website that’s small and stagnant as well. New sites tend to have weak linking structures, no backlinks, and hardly any domain authority to speak of. That means it’ll take Google a while to find you because you’re not situated in a network of links yet.
Submit a sitemap and you’ll circumvent that process by giving your information straight to Google.
What Are Noindex URLs?
Marking URLs as “noindex” is important if you have a lot of dead or unimportant content on your site. Say, for example, that your site has users that sign up, create profiles, and post things.
Also, say that 40 percent of those people don’t ever end up posting, but their user profile holds its own URL. Including all of those profiles in your sitemap could send the wrong message to Google.
If Google sees that the majority of your site consists of pages that don’t have meaningful content, you’ll lose domain authority. The power of your important pages diminishes, and you start to lose the success you’ve earned.
You might see that there’s more traffic going to your site at large, but whether or not that traffic does anything for your business is a different story. You can add a “noindex” robots meta tag to those pages (and leave them out of your sitemap) to prevent Google from indexing them.
That way, your website will get indexed in the fashion that you want it to. There won’t be any dead pages bogging your search results down.
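The noindex directive itself is just a robots meta tag in the page’s head. A minimal sketch:

```html
<head>
  <!-- Tells compliant crawlers not to index this page -->
  <meta name="robots" content="noindex">
</head>
```

The same directive can also be sent as an X-Robots-Tag HTTP header for non-HTML resources like PDFs.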
You might do the same for old pieces of content that conflict with your new content. Say, for example, that you have an active blog and you post multiple pieces each week.
Some of those posts rank for similar keywords and ideas. In other words, they compete with each other, diminishing the value of one another. When you take out the old post, you free up space for your new one to rank well and succeed.
You can use tools like Ahrefs to get insights into the kind of keywords you should optimize for as well as the competition you might be giving yourself.
Sitemap Format and Tags
It’s one thing to understand what a sitemap is, and another to apply yourself and manage one. Shake off the dust a little bit and accept that you can understand how technical web issues work as well as the different nuances that go along with them.
That said, take a look at this example of a single-page XML sitemap:
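A minimal single-URL sitemap along those lines (the date and values below are illustrative) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://productiveshop.com/</loc>
    <lastmod>2022-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```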
That’s the map for a single page of a website. It might look a little scary at first, but you’ll find that it’s not that complex. Notice that each line follows a pattern.
There’s an indicator (<loc>) followed by a piece of information (https://productiveshop.com/), followed by the closing indicator (</loc>). Note that the closing indicator is the same as the original, except that it has a forward slash in front of the word.
Understanding the Four Tags
The primary tags for you to understand are displayed above. The first line is the location tag, signified by the “<loc>” text. This is where you signify the URL location of the site.
Make sure that you list the exact, current URL here.
The next line is the “last modified” tag. This tag isn’t required, but it helps out a lot. Google cares about this factor even though it’s not essential for indexing.
The last modified tag is written as <lastmod>, and the date follows it. If necessary, you can also add the specific time and time zone to this piece. A full date and time last mod would look something like this:
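With placeholder values for the date, time, and UTC offset, that looks like:

```xml
<lastmod>2022-01-15T14:30:00+02:00</lastmod>
```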
After the date, insert a “T” to signify the specific time. Then, add the updated time in 24-hour “military time.” To add a time zone, enter a plus or minus followed by the number of hours you are offset from UTC.
If you’re confused about that last bit, take a look at a breakdown of UTC time offsets.
The next tag is the “change frequency” tag. This is another optional tag, and it signifies how often you plan to update that content.
It’s signified by <changefreq>, and you enter daily, weekly, or monthly, followed by </changefreq>. Professionals at Google have noted that this factor isn’t that important for the search rankings.
Finally, another optional tag is the “priority” tag. It allows you to rank the importance of the page relative to the other pages on your website. The scale goes from 0.0 as the least important, all the way up to 1.0.
Image, Video, and Mobile Sitemaps
Note that there are also sitemap possibilities for your images, videos, and mobile pages.
The consensus on these is that they don’t do that much for your site unless those factors are essential to your business. So, for example, if your website thrives off of a myriad of videos posted all of the time, go ahead and use a video sitemap.
If your website is geared for phones and mobile devices only, use a mobile sitemap. The thing is, all of that information gets indexed by Google in the normal process, so it’s not necessary to spend the time working through those maps.
If you’re not a fan of all of the coding and tagging required in the process of making a sitemap, don’t worry too much. There are a lot of tools and platforms online that do all of that work for you.
It’s important to understand the nature of the sitemap so you can use it to your advantage, though. When it comes to understanding and creating them, there are a lot of options for you to work with.
You could even consider bringing in an SEO professional as an extension of your in-house marketing strategy. If that’s not in the cards, you can work with a dynamic XML sitemap platform that updates your maps in Google Search Console as you make changes.
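Under the hood, a dynamic sitemap generator is doing something like the following sketch. The page list here is hypothetical; a real platform would pull URLs and last-modified dates from your CMS or database.

```python
# Sketch of dynamic sitemap generation from a list of (url, lastmod) pairs.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(pages):
    """Build an XML sitemap string from (url, lastmod) pairs."""
    # Register the sitemap namespace as the default, so tags serialize
    # as <urlset>, <url>, <loc> rather than with a prefix.
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}loc").text = url
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")


# Hypothetical page list; a real site would generate this dynamically.
sitemap = build_sitemap([
    ("https://example.com/", "2022-01-15"),
    ("https://example.com/blog/", "2022-01-14"),
])
print(sitemap)
```

The resulting string would be served at /sitemap.xml and resubmitted (or re-fetched by Google) whenever the underlying pages change.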
3. 404 Errors And Solutions
404 errors are instances where a page isn’t found: it’s not reachable through the normal URL that you’ve assigned to it. This is unfortunate, because there’s often a lot of time and effort invested in a particular URL, and you’d be starting from scratch if you just created a new page or noindexed the old one.
These are common issues that might seem like the end of the world when they come up. The key reasons that 404 errors occur are that a page doesn’t exist anymore, or the link to the page is dead.
Dead links lead to 404 errors, as they fail to send the user over to the target page for one reason or another. Maybe the URL of the site has altered slightly, causing the link not to work, or there could be a number of other issues that cause a dead link.
These issues do some damage to your SEO, and they reduce the integrity of your site significantly. Not only does Google knock you down in the results, but users have a more difficult time navigating your site when they do arrive.
The beautiful thing is that you can repair links by just inserting the correct URL into the anchor text. In most cases, links die because URLs change, so the fix is simply to update the URL.
It’s difficult to find 404 errors if you don’t know how to look, though, and that’s the trouble. Fortunately, Google Search Console gives us a little help in this regard.
Finding 404 Errors
Go to Search Console, and look at the menu on the left-hand side of the page. There’s a series of drop-down options like “Dashboard,” “Messages,” “Search Appearance,” and “Crawl.”
Select “Crawl,” then click on “Crawl Errors.”
Once you get to that page, click on the tab that says “not found.” This should produce a list of any pages on your site that aren’t showing up through the links that Google has come in contact with. Google documents every occurrence and compiles those instances into the “not found” list.
To find the dead links, click on the tab that says “linked from.” There, you’ll see all of the problem areas. Don’t get overwhelmed if you see a large number of links in this list. When there’s a significant number on the list, it typically means that there’s a single dead link somewhere in your menu bar.
It could be the menu bar or any other site area that includes site-wide links. You might just have to fix one link and see that everything falls back in order.
Another way to streamline the process is Screaming Frog. This tool crawls your site and gives you a look at any crawling or linking issues that might harm you.
Custom 404 Page
404 errors slip through the cracks, and there’s little we can do to prevent them. Aside from managing your URLs very meticulously, the best you can do is address errors when they come up and keep on moving.
That said, the standard 404 error page that pops up is a particularly unattractive one. It’s a white page telling the user that your site isn’t up to snuff. It looks bad, and it could damage your brand.
A standard 404 error page is like a big sign that says “don’t trust this website, it’s too old.” Even though that’s probably not the case, it still appears that way to many users.
One thing you can do to improve your appearance is to create a custom 404 page. You have the ability to create a well-branded, friendly 404 page that lets the user know that the page can’t be found. Then, you have the opportunity to direct them back to your site in a way that retains the trust you have.
It’s something that seems trivial, but it does a lot to smooth over the user experience when there are 404 issues at hand. You can do anything you want with the 404 page, too, so there’s nothing stopping you from turning it into some kind of lead generation.
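On an Apache server, for instance, pointing the server at your custom page is a one-line change (the file path here is an example):

```apache
# In .htaccess: serve this branded page for any 404 response
ErrorDocument 404 /custom-404.html
```

Most hosting platforms and CMSs expose an equivalent setting without requiring you to edit the file directly.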
SEO Benefits of Custom 404 Pages
When Google crawls your site and it runs into a 404 hiccup, the standard 404 page doesn’t allow Google to crawl any further. The bot gets stuck because there aren’t any links back to your page. This puts a big wrench in your well-oiled and crawlable website.
A custom 404 page provides numerous links back to your own website, allowing Google to keep crawling.
4. 301, 302, and 500 Errors
All of the 3xx status codes have something to do with what is called a redirect.
For a little perspective, it might help to know what status codes are in general. Whenever someone visits a site, the web server receives the request and responds with HTTP headers that include a particular HTTP status code.
The status code tends to stay hidden from the user’s sight so long as everything goes the way it should. When there’s an error, you’re directed to a page that displays the status code. Hence the 404 error page you’ve likely seen before.
A 301 is a permanent redirect: the original URL is supposed to send the user to a new URL. Say, for example, that you want to replace your content with a bright and shiny new webpage. You’ve got a solid amount of user traffic to your existing page, but you would benefit a lot from a new and improved page.
301 redirects allow you to utilize the previous URL so that you don’t have to start from scratch. Any user who stumbles upon the URL to your original page is redirected to the new page.
Issues come up when that page is improperly linked or there’s some miscommunication that prevents the redirect from happening.
Fixing and Troubleshooting 301 Issues
If you find that there’s an issue with a 301 error, the best place to start is in your web hosting platform. Many people use WordPress and Apache, and those interfaces make it easy to adjust your redirect information.
Generally speaking, though, all web-hosting platforms should give you a simple way to manage different requests like redirects and more. Go to the place where you initiated the request for redirect in the first place.
In Apache’s case, that means selecting “Hosting,” then going to the .htaccess tab. There, you can examine whether or not the URLs in your redirect rules are up to date.
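As a sketch, a permanent redirect rule in .htaccess looks like this (both paths are examples):

```apache
# Permanently send the old URL to its replacement
Redirect 301 /old-page/ https://example.com/new-page/
```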
Another thing to note is that redirect chains occur involuntarily in a lot of cases. A redirect chain happens when there are multiple redirects between the requested URL and the final destination.
As you scale your website, you could add more and more redirects that you forget about. It’s important to look through the whole chain and identify errors if you’ve got a redirect chain that’s producing 301 errors.
302 redirects are those that indicate a redirect that won’t last forever. It’s a temporary shift of URLs while something gets taken care of.
Maybe you have some serious adjusting to do on your site and you don’t want users to see your page while you take care of it. You might also find that there’s explicit material on your site that you weren’t aware of, so you set up a redirect while you make changes.
One nice thing about the 302 redirect is that it doesn’t affect the Google standing of the original page. There’s no change to the index, and your SEO campaign can keep working like normal.
302 errors tend to come from errors in the linking setup. If you find that there’s a 302 error, go into your web hosting platform and check that the URLs you used were correct.
It might help to reenter both links and resubmit the 302 redirect.
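In .htaccess, the temporary rule looks just like a permanent one, with the status code swapped (paths are examples):

```apache
# Temporarily send visitors to a holding page
Redirect 302 /summer-sale/ https://example.com/holding-page/
```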
500 errors, also known as internal server errors, indicate that there’s something wrong with your server. This is a big deal, and it’s important to call your web hosting provider and get some professional help if it happens.
The issue in question could be a small one that’s easily fixed, but you have to make sure. In most cases, though, you need the customer service team to help diagnose the problem, because 500 errors aren’t specific about what the actual issue is.
Managing Redirects Yourself
It’s important to mention that redirects can have a negative impact on your SEO if you don’t handle them correctly. When you plan to make these changes yourself, make sure that you’ve done your research and you know how to use your particular hosting platform.
The last thing you want is significant damage to your site as a result of a poorly-managed redirect. It’s also important to keep checking on things after you put the redirect in place.
If there’s an error that you aren’t aware of, you will lose a lot of web traffic. The best bet is to work with a professional to see how the process works. They can manage your back-end processes for you, or they might help you to understand the process so you don’t make significant mistakes.
5. Meta Title and Meta Description Factors
The meta title and meta description are a little less technical than some of the ideas above, but they still fall into the category of “technical SEO.”
“Meta” factors are those that offer insight into the HTML or XHTML file. The meta title is also called the title tag. It’s the text that shows up in the link in the search results.
Try to keep the tag under roughly 60 to 70 characters so that it’s visible in its entirety in the search results. The tag should also include the target keyword that you’ve optimized your content for. At the same time, don’t “spam” the tag with keywords.
In other words, don’t stuff the description with keywords because Google will know what you’re doing.
The meta description is the text that shows up below the link. It’s the description you give for the particular content in question. While the meta description doesn’t play a big role in rankings, it’s an important factor for user experience.
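In the page’s HTML, the two look like this (the title and description text are placeholder examples):

```html
<head>
  <!-- Meta title (title tag): shows as the clickable link in results -->
  <title>Technical SEO Audit Guide | Example Brand</title>
  <!-- Meta description: shows as the snippet below the link -->
  <meta name="description" content="Learn how to run a technical SEO audit, from crawling and sitemaps to page speed.">
</head>
```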
6. Internal Linking and Backlinking
The link structure for your site is one of the most important factors in crawling. It’s through your link network that Google finds your pages.
Both the links that you send out from your site and the links to your site have a lot of importance. Backlinks, or links that come to your site from other pages, are big indications that your site has relevance.
If you don’t have many backlinks, it’s important to try different methods to accumulate some. Sometimes, site owners offer to create content for other sites in exchange for a link back to their own website.
Internal linking is the process of putting links to other pages on your own site into your content. Internal linking strategies help draw users through the sales funnel, direct crawlers to index your site in the best way, and more.
Your internal linking also plays into your page heading hierarchy and your navigation structure. Users navigate your site through links, and a complicated navigation structure impacts the user experience. Google uses site architecture and navigation structure as a ranking factor as well.
External links are those that run from your site to other sites. External link relevancy is an important factor as well. Make sure that you’re linking to respectable sites, as they enhance your own relevance. If you’re linking to websites that provide real value, that will elevate the value of your content.
One thing to watch out for in the process of external linking is toxic links. Toxic links are those that are flagged as spam or could be dangerous to users. Pages that are stuffed with spam or pop-ups typically fall into that category.
Nofollow links are those that you don’t want Google to count when it factors search rankings. They might also exist without you knowing. The only difference between these links is an attribute that says rel="nofollow".
It’s important to check different links to ensure that the anchor tag doesn’t contain that attribute. If the attribute is present, that link will not factor into your SEO campaign at all, and you won’t see any benefit from it.
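In the HTML, the attribute sits on the anchor tag itself (the URL and link text here are examples):

```html
<!-- This link won't pass ranking signals to the target page -->
<a href="https://example.com/" rel="nofollow">Example site</a>
```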
7. Page Speed and Load Time
Page speed is the rate at which your web pages respond and render in real time. Load time is one of the analytics used, and it refers to the amount of time it takes your website to load completely.
There are other factors, like your layout shift score, as well. Your layout shift score measures how much the visible content unexpectedly moves around as new elements load on the page. For example, a chat window opening might push the layout of the entire page around.
A good layout shift score means those changes occur without jarring, unexpected movement.
The best way to notice that there’s something up with your page speed is through Google Analytics. Analytics gives you a myriad of insights, one of them being site speed.
Log into Analytics, then in the left-hand menu select “Behavior,” then “Site Speed,” then “Overview.” Google will notify you of any irregularities or signs of poor performance.
You can also take a look at Google PageSpeed Insights for an additional breakdown of your factors, potential courses of action, and more.
8. Additional Technical SEO Tools
On top of the resources and ideas above, it’s important to note that there are a lot of other resources to work with. Generally speaking, the more tools you have at your disposal, the closer look you’ll get at the state of your SEO.
If you’re just starting a new site or business, the best place to start is Google My Business. There, you’ll manage things like maps, authentication, reviews, and more. It’s not overly technical, but it has to do with the way that Google displays you and your brand.
If you’re someone that’s creating a lot of content, a great optimization tool is Yoast SEO. Yoast gives you the ability to optimize your on-page content in real-time. You get an SEO score when everything is said and done, and you can ensure that the things you produce are well-optimized from the start.
Managing your optimization on all platforms can be a big challenge. It’s made a lot easier with something like SEMRUSH, which gives you different tools for PPC, SEO, content marketing, and more.
If you’re looking to dive deeper into the technical side of things, there are options for you as well. It’s important to have multiple tools that give you different angles on your campaign. Deep Crawl is a great technical SEO tool that can give you key insights.
Looking for More Help With Your Optimization?
Hopefully, our look at a technical SEO audit was helpful to your campaign. There’s a lot more to learn, though, and there’s always more to improve on. We’re here to help you move forward.
Book an intro call with us to see what we can do to tailor your website and drive sales for your business. We’re here to push your site to the place it deserves to be.