What do you do when an SEO client first reaches out to you?
You don’t want to jump into a project without knowing what their website is like.
You want to know their current standing, see how their website is performing, and get an idea of what your team needs to tackle.
That is where an SEO audit report comes in.
What’s an SEO Audit Report?
An SEO audit report is a comprehensive analysis of the current SEO status of a client’s website. It looks deep into the website’s code, content, structure, and overall SEO performance.
It’s the process of evaluating a website’s SEO performance, finding any issues, and giving suggestions for improvement.
An SEO audit report covers areas like:
- Crawlability and Indexing
- Site architecture
- User experience
- Keyword research
- Competitor benchmarking
- Backlink profile
- On-page SEO
It’s essentially an overall “health check” of a client’s website to see how it’s performing and what areas need improvement.
Tools for Conducting an SEO Audit
An effective SEO audit report is based on hard data about the website’s technical health, traffic, and link profile (as well as those of your competitors).
The internet has an endless list of tools that can help you with an SEO audit, such as Google’s Search Console, Ahrefs, Moz Pro, and Screaming Frog.
You can use any of these tools to uncover specific SEO issues related to your client’s website, such as missing meta tags, broken links, duplicate content, and more.
Here are the tools you’ll need:
- Google Search Console (Free Tool): Google Search Console is a free tool that helps you find and fix technical SEO issues. It also provides valuable SEO insights, allowing you to monitor and troubleshoot problems related to your website’s performance on Google search.
- Google Analytics: Google Analytics provides detailed information about your website traffic, such as page views, bounce rate, and user behaviour. It’s a great way to get an overall snapshot of your client’s website performance and uncover insights into their target audience.
- SEMrush or Ahrefs (Preferable – Paid): You can use either of these tools to comprehensively look at your website’s performance on search engine results pages (SERPs). Get great SEO advice and insights into the site’s organic performance and make more informed decisions.
- Screaming Frog (Free Tool): Screaming Frog is a free crawler tool that helps you identify any issues on your website — like broken links, duplicate content, and missing meta tags — that can hurt your SEO. It’s an excellent alternative to SEMrush or Ahrefs if you’re on a tight budget.
- Your Time and Patience: Patience is key because it takes time to analyse the data and piece everything together.
The SEO Audit Report Process:
- On-page SEO audit:
  - Technical errors
- Off-page SEO audit:
  - Backlink profile
  - Social media presence
  - Important metrics
- Search Console audit:
  - Hidden errors and penalties
Writing an SEO Audit Report
Step 1: Is the Site Indexed?
The first thing to do in an SEO audit of any site is to determine whether Google has indexed it. Is Google even picking up the site?
You can do this with a simple Google search query:
site:yourdomain.com (without any space)
It will return a list of all the indexed pages.
Google will even show you the number of pages it has indexed. In our case, it has indexed 6,260 Media One pages so far.
If none of your pages turn up in the SERPs, you have a serious issue: Google has yet to index the site, and may never do so.
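If you run this check across many client sites, you can script the query rather than typing the operator by hand. A minimal sketch in Python (the helper name is ours; the site: operator itself is standard Google syntax):

```python
from urllib.parse import quote_plus

def site_query_url(domain: str) -> str:
    """Build a Google search URL for the site: operator (no space after the colon)."""
    return "https://www.google.com/search?q=" + quote_plus(f"site:{domain}")

# Open this URL in a browser to see every page Google has indexed for the domain.
print(site_query_url("yourdomain.com"))
```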
Step 2: Analyse the Site Architecture
The site architecture is how your website is structured internally. It includes things like the internal linking structure, hierarchy of pages, and usability elements.
The goal is to ensure that users can easily find what they’re looking for on your website.
When you audit the site architecture, you’re looking for things like:
- Does the website have a clear structure and hierarchy?
- Are all the pages easily accessible from any page on the website?
- Do links open in new tabs/windows or redirect users to irrelevant pages?
You can also review the page titles and check if they’re relevant to the content.
You also want to check if the page has enough consistent Call to Action buttons.
Make sure you have one call to action above the fold or in the header and a few more strategically placed throughout the page.
Are the CTAs consistent? Do they make sense with the content?
Step 3: Do the Necessary Pages Exist?
Every website needs a core set of pages, such as an About page, a Contact page, a Privacy Policy, and Terms of Service. So, if the site is missing any of these pages, you might consider creating them.
Step 4: The Look Audit
Does the site look good?
Is it easy to navigate?
Are there any design elements that could be improved?
These are all important questions to consider when auditing a website.
The goal is to ensure the website looks modern, professional, and up-to-date.
Step 5: SSL Audit
Make sure the website is secured with an SSL certificate. That’s an important factor for SEO, as Google has confirmed that HTTPS-secured websites get a slight ranking advantage over those without it.
Check if the certificate is valid and if there are any broken links or errors.
Also, check if the website redirects all HTTP requests to HTTPS.
Simple — start by loading up the website using HTTPS. For example, enter “https://yourdomain.com” in the browser address bar.
If it loads without returning the dreaded “Not Secured” warning, you’re good to go.
If it returns the error, then you need to look into getting an SSL certificate installed on your website.
Step 6: SSL Redirect
Next, replace the https:// with http:// in the browser address bar. The site should redirect back to https://.
Now go a step further and try it with a page on the website. For example, enter “http://yourdomain.com/xyz” in the browser address bar.
If it doesn’t redirect to https://, that shows you have a duplicate copy of your site: a secure one and an unsecured one.
You must remove the duplicate and ensure all requests are redirected to https://.
That’s an important step in securing your website and improving its SEO performance.
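The redirect check above can be scripted with Python’s standard library. A sketch, assuming you run it against your own domain (the live request is commented out because it needs network access):

```python
from urllib.parse import urlparse

def is_https_upgrade(requested_url: str, final_url: str) -> bool:
    """True if an http:// request ended up on the https:// version
    of the same host and path, i.e. the redirect is set up correctly."""
    start, end = urlparse(requested_url), urlparse(final_url)
    return (start.scheme == "http"
            and end.scheme == "https"
            and start.netloc == end.netloc
            and start.path == end.path)

# Live check (needs network): urlopen follows redirects, and the response's
# .url attribute reports the final URL after redirection.
# from urllib.request import urlopen
# resp = urlopen("http://yourdomain.com/xyz")
# print(is_https_upgrade("http://yourdomain.com/xyz", resp.url))
```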
Step 7: Mobile friendly
Mobile devices now account for more than half of all website visits. That means if your site isn’t mobile-friendly, you’re missing out on many potential customers.
To check if the website is optimised for mobile devices, run this search query “Mobile friendly test” in Google.
Google will return a Mobile-Friendly Test tool that you can use to check the website’s responsiveness.
Enter the URL and click “RUN TEST.” The tool will test how well your website works on mobile devices and give you a score and recommendations on what to improve.
Step 8: Page Speed
Website visitors expect your website to load quickly, and if it doesn’t, they will click away.
Google does measure your page speed and, as it turns out, expects pages to load in under 3 seconds.
If yours doesn’t, Google will rank it lower.
To test your website’s page speed, we suggest you use GTmetrix or Pingdom.
Enter the website’s URL and click “Test Your Site”. GTmetrix will run a comprehensive analysis and provide a detailed performance report and suggestions on how to improve your speed.
Pay special attention to the Largest Contentful Paint (LCP), First Contentful Paint (FCP), and Fully Loaded Time (FLT).
Ideally, the LCP score should be below 2.5 seconds, the FCP should be below 1.8 seconds, and the FLT should be under 3 seconds.
Once you have your scores, it’s time to address any issues that could be slowing down your website.
Scroll down to see these issues.
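Once you have the numbers, comparing them against the targets above is mechanical enough to script. A sketch using the thresholds quoted in this article (the metric keys are our own labels, not GTmetrix field names):

```python
# Thresholds from the article: LCP < 2.5 s, FCP < 1.8 s, fully loaded < 3 s.
THRESHOLDS = {"lcp": 2.5, "fcp": 1.8, "fully_loaded": 3.0}

def speed_issues(metrics: dict) -> list:
    """Return the names of metrics (in seconds) that exceed their target."""
    return [name for name, limit in THRESHOLDS.items()
            if metrics.get(name, 0) > limit]

print(speed_issues({"lcp": 3.1, "fcp": 1.2, "fully_loaded": 4.0}))
# flags "lcp" and "fully_loaded"
```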
Step 9: Checking for Indexing Issues
Pages that Google hasn’t yet indexed will never appear in the SERPs. That’s because they’re not in Google’s database.
So, an essential part of any SEO audit is to check for indexing issues. Is Google indexing all the pages on the website? Are there any pages that are blocked from being crawled and indexed?
In Google Search Console, open the Pages report under “Indexing.” There, you can toggle between “Not Indexed” and “Indexed” to see how many pages Google has indexed and not indexed (and the reasons for it).
You’ll see a graph of the pages that Google has been indexing over time. That will help you identify the fluctuations in the indexing.
Below it will be the reasons why Google isn’t indexing your pages. That is a great place to start when looking for SEO issues.
We suggest you go through all the reasons one by one and inspect all the pages in question.
Remember that not all pages need to be indexed. For example, you don’t have to index the thank you pages, login/registration pages, and any other page you deem irrelevant to search engine users.
In other words, having “not indexed” pages is completely normal. You just want to make sure Google is indexing all the pages that you want it to.
Click on each reason to see which pages haven’t been indexed for that particular reason. Then, inspect each page to see what’s causing the indexing issue.
Here are some of the reasons Google may fail to index your pages:
- Duplicate content without user-selected canonical
- Not found (404)
- Excluded by “noindex” tag
- Crawled - currently not indexed
- Pages with redirect
If you find that Google hasn’t indexed a page it should have, you can take steps to fix it. Follow Google’s guidelines to resolve the issue, and once you have, hit the “Validate Fix” button to have the page re-indexed.
Alternatively, click the “Request Indexing” link to request Google to re-index the page.
Step 10: Does the Site Have a Sitemap?
A sitemap is an essential piece of a website’s structure. It helps search engines index pages faster and more accurately, improving your rankings in the SERPs.
If your website doesn’t have a sitemap, then creating one should be at the top of your to-do list.
You can check if the website has a sitemap by looking in the root directory of the domain. Typically, the sitemap.xml file will be located at https://yourdomain.com/sitemap.xml.
If you don’t find the file, check your CMS or access logs to see if the sitemap has been generated.
Alternatively, try typing some common naming conventions, such as “sitemap.xml” and “sitemap_index.xml,” into the browser address bar to see if it exists.
If you still can’t find the sitemap, consider checking it using a tool such as SEO Site Checkup.
If all the above options fail, then perhaps you should consider creating one.
That can be done manually or using a popular plugin such as Yoast SEO for WordPress.
Once your sitemap is up and running, don’t forget to submit it to the search engines via Google Search Console and Bing Webmaster Tools.
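Once you have found a sitemap, you can pull its URLs out programmatically to cross-check them against what Google has indexed. A sketch with Python’s standard XML parser, run against a made-up two-page sitemap:

```python
import xml.etree.ElementTree as ET

# Standard namespace defined by the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_bytes: bytes) -> list:
    """Extract the <loc> entries from a standard sitemap.xml document."""
    root = ET.fromstring(xml_bytes)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

sample = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about/</loc></url>
</urlset>"""
print(sitemap_urls(sample))
```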
Step 11: Robots.txt File
Robots.txt is a text file that tells search engine robots which pages to crawl and which to ignore.
In other words, it controls which parts of the website’s content search engine bots may crawl.
You can check if your website has a robots.txt file by entering “https://yourdomain.com/robots.txt” into your browser’s address bar.
See if the file has disallowed any pages that should be indexed, or allowed any pages that should be blocked from crawling.
For instance, you might find a line like this:
User-agent: *
Disallow: /
In the above example, the robots.txt file tells search engine bots not to crawl any page on the website.
It’s among a few things you want to fix before proceeding with anything.
You also want to ensure the robots.txt file contains your sitemap link.
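You can sanity-check rules like the blanket block above with Python’s built-in robots.txt parser before touching the live file. A quick sketch:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Feed it the example rules as if they came from /robots.txt.
rp.parse(["User-agent: *", "Disallow: /"])

# With "Disallow: /", no page on the site may be crawled:
print(rp.can_fetch("*", "https://yourdomain.com/any-page"))  # False
```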
If your website doesn’t have a robots.txt file, you can create one and upload it to the root directory of your domain name.
Step 12: Check for Canonicalization Issues
Canonicalization is the process of consolidating multiple URLs into one or signalling to search engines which version of a page should be indexed.
For example, if two versions of the same page exist (www.example.com and www.example.com/index.html), a canonical tag can help you signal to search engines which page should be indexed.
This is important to avoid duplicate content issues.
You can use a site like Siteliner to scan the website for duplicate content.
You’ll need to add the canonical tag if the website has multiple page versions.
<link rel="canonical" href="https://example.com/page/" />
This tag tells search engines that the version of the page found at this URL is the original.
That way, Google will ignore the other versions and only index the one specified in the tag.
Alternatively, if you’re using WordPress, you can use a plugin like Yoast SEO to automatically generate the canonical tags.
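To spot-check which canonical URL a page actually declares, you can parse the tag out with Python’s standard html.parser. A sketch run against a one-line sample page:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Record the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

page_html = '<html><head><link rel="canonical" href="https://example.com/page/" /></head></html>'
finder = CanonicalFinder()
finder.feed(page_html)
print(finder.canonical)  # https://example.com/page/
```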
Step 13: Thin Content Pages
Thin content is content that offers too little substance and doesn’t add any real value to the user experience.
Identifying and fixing thin or low-quality pages can help improve your website’s overall SEO performance.
Use tools like Screaming Frog SEO Spider to scan your website for thin pages, after which you want to delete them, add more relevant content, or de-index them.
Note that some pages are meant to be thin, such as contact pages, login pages, or thank you pages.
You want to de-index them, so search engines don’t waste their time indexing them.
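If your crawler exports a word count per URL, flagging thin pages is a one-liner. A sketch; the 300-word floor is our assumption, not an official Google threshold:

```python
def thin_pages(word_counts: dict, minimum: int = 300) -> list:
    """Flag URLs whose body copy falls below a word-count floor.

    The 300-word default is an assumed rule of thumb; adjust it to
    whatever your team treats as 'thin' for this particular site."""
    return [url for url, words in word_counts.items() if words < minimum]

counts = {"/blog/guide": 1850, "/tag/misc": 42, "/contact": 60}
print(thin_pages(counts))  # flags /tag/misc and /contact
```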
Step 14: Check for Broken Links and Redirects
Broken links can be a real pain for SEO. Not only do they make it difficult for search engine bots to crawl your website, but they also harm the user experience and reduce conversions.
You can use a tool like Screaming Frog to check for broken links.
Screaming Frog is a desktop tool (Windows, macOS, Linux) with a free version; it will crawl the website for broken links and other issues.
A 301 redirect means that when a user or bot visits one URL, they’re automatically redirected to another.
These redirects are important because they preserve the website’s link juice and prevent links from landing on 404 pages.
You want to go through all the 301 redirects and ensure they’re redirecting to the right page.
If you find them generating a 404 page, you’ll want to update them and redirect them to the right page.
Another tool you can use to check a site for broken links is Broken Link Checker.
Enter the website’s URL, and it will display a list of all the broken links it finds.
You can then fix or update them to ensure they’re redirecting to the right page.
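Whichever tool produces the status codes, the triage logic is the same. A sketch of how an audit might bucket them (the category names are ours):

```python
def classify_link(status: int) -> str:
    """Bucket an HTTP status code the way a link audit would."""
    if 200 <= status < 300:
        return "ok"
    if status in (301, 308):
        return "permanent redirect"   # fine, but confirm the target is the right page
    if status in (302, 307):
        return "temporary redirect"   # consider making it permanent if it never changes
    if status == 404:
        return "broken"               # fix the link or redirect it
    return "needs review"

print(classify_link(301))  # permanent redirect
print(classify_link(404))  # broken
```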
Step 15: Check for Meta Titles and Meta Descriptions
Meta titles and meta descriptions appear in the search engine results and are a great way to get users to click through to your website.
They should be unique for each page and contain relevant keywords related to that page.
While they don’t directly affect your rankings, they can be a great way to get more clicks and boost your website’s visibility.
You’ll want to ensure the titles and descriptions are optimised for the keywords you’re targeting.
If missing or not optimised correctly, that could lead to fewer clicks and less traffic from search engines.
To check your meta titles and descriptions, you can use Screaming Frog or a similar tool to check for any issues.
With Screaming Frog, just click on “Meta Description” at the top of the page, and it will display all of the titles and descriptions.
In the left column, the tool will highlight the number of missing meta descriptions, duplicate meta descriptions, those that exceed the recommended 155 characters, those below 70 characters, etc.
We suggest you fix them all by creating a unique meta description for each page, ensuring each contains relevant keywords for that particular page, and keeping them between 70 and 155 characters.
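Those length rules are easy to apply in bulk if you export the descriptions from your crawler. A sketch using the 70–155 character window described in this article:

```python
def description_status(description: str, low: int = 70, high: int = 155) -> str:
    """Check a meta description against the 70-155 character window the audit uses."""
    if not description:
        return "missing"
    n = len(description)
    if n < low:
        return "too short"
    if n > high:
        return "too long"
    return "ok"

print(description_status(""))            # missing
print(description_status("Too brief."))  # too short
print(description_status("A" * 120))     # ok
```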
Step 16: Check the H1 and H2 Tags
H1 and H2 tags are used to structure content on a page.
The H1 tag should be the main title of the page and should contain your primary keyword.
The H2 tags should be used for subheadings and should also contain relevant keywords.
These tags help search engines understand what the page is about and how to index it.
Again, use Screaming Frog to check for any potential issues.
First, check which indexable pages are missing H1 tags.
Then, make sure the H1 tags are optimised for the keywords you’re targeting.
You want to go through all the pages one by one, checking to see if each contains a unique H1 tag optimised with a relevant keyword.
The same goes for the H2 tags.
Make sure each page has unique H2 tags, optimised for relevant keywords related to that page.
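You can also extract the headings yourself to check for missing or duplicate H1s. A sketch with Python’s standard html.parser, run against a tiny sample page:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collect the text inside every <h1> and <h2> tag on a page."""
    def __init__(self):
        super().__init__()
        self.headings = {"h1": [], "h2": []}
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag in self.headings:
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.headings[self._current].append(data.strip())

page = "<h1>SEO Audit Guide</h1><p>intro</p><h2>Step 1</h2><h2>Step 2</h2>"
audit = HeadingAudit()
audit.feed(page)
print(audit.headings)
print("missing h1" if not audit.headings["h1"] else "h1 present")
```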
Off-page SEO Audit
Step 17: Backlink Profile Audit
A backlink profile is the number and quality of links pointing to your website from other websites.
Google uses these links to determine how authoritative a website is, so it’s important to ensure it has good-quality, relevant links.
You can check the backlink profile with tools like Ahrefs or Majestic.
You can begin by signing up for an Ahrefs account (the free version will do once you’ve verified the domain).
Once you’re in, you first want to look at the “referring domain” section.
That displays all the domains that are linking to your website. Next to it will be the number of backlinks the site has.
In this example, the site has 104 referring domains and 4.45K backlinks.
That means 104 websites have linked to the site 4,450 times.
Now, the first thing to check is whether that ratio looks natural. 104 sites linking 4,450 times to a site (roughly 43 backlinks per referring domain) doesn’t seem normal, and it could be a sign of unnatural link building.
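The quick arithmetic behind that judgment can be scripted; note there is no official cutoff for what counts as a weird ratio, so treat this as a rough screen:

```python
def links_per_domain(backlinks: int, referring_domains: int) -> float:
    """Average number of backlinks per referring domain."""
    return backlinks / referring_domains

# The example figures above: 104 referring domains, 4,450 backlinks.
ratio = links_per_domain(4450, 104)
print(round(ratio, 1))  # 42.8 -- high enough to warrant a closer look
```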
Next, click on “Backlinks” to see what these sites are.
You can then analyse the quality of these links — if they’re from reputable websites and the anchor text looks natural.
If you find any suspicious-looking backlinks, you can use Ahrefs or Google’s disavow tool to disavow them.
Step 18: How Trustworthy is the Site in the Eyes of Google?
Use MajesticSEO to check how trustworthy the website is in the eyes of Google.
Go to the site and enter the website’s URL.
You’ll see two scores:
Trust Flow and Citation Flow.
The Trust Flow score indicates the website’s trustworthiness, while the Citation Flow measures its influence (the number of backlinks).
A Trust Flow of between 10 and 49 is considered average, while anything above that is considered very good.
Next, you want to look at the Trust and Citation flow ratio.
A ratio of 1:2 or something thereabout is ideal.
For example, if the trust flow is 14, the citation flow should be around 28.
If the ratio is far off, like 10:50 or 20:80, it could mean some suspicious backlinks are pointing to the site.
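The ratio check can be expressed directly. A sketch; the 0.3 floor for flagging is our assumption based on the examples above, not a Majestic rule:

```python
def trust_ratio(trust_flow: int, citation_flow: int) -> float:
    """Trust Flow divided by Citation Flow; ~0.5 (a 1:2 ratio) is the target here."""
    return trust_flow / citation_flow

def looks_suspicious(trust_flow: int, citation_flow: int, floor: float = 0.3) -> bool:
    """Flag profiles where trust lags influence badly (0.3 floor is an assumption)."""
    return trust_ratio(trust_flow, citation_flow) < floor

print(looks_suspicious(14, 28))  # False: 0.5 is the ideal 1:2 ratio
print(looks_suspicious(10, 50))  # True: 0.2 suggests suspicious backlinks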
Step 19: Citation Audit for Local Directories
Citations are mentions of your business name, address, and phone number (NAP) in local directories.
These citations help search engines determine the accuracy of the information being provided by a business, as well as its relevance to certain locations.
These NAP details must be consistent across different local listing sites.
You can use tools like Whitespark or BrightLocal to do a citation audit.
These tools will give you an overview of where your business is listed and if there are any discrepancies in the NAP details.
Step 20: Analytics and Conversion Tracking
The next step is checking if the website has installed analytics and conversion tracking.
Analytics will help you understand how users interact with the website, how much traffic it’s getting, and where it comes from.
Go to Google Analytics and follow the instructions provided to set up analytics.
Then, set the reporting date range to the last 90 days.
The first thing you want to look at is the bounce rate. Ideally, the bounce rate should be between 26 and 40%.
If it’s too high, that means people are leaving the website without engaging with it further, and you’ll need to optimise the site for engagement and conversions.