In technical SEO, the more improvements you can implement on your site, the better. Each fix strengthens the technical foundations of your website and, therefore, its odds of climbing up the Google SERPs (search engine results pages).
To identify potential areas for improvement, we use many different tools. These include our favorite, Screaming Frog, as well as automated cloud-based site crawlers from Ahrefs, Moz, and Semrush, to name a few. But as convenient as they are, we don’t normally rely on fully automated SEO audits because of the blind spots and false positives these tools tend to return. There are also areas of technical SEO that only Google can help you with. This is where Google Search Console comes in.
In this article, we’ll teach you where to find the GSC Coverage report as well as how to use it to address common technical and on-page SEO issues.
What is the Google Search Console?
The Google Search Console (GSC) allows you to look at your site’s performance from Google’s perspective. It offers overviews of a site’s technical and on-page elements and can be used to perform SEO audits of a target site. The Google Search Console also offers recommendations on how to address issues that it uncovers during an SEO audit.
The GSC offers several types of reports, but the Coverage report (also known as the Index Coverage report) is probably the most useful. It tells you which pages on a site are crawled or indexed successfully, which pages return errors, and which pages are excluded for various reasons.
Why is the Google Search Console Coverage Report Important?
When beginning an SEO campaign, it’s important to check the site’s GSC Coverage report, as it will give you an idea of how Google views the site. If the site has any technical or on-page SEO deficiencies, the Coverage report should help you identify priority areas to start working on.
If you’re in charge of maintaining a site, you should also pull up the GSC Coverage report periodically to catch new errors as well as unintended indexing or non-indexing of pages. Checking the report regularly is especially important for very large websites that are constantly being updated, as individually checking hundreds or even thousands of pages for crawlability may be impossible.
Note that errors, warnings, as well as unintended exclusions and inclusions are practically impossible to avoid in large, frequently-updated websites. A small number of errors relative to a large number of pages is not usually a cause for concern.
However, if the number of problematic URLs is allowed to build up, they will eventually degrade a site’s technical SEO characteristics, causing it to fall behind its competitors on Google Search. For this reason, SEOs and site administrators should try to check their site’s GSC Coverage report and address any serious issues at least once a month.
How to Pull Up the GSC Coverage Report
Below are the basic steps for pulling up the Google Search Console Coverage report. For a walkthrough of how to find and use the report, check out the video above starting at 3:04.
- Log in to the Google Search Console.
- If you’re on the GSC homepage on a desktop device, the Coverage report should be available on the menu column at the left side of your screen, under “Index”.
- You should be presented with a report containing four main tabs named Error, Valid with warning, Valid, and Excluded.
How to Handle Issues Found Through the GSC Coverage Report
The Coverage report will flag a site’s URLs under the Error, Valid with warning, Valid, and Excluded tabs. Below are some of the common issues you’ll find under each category as well as the possible fixes you can implement to resolve them.
Error
URLs in this category should be immediately addressed, as Google may view sites with a high number of errors as technically problematic, which may result in poor performance on Google Search. Common errors you’ll find flagged in the Error tab include but are not limited to the following:
- Server Error (5xx)
Solution #1: Block the affected URLs using robots.txt to have Google stop crawling those pages (see the robots.txt sketch after this list).
Solution #2: Have your web developer turn these pages into hard 404s, thereby excluding them from search (see the Nginx sketch after this list).
- Submitted URL seems to be a Soft 404
Solution: Turn these pages into hard 404s.
- Submitted URL not found (404)
Solution: If the page is a legitimate deletion, you can leave the URL alone. Google’s bots should eventually “forget” the page and drop it from the index.
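To make the first two solutions above more concrete, here is a minimal sketch of each, assuming the failing pages live under a hypothetical /old-app/ path and the site runs on Nginx; adjust the paths and server configuration to your own setup.

```
# robots.txt — stop crawlers from fetching a section that keeps returning 5xx errors
# (/old-app/ is a hypothetical path; substitute the URLs that are actually failing)
User-agent: *
Disallow: /old-app/
```

```
# Nginx (inside the relevant server {} block) — serve hard 404s for the same
# hypothetical section so Google drops those pages from its index
location /old-app/ {
    return 404;
}
```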
After fixing the flagged URLs, click the “Validate Fix” button. This signals to Google that you have attempted to fix the issues it has flagged. Validation can take anywhere from two weeks to a few months. Even if you don’t validate your fixes, though, Google should eventually update the statuses of your site’s URLs.
Valid with warning
These pages are considered valid but with potential issues that may negatively impact the site’s performance on Google Search.
One of the most commonly flagged “Valid with warning” issues is indexed dynamic URLs. When this happens, we usually have the web developer set the dynamic URLs to automatically carry a “noindex” meta tag (a minimal example follows below). Over time, Google should stop indexing the site’s dynamic URLs, preserving link equity and helping the site perform better on Google Search.
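As a minimal sketch, the tag we’d have the developer add to the templates behind those dynamic URLs looks like this; how it gets injected depends entirely on the CMS or framework in use:

```
<!-- Added to the <head> of templates that render dynamic (parameterized) URLs -->
<meta name="robots" content="noindex">
```

For non-HTML resources, the equivalent X-Robots-Tag: noindex HTTP response header accomplishes the same thing.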
Valid
So-called valid pages may sometimes be problematic. For example, crawled pages that are not in the sitemap may be flagged as “Indexed, not submitted in sitemap”. In these cases, the first thing to do is to update the sitemap to include the page (a sample sitemap entry is shown below). If there are multiple pages with this status, investigate the URLs further to see if there are other underlying issues.
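For reference, adding the missing page to the sitemap usually just means appending another <url> entry inside the <urlset> element, along these lines (the URL and date below are placeholders):

```
<!-- sitemap.xml — placeholder URL and date for illustration only -->
<url>
  <loc>https://www.example.com/newly-added-page/</loc>
  <lastmod>2024-01-01</lastmod>
</url>
```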
Excluded
Exclusions are often intentional. For instance, pages with hard 404s, pages blocked with robots.txt, canonicalized pages, pages with noindex tags, and pages with redirects are usually excluded on purpose to preserve link equity (see the example canonical tag below).
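To illustrate the canonicalized-page case, a duplicate or parameterized page typically carries a canonical tag like the one below, which is exactly why Google excludes it on purpose (the URL is a placeholder):

```
<!-- On the duplicate or parameterized page, pointing search engines at the preferred URL -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```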
However, you’ll want to look into excluded URLs that return soft 404s as well as those that have been crawled but not indexed; the latter may indicate a qualitative issue with the site’s content. Likewise, duplicate pages should be investigated further. You could also consider removing excluded duplicate pages from the site’s XML sitemap, particularly if they’re flagged with canonical-related issues.
Final Thoughts
Being familiar with the Google Search Console’s Coverage report will give you the ability to quickly identify site indexing issues that need your attention. However, it’s also important to understand what the various issues flagged by the Coverage report mean in context.
Understanding the issues flagged by the GSC Coverage report and knowing how to address them helps ensure optimal search visibility, conservation of link equity, and efficient use of time and other resources. For these reasons, SEO professionals should make it a habit to check the report at the start of a campaign and at least once a month afterward, or even more frequently if necessary.
If you want to learn more about our technical SEO practices or how we use widely available SEO tools to help our clients dominate Google Search, feel free to get in touch with our team.