Index Coverage report

Definition

The term Index Coverage report refers to a core feature within Google Search Console (GSC) designed to show website owners the indexing status of their website’s pages as known to Google. As of early 2025, this functionality is primarily located within the Pages report, found under the Indexing section in the GSC sidebar, though many SEOs still refer to it by its legacy name “Index Coverage.”

This report details which pages from your site Google has successfully added to its index (making them eligible to appear in search results) and which pages it discovered but did not index, along with the specific reasons why. It categorizes URLs into statuses like:

  • Indexed: Pages successfully indexed.
  • Not indexed: Pages Google knows about but hasn’t indexed, broken down by reasons such as:
    • Server error (5xx)
    • Redirect error
    • Blocked by robots.txt
    • Excluded by ‘noindex’ tag
    • Not found (404)
    • Crawled – currently not indexed
    • Discovered – currently not indexed
    • Duplicate without user-selected canonical
    • Duplicate, Google chose different canonical than user
    • Page with redirect
    • Alternate page with proper canonical tag

Essentially, the Pages (Index Coverage) report acts as a vital diagnostic tool, helping website owners, SEOs, and developers understand how Googlebot interacts with their site and identify technical issues that might prevent important content from being indexed and found by users in Google Search.
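
For a single URL, the coverage verdict shown in this report can also be queried programmatically through the Search Console URL Inspection API. Below is a minimal sketch, assuming you already hold an OAuth 2.0 access token with the `webmasters.readonly` scope (token acquisition is omitted, and both URLs are placeholders for your own verified property and page):

```python
import requests

# Assumption: ACCESS_TOKEN is a valid OAuth 2.0 token for the
# https://www.googleapis.com/auth/webmasters.readonly scope.
ACCESS_TOKEN = "ya29.example-token"  # placeholder
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

payload = {
    "inspectionUrl": "https://www.example.com/some-page/",  # page to check
    "siteUrl": "https://www.example.com/",  # must match a verified GSC property
}

resp = requests.post(
    ENDPOINT,
    json=payload,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# coverageState mirrors the report's wording,
# e.g. "Submitted and indexed" or "Crawled - currently not indexed".
index_status = resp.json()["inspectionResult"]["indexStatusResult"]
print("Verdict:         ", index_status.get("verdict"))
print("Coverage state:  ", index_status.get("coverageState"))
print("Google canonical:", index_status.get("googleCanonical"))
```

The API is quota-limited per property per day, so it complements the report for spot checks rather than replacing it for site-wide analysis.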

Is It Still Relevant?

Absolutely. The Pages report (incorporating the former Index Coverage data) is arguably one of the most critical tools within Google Search Console for technical SEO and website health monitoring in 2025. Its relevance stems from several key points:

  • Foundation of Visibility: A page must be indexed before it can rank. This report provides direct feedback from Google on whether this fundamental step is occurring for your site’s URLs.
  • Error Detection: It’s the primary place to discover technical errors that prevent indexing, such as accidental `robots.txt` blocks, `noindex` tags on important pages, widespread 404 errors, or server issues impacting Googlebot (a quick spot-check for the first two is sketched after this list).
  • Troubleshooting Indexing Issues: When valuable content isn’t appearing in search, this report is the first place to check its indexing status and diagnose the potential cause directly from Google’s perspective.
  • Validating Technical SEO Changes: After implementing technical fixes (like removing a `noindex` tag or fixing canonicals), this report (along with the “Validate Fix” feature) helps confirm if Google has recognized and processed the changes.
  • Content Quality Insights: Reasons like “Crawled – currently not indexed” can sometimes indicate that Google crawled the page but deemed it not valuable or unique enough to index, prompting content quality reviews.
  • Scale Management: For large websites with thousands or millions of URLs, this report is indispensable for understanding indexing patterns and identifying systemic issues at scale.
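
The two accidental blocks mentioned under “Error Detection” (a `robots.txt` disallow and a stray `noindex`) can be spot-checked for any single URL without waiting for a recrawl. A minimal sketch using `requests` and `BeautifulSoup`; the URL is a placeholder, and a real audit would also check the `googlebot` meta tag and follow redirects:

```python
import requests
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def check_indexability(url: str, user_agent: str = "Googlebot") -> None:
    """Spot-check robots.txt blocking and noindex directives for one URL."""
    # 1. Is the URL disallowed by robots.txt?
    parsed = urlparse(url)
    rp = RobotFileParser(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
    rp.read()
    print("robots.txt allows crawling:", rp.can_fetch(user_agent, url))

    # 2. Does the page carry a noindex directive in its meta tag or headers?
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=30)
    header_directives = resp.headers.get("X-Robots-Tag", "")
    meta = BeautifulSoup(resp.text, "html.parser").find(
        "meta", attrs={"name": "robots"}
    )
    meta_directives = meta.get("content", "") if meta else ""
    noindex = "noindex" in f"{header_directives} {meta_directives}".lower()
    print("HTTP status:", resp.status_code)
    print("noindex directive present:", noindex)

check_indexability("https://www.example.com/important-page/")  # placeholder
```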

Monitoring and understanding this report is essential for anyone serious about maintaining or improving their website’s organic search visibility.

Real-world Context

SEOs and website managers use the Pages (Index Coverage) report in GSC for various practical tasks:

  • Post-Launch Checks: After launching a new website or a new section, check the report to confirm that the new URLs move from “Discovered” or other “Not indexed” statuses to “Indexed.”
  • Identifying Accidental Blocks: Noticing a surge in pages under “Blocked by robots.txt” helps quickly identify and rectify incorrect rules in the `robots.txt` file that might be blocking important content.
  • Diagnosing Missing Pages: If a key page disappears from search results, checking its status here might reveal it’s now a “Not found (404)” or “Excluded by ‘noindex’ tag,” pointing directly to the problem.
  • Detecting Canonicalization Issues: Seeing many pages listed under “Duplicate, Google chose different canonical than user” indicates potential conflicts in canonical signals that need investigation and correction.
  • Monitoring Site Migrations: Tracking the decline of old URLs under “Page with redirect” and the corresponding rise of new URLs under “Indexed” provides crucial feedback during a website migration.
  • Finding Server Problems: A spike in “Server error (5xx)” suggests potential hosting or server configuration issues affecting Googlebot’s ability to crawl the site; a bulk re-check like the sketch after this list can confirm whether the errors persist.
  • Assessing Content Value Perception: A large number of pages falling into “Crawled – currently not indexed” might signal to Google that these pages lack sufficient quality or uniqueness, prompting a content audit and improvement strategy.
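
Several of these tasks reduce to verifying the HTTP status codes behind the report’s labels. A minimal sketch that re-checks a list of URLs (for example, exported from the “Not found (404)” or “Server error (5xx)” details view) and flags the ones still failing; the export file name is a placeholder:

```python
import csv
import requests

USER_AGENT = "Mozilla/5.0 (compatible; coverage-recheck/1.0)"

def recheck(urls):
    """Return (url, status) pairs for URLs that respond with 404 or 5xx."""
    problems = []
    for url in urls:
        try:
            # HEAD is cheaper; fall back to GET where the server rejects it.
            resp = requests.head(url, headers={"User-Agent": USER_AGENT},
                                 allow_redirects=False, timeout=15)
            if resp.status_code == 405:
                resp = requests.get(url, headers={"User-Agent": USER_AGENT},
                                    allow_redirects=False, timeout=15)
            if resp.status_code == 404 or resp.status_code >= 500:
                problems.append((url, resp.status_code))
        except requests.RequestException as exc:
            problems.append((url, f"request failed: {exc}"))
    return problems

# "exported-urls.csv" is a placeholder for a one-column export from the report.
with open("exported-urls.csv", newline="") as fh:
    urls = [row[0] for row in csv.reader(fh) if row]

for url, status in recheck(urls):
    print(status, url)
```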

Background

The functionality provided by the current Pages (Indexing) report evolved over time within Google’s tools for webmasters.

  • Early Days (Google Webmaster Tools): Initial versions offered more basic reports, primarily focusing on explicit crawl errors like 404s and server errors, along with separate reporting on blocked URLs and index counts.
  • Introduction of Index Coverage (c. 2017-2018): Google launched the dedicated “Index Coverage” report as part of a major overhaul of Google Search Console. This was a significant upgrade, consolidating various indexing signals into a single, more comprehensive interface. It aimed to give a clearer picture of *all* known URLs and their status (valid, warning, error, excluded).
  • Integration into Pages Report (Recent Years): In ongoing efforts to streamline GSC, Google integrated the detailed indexing status information directly into the main **Pages** report, accessible under the “Indexing” section of the navigation. While the dedicated “Coverage” name is less prominent in the UI, the core data and functionality remain. This allows users to see indexing status alongside other page-related data more easily.

Throughout its evolution, the core purpose has remained consistent: to provide webmasters with transparent, actionable data about how Google crawls and indexes their website, enabling them to identify and resolve technical issues affecting search visibility.

What to Focus on Today

To effectively use the GSC Pages (Indexing) report in 2025, follow these best practices:

  • Schedule Regular Reviews: Monitor the report consistently – weekly for dynamic sites, monthly for more static ones. Look for sudden changes in the number of indexed pages or spikes in specific “Not indexed” reasons.
  • Address Errors First: Prioritize investigating and fixing URLs listed under critical error statuses like “Server error (5xx),” “Not found (404),” “Excluded by ‘noindex’ tag” (if unintentional), and “Blocked by robots.txt” (if unintentional), as these directly prevent indexing of potentially important pages.
  • Analyze Excluded Pages: Don’t ignore the other “Not indexed” categories. Understand *why* pages are excluded:
    • “Crawled/Discovered – currently not indexed”: May indicate quality issues, duplication, or crawl prioritization. Consider content improvement or technical fixes (like internal linking).
    • Duplicate statuses: Investigate canonical tag implementation and content similarity.
    • “Page with redirect” / “Alternate page…”: Usually expected if redirects and canonicals are set up correctly, but worth spot-checking.
  • Utilize the URL Inspection Tool: For any specific URL listed in the report, click the magnifying glass icon next to it to use the URL Inspection tool. This provides granular details about Google’s last crawl, the detected canonical, and the indexing status for that single page.
  • Leverage “Validate Fix”: Once you believe you’ve fixed the underlying cause for a group of errors (e.g., removed a bad `robots.txt` rule causing blocks), go into the specific issue details in the report and click “Validate Fix.” This signals Google to recrawl and re-evaluate the affected URLs. Monitor the validation progress.
  • Cross-Check with Sitemaps: Compare the number of indexed URLs reported here with the URLs submitted via your XML sitemaps (in the Sitemaps report) to ensure Google is aware of and processing your intended pages (a simple comparison script is sketched after this list).
  • Filter and Export Data: Use the report’s filtering options (e.g., filter by sitemap, filter by status) to analyze specific sections of your site. Export data for deeper analysis in spreadsheets if needed, especially for large sites.
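
As a sketch of the sitemap cross-check above, the following compares the `<loc>` entries in an XML sitemap against a one-column export of indexed URLs from the report. The sitemap location and export file name are placeholders, and it assumes a flat `<urlset>` rather than a sitemap index file:

```python
import csv
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
INDEXED_EXPORT = "indexed-urls.csv"  # placeholder export from the Pages report
LOC_TAG = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

# Collect every <loc> entry declared in the sitemap.
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=30).content)
sitemap_urls = {loc.text.strip() for loc in root.iter(LOC_TAG)}

# Load the indexed URLs exported from the report.
with open(INDEXED_EXPORT, newline="") as fh:
    indexed_urls = {row[0].strip() for row in csv.reader(fh) if row}

missing = sorted(sitemap_urls - indexed_urls)
print(f"{len(sitemap_urls)} URLs in sitemap, {len(indexed_urls)} indexed")
print(f"{len(missing)} sitemap URLs missing from the indexed export:")
for url in missing[:20]:  # print a sample
    print(" ", url)
```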

Effectively using the Pages (Indexing) report is fundamental to proactive technical SEO, helping ensure your valuable content is accessible to Google and eligible to rank.
