Google Search Console Guide
Google Search Console (GSC) is the essential tool for monitoring your site's search performance, indexing status, and technical issues. Master it to improve SEO systematically.
1) What is Google Search Console?
GSC is a free tool from Google that lets you:
- Monitor how Google crawls and indexes your site
- See which search queries bring traffic
- Identify and fix technical SEO issues
- Submit sitemaps and request indexing
- Track Core Web Vitals and mobile usability
- Analyze backlinks and internal linking
Unlike Google Analytics (which tracks user behavior on your site), GSC focuses on how your site appears in Google Search.
2) Why GSC Matters for SEO
GSC provides data you can't get anywhere else:
- Real search data: See actual queries people used to find your site
- Indexing insights: Know exactly which pages Google has indexed (and why others haven't been)
- Early warnings: Get alerts for coverage errors, security issues, manual actions
- Performance tracking: Monitor Core Web Vitals directly from Google's perspective
Bottom line: if you care about Google search traffic, GSC is mandatory.
3) Setup and Verification
To start using GSC, you need to verify ownership of your site.
Step 1: Add Property
Go to search.google.com/search-console and add a new property. Choose:
- Domain property: Covers all subdomains and protocols (requires DNS verification)
- URL prefix: Single protocol and subdomain (e.g., https://www.example.com)
Recommended: Use Domain property if you control DNS, otherwise use URL prefix.
Step 2: Verify Ownership
Common verification methods:
- HTML file upload: Upload a verification file to your root directory
- HTML meta tag: Add a meta tag to your homepage <head>
- DNS record: Add a TXT record to your domain (required for Domain property)
- Google Analytics: If GA is already set up with the same Google account
- Google Tag Manager: If GTM is installed with the same account
For developers, the HTML meta tag is usually the easiest:
<meta name="google-site-verification" content="YOUR_VERIFICATION_CODE" />
Step 3: Submit Sitemap
Once verified, submit your sitemap (e.g., https://example.com/sitemap.xml) under Sitemaps in the left menu.
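If you don't already have a sitemap, the file itself is simple: an XML list of your canonical URLs. Below is a minimal sketch that generates one with Python's standard library; the page URLs are placeholders for your own.

import xml.etree.ElementTree as ET

# Placeholder URLs; replace with your site's real canonical pages.
PAGES = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog/first-post",
]

# <urlset> is the sitemap root; each page gets a <url><loc>...</loc></url> entry.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in PAGES:
    ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = page

# Write the file you will submit under Sitemaps in GSC.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

Regenerate the file whenever pages are added or removed, and keep the submitted sitemap URL stable.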
4) Key Features
4.1) Performance Report
Shows how your site performs in Google Search over the last 16 months.
- Clicks: How many times users clicked your result
- Impressions: How many times your URL appeared in search results
- CTR (Click-Through Rate): Clicks / Impressions. Higher CTR = better title/description.
- Average Position: Your average ranking for queries. Lower number = better.
How to use it:
- Filter by query to see which keywords drive traffic
- Filter by page to see how individual URLs perform
- Compare date ranges to track growth or declines
- Find queries where you rank 11-20 (page 2) — easy opportunities to optimize
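To surface those position 11-20 queries outside the UI, you can export the Queries table as CSV and filter it. A rough sketch follows; the file name and column headers (Top queries, Clicks, Impressions, Position) are assumptions and may differ in your export.

import csv

# Assumed export file and column names; adjust to match your actual CSV.
with open("Queries.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

def num(value):
    # Some exports format numbers with thousands separators.
    return float(value.replace(",", ""))

# Keep queries ranking on page 2, highest visibility first.
page_two = [r for r in rows if 11 <= num(r["Position"]) <= 20]
page_two.sort(key=lambda r: num(r["Impressions"]), reverse=True)

for r in page_two[:20]:
    print(r["Top queries"], num(r["Position"]), r["Impressions"], r["Clicks"])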
4.2) Coverage (Index Status)
Shows which URLs Google has indexed and any indexing errors.
Four categories:
- Error: Pages with critical issues (404, server error, etc.)
- Valid with warnings: Indexed but with minor issues
- Valid: Successfully indexed
- Excluded: Not indexed (could be intentional or a problem)
Common exclusion reasons:
- Noindex tag: Intentional (robots meta)
- Crawled - currently not indexed: Low quality or duplicate content
- Duplicate, Google chose different canonical: Google picked a different URL as canonical
- Blocked by robots.txt: Check your robots.txt file
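For the robots.txt case, a quick local check can tell you whether a URL is blocked before you dig further. The sketch below uses Python's standard-library robots.txt parser, which is not identical to Google's parser, so confirm the result with URL Inspection; the URLs are placeholders.

from urllib.robotparser import RobotFileParser

# Placeholder domain; point this at your own robots.txt.
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetches and parses the file

for url in ["https://example.com/", "https://example.com/private/page"]:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(f"{url}: {verdict} for Googlebot")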
4.3) Sitemaps
View submitted sitemaps and their indexing status.
- See how many URLs were discovered vs. indexed
- Get errors if sitemap has issues (invalid XML, unreachable URLs, etc.)
- Submit multiple sitemaps (main, news, image, video)
Tip: If your sitemap has 1000 submitted URLs but only 100 indexed, investigate the Coverage report for issues.
4.4) URL Inspection
Inspect individual URLs to see:
- Whether the URL is indexed
- Coverage status and indexing errors
- Canonical URL Google chose
- Mobile usability issues
- Structured data (Schema) detected
You can also click "Test Live URL" to see how Googlebot renders the page right now (useful for debugging CSR/SSR issues).
After fixing issues, use "Request Indexing" to ask Google to re-crawl. Note: this doesn't guarantee immediate indexing, but it helps.
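If you need this check for many URLs, the URL Inspection method of the Search Console API exposes the same data programmatically. The sketch below uses the google-api-python-client library and assumes a service account that has been added as a user on the property; the property and URL values are placeholders, and the exact response fields may vary.

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Assumes a service account key file and that the account was added
# as a user on the property in GSC; values below are placeholders.
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "inspectionUrl": "https://example.com/blog/first-post",
    "siteUrl": "sc-domain:example.com",  # or "https://www.example.com/" for a URL-prefix property
}
response = service.urlInspection().index().inspect(body=body).execute()

# Field names may differ slightly; inspect the raw response if these are missing.
status = response.get("inspectionResult", {}).get("indexStatusResult", {})
print(status.get("coverageState"), status.get("googleCanonical"))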
4.5) Mobile Usability
Reports mobile-specific issues:
- Text too small to read
- Clickable elements too close together
- Content wider than screen
- Mobile viewport not set
Fix these to improve mobile rankings (Google uses mobile-first indexing).
4.6) Core Web Vitals
Shows LCP, INP, and CLS performance for your URLs, categorized as Good / Needs Improvement / Poor.
- Data is grouped by similar pages (e.g., all blog posts)
- Based on real user data (CrUX report)
- Click through to see specific URLs with issues
For more details, check our Core Web Vitals guide.
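If you want the underlying field data outside of GSC, the same CrUX dataset is available through the Chrome UX Report API. A rough sketch follows; it assumes you have created an API key in Google Cloud, and the origin is a placeholder.

import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder; create one in Google Cloud
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

body = json.dumps({
    "origin": "https://example.com",  # placeholder origin
    "metrics": [
        "largest_contentful_paint",
        "interaction_to_next_paint",
        "cumulative_layout_shift",
    ],
}).encode("utf-8")

req = urllib.request.Request(ENDPOINT, data=body, headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    record = json.load(resp)["record"]

# p75 is the value compared against the Good / Needs Improvement / Poor thresholds.
for name, metric in record["metrics"].items():
    print(name, "p75 =", metric["percentiles"]["p75"])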
4.7) Links Report
Shows:
- External links: Which sites link to you and which pages they link to
- Internal links: Which pages on your site link to each other
Use this to:
- Identify your most-linked pages (often your strongest content)
- Find orphan pages (pages with few or no internal links; a sketch for spotting them follows this list)
- Monitor backlink growth
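One rough way to spot orphan pages is to compare your sitemap's URLs with the internally linked pages exported from the Links report. The sketch below assumes a local sitemap.xml and a CSV export with a "Target page" column; both names are placeholders for whatever your export actually contains.

import csv
import xml.etree.ElementTree as ET

# Collect URLs from the sitemap (file name is a placeholder).
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {
    loc.text.strip()
    for loc in ET.parse("sitemap.xml").getroot().findall("sm:url/sm:loc", NS)
}

# Internally linked pages from the Links report export; the file name and
# "Target page" column header are assumptions.
with open("internal_links.csv", newline="", encoding="utf-8") as f:
    linked_urls = {row["Target page"].strip() for row in csv.DictReader(f)}

for url in sorted(sitemap_urls - linked_urls):
    print("possible orphan:", url)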
5) Troubleshooting Common Issues
Issue: "Crawled - currently not indexed"
Cause: Google crawled the page but decided not to index it (low quality, duplicate content, thin content).
Fix:
- Improve content quality and uniqueness
- Add more internal links to the page
- Check if canonical is set correctly
- Ensure the page provides unique value
Issue: "Submitted URL not found (404)"
Cause: A URL in your sitemap returns 404.
Fix: Remove the URL from your sitemap or fix the broken link.
Issue: "Duplicate, Google chose different canonical"
Cause: Your page has a canonical tag pointing to itself, but Google thinks another URL is the canonical version (usually due to near-duplicate content).
Fix: Either consolidate near-duplicate pages or send stronger, consistent canonical signals (matching canonical tags, internal links, and sitemap URLs). Check URL Inspection to see which URL Google chose.
Issue: "Server error (5xx)"
Cause: Your server returned a 500-level error when Googlebot tried to crawl.
Fix: Check server logs, fix the error, and request re-indexing.
Issue: "Blocked by robots.txt"
Cause: Your robots.txt file blocks Googlebot from accessing the URL.
Fix: Update robots.txt to allow crawling of important pages. Don't block critical CSS/JS needed for rendering.
6) Advanced Tips
Use filters to find quick wins
In Performance report, filter by Position 11-20. These are keywords where you're on page 2 — small improvements can move you to page 1 and drive significant traffic.
Compare date ranges
Compare this month vs. last month (or year-over-year) to spot trends. Look for sudden drops — they often indicate technical issues or algorithm updates.
Export data for deeper analysis
Click "Export" in Performance or Coverage reports to download data as CSV/Excel. Combine with Google Analytics data for full funnel analysis.
Set up email alerts
GSC will email you about critical issues (manual actions, security issues, coverage errors). Make sure alerts go to the right team.
Use Search Console API
For large sites or automated reporting, use the Search Console API to pull data programmatically.
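A rough sketch of such a pull with the google-api-python-client library is below; it assumes a service account that has been granted access to the property in GSC, and the property name and date range are placeholders.

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # placeholder key file
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "startDate": "2024-01-01",  # placeholder date range
    "endDate": "2024-01-31",
    "dimensions": ["query"],
    "rowLimit": 100,
}
response = service.searchanalytics().query(
    siteUrl="sc-domain:example.com", body=body  # placeholder property
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"], round(row["position"], 1))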
Monitor competitors (if you have access)
If you manage multiple sites, compare performance across properties to identify what works.