A technical SEO audit is the most important diagnostic work you can do for an underperforming website. It doesn’t matter how good your content is or how many backlinks you’ve built — if your technical foundation is broken, you’re fighting uphill every single day. This technical SEO audit checklist covers 80 specific points across every critical technical domain. It’s the same framework top agencies use when they take over a site and need to understand what’s actually going wrong — and what to fix first.
Crawlability and Indexation (Points 1–15)
Before anything else, Google needs to be able to find, access, and index your pages. Crawlability failures are the most expensive technical SEO problems because they make every other optimization invisible.
Robots.txt Configuration
- robots.txt exists and is accessible at /robots.txt
- No critical pages or directories blocked — check for accidental blocks on /wp-admin/admin-ajax.php, CSS, JS, or product/category pages
- Sitemap URL declared in robots.txt
- Crawl-delay directive appropriate for server capacity (if used)
- AI crawler permissions — GPTBot, ClaudeBot, PerplexityBot access confirmed or intentionally denied
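The robots.txt checks above can be spot-tested programmatically. A minimal sketch using Python's standard-library `urllib.robotparser`, with a hypothetical robots.txt and example.com URLs standing in for your own:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt. Allow precedes Disallow here because Python's
# parser applies the first matching rule (Google uses longest-match).
ROBOTS_TXT = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/

User-agent: GPTBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# admin-ajax.php is explicitly allowed; the rest of /wp-admin/ is not.
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/settings"))        # False
# GPTBot is intentionally denied site-wide in this sketch.
print(rp.can_fetch("GPTBot", "https://example.com/blog/post"))                   # False
```

For a live audit you would point the parser at the real file with `rp.set_url("https://yourdomain.com/robots.txt")` followed by `rp.read()` instead of parsing a string.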
XML Sitemap
- XML sitemap exists and is submitted in Google Search Console
- Sitemap contains only indexable URLs — no noindex pages, redirects, or 4xx errors in sitemap
- Sitemap is auto-updated on content changes
- Sitemap file size under 50MB and under 50,000 URLs per file
- Sitemap index used for large sites with multiple sitemap files
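Several of these sitemap checks can be automated with the standard library. A sketch against a hypothetical sitemap fragment (the real check would fetch each `<loc>` and verify it returns 200 and is indexable):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Hypothetical sitemap fragment for illustration.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/post-1</loc></url>
  <url><loc>http://example.com/old-page</loc></url>
</urlset>"""

def audit_sitemap(xml_text: str, max_urls: int = 50_000) -> dict:
    """Count URLs, flag the per-file limit, and surface non-HTTPS locs."""
    root = ET.fromstring(xml_text)
    locs = [el.text.strip() for el in root.findall("sm:url/sm:loc", NS)]
    return {
        "url_count": len(locs),
        "over_limit": len(locs) > max_urls,
        "non_https": [u for u in locs if not u.startswith("https://")],
    }

report = audit_sitemap(SITEMAP_XML)
print(report["url_count"])   # 3
print(report["non_https"])   # ['http://example.com/old-page']
```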
Google Search Console Health
- No “Discovered — currently not indexed” spike in Coverage report
- No “Crawled — currently not indexed” patterns indicating quality or duplicate issues
- Crawl budget not exhausted on large sites — verify in Crawl Stats report
- No server errors (5xx) in Coverage report
- All important pages confirmed indexed via site:domain.com or URL Inspection tool
Site Architecture and URL Structure (Points 16–25)
Architecture determines how link equity flows through your site and how clearly you communicate topical relationships to search engines.
URL Structure
- URLs are clean and descriptive — no dynamic parameters on indexable pages where avoidable
- URL depth under 4 clicks from homepage for all important pages
- Consistent URL format — all lowercase, hyphens not underscores, no trailing slash inconsistency
- No URL parameter issues creating duplicate content — Google retired the GSC URL Parameters tool, so handle parameters with canonical tags, robots.txt rules, and consistent internal linking
- Subdomain vs. subdirectory decision validated — blog and resources on main domain, not subdomain
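Format-consistency rules like these are easy to encode and enforce in a build step. A small sketch; the specific policy (lowercase, hyphens, no trailing slash) is an assumption, and the point is to pick one policy and apply it everywhere:

```python
def normalize_url_path(path: str) -> str:
    """Enforce lowercase, hyphens instead of underscores,
    and no trailing slash (root path excepted)."""
    path = path.lower().replace("_", "-")
    if len(path) > 1:
        path = path.rstrip("/")
    return path

print(normalize_url_path("/Blog/My_First_Post/"))  # /blog/my-first-post
```

Running every URL in your sitemap through a function like this and diffing the output against the original list surfaces inconsistencies in one pass.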
Internal Linking
- Homepage links to all major category/pillar pages
- Pillar pages link to all relevant cluster content
- No orphan pages — every indexable page has at least one internal link
- Internal link anchor text is descriptive and varies appropriately
- Pagination implemented correctly — rel="next"/"prev" is deprecated as an indexing signal; use self-referencing canonicals on each paginated URL
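Orphan detection falls out of any crawl once you have a link graph. A sketch over a hypothetical four-page site, where the graph maps each page to the internal links it contains:

```python
def find_orphans(link_graph: dict[str, set[str]], all_pages: set[str]) -> set[str]:
    """Pages in the crawl that no other page links to (homepage excluded,
    since nothing needs to link to it for it to be reachable)."""
    linked = set()
    for source, targets in link_graph.items():
        linked |= targets
    return all_pages - linked - {"/"}

pages = {"/", "/pillar", "/cluster-a", "/forgotten-page"}
links = {"/": {"/pillar"}, "/pillar": {"/cluster-a"}}
print(find_orphans(links, pages))  # {'/forgotten-page'}
```

In practice the `pages` set comes from your sitemap or CMS export and the graph from a Screaming Frog crawl export, so the two sources cross-check each other.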
Duplicate Content and Canonicalization (Points 26–35)
Duplicate content dilutes ranking signals and confuses search engines about which page to rank. This is one of the most common technical SEO failures on large sites.
- Canonical tags present on all indexable pages
- Self-referencing canonicals correct — canonical points to the page itself, not a different version
- No conflicting canonicals and noindex on the same page
- www vs. non-www resolved to single canonical version with 301 redirect
- HTTP vs. HTTPS resolved — all HTTP URLs 301 redirect to HTTPS
- Trailing slash consistency — pick one and enforce with redirects
- Pagination pages not creating duplicate content
- Category/tag archive pages managed — noindex or canonicalized as appropriate
- Boilerplate content not triggering duplicate flags — unique enough across templates
- Hreflang implemented correctly for multilingual/multi-regional sites
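The www/HTTPS/trailing-slash items above all reduce to a single canonicalization rule that every request should redirect through at most once. A sketch of that rule as a pure function; the chosen policy (HTTPS, non-www, no trailing slash) is an assumption you would swap for your own:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_redirect(url: str):
    """Return the single 301 target for a request URL,
    or None if the URL is already canonical."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    # Strip trailing slash except on the root path.
    path = parts.path if parts.path in ("", "/") else parts.path.rstrip("/")
    target = urlunsplit(("https", host, path or "/", parts.query, ""))
    return None if target == url else target

print(canonical_redirect("http://www.example.com/blog/"))  # https://example.com/blog
print(canonical_redirect("https://example.com/blog"))      # None
```

Because the function goes straight to the final form, it also guarantees the "no redirect chains" requirement: http + www + trailing slash resolves in one hop, not three.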
Page Speed and Core Web Vitals (Points 36–52)
Core Web Vitals are a confirmed ranking factor and a direct signal of user experience quality. Failing these isn’t just a technical problem — it’s a competitive disadvantage.
Largest Contentful Paint (LCP)
- LCP under 2.5 seconds on mobile and desktop
- LCP element identified and is an image or text block, not a low-priority resource
- LCP image preloaded with <link rel="preload"> where applicable
- Server response time (TTFB) under 600ms
- Render-blocking resources eliminated — defer non-critical JS, inline critical CSS
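Render-blocking scripts can be flagged statically before you ever open DevTools. A rough sketch with the standard-library HTML parser, checking for `<script src>` tags in `<head>` without `defer`, `async`, or `type="module"` (a heuristic, not a full audit — dynamically injected scripts won't appear in the source):

```python
from html.parser import HTMLParser

class RenderBlockingScripts(HTMLParser):
    """Collect src attributes of potentially render-blocking scripts in <head>."""
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.blocking = []
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "head":
            self.in_head = True
        elif tag == "script" and self.in_head and "src" in a:
            if "defer" not in a and "async" not in a and a.get("type") != "module":
                self.blocking.append(a["src"])
    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

html = """<html><head>
<script src="/analytics.js"></script>
<script src="/app.js" defer></script>
</head><body></body></html>"""
p = RenderBlockingScripts()
p.feed(html)
print(p.blocking)  # ['/analytics.js']
```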
Cumulative Layout Shift (CLS)
- CLS under 0.1 on mobile and desktop
- Image dimensions explicitly set in HTML to prevent layout shift
- Ads and embeds have reserved space — no surprise layout shifts on load
- Web fonts loaded with font-display: swap to prevent invisible text
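The image-dimension check is equally mechanical: any `<img>` without explicit `width` and `height` is a layout-shift candidate. A sketch using the same standard-library parser:

```python
from html.parser import HTMLParser

class MissingDimensions(HTMLParser):
    """Collect <img> tags lacking explicit width/height attributes."""
    def __init__(self):
        super().__init__()
        self.missing = []
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "img" and ("width" not in a or "height" not in a):
            self.missing.append(a.get("src", "?"))

p = MissingDimensions()
p.feed('<img src="/hero.webp" width="1200" height="630"><img src="/team.jpg">')
print(p.missing)  # ['/team.jpg']
```

Run this over rendered page source (not just templates) so images injected by the CMS or plugins are covered too.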
Interaction to Next Paint (INP)
- INP under 200ms on mobile
- Long tasks identified and broken up in JavaScript execution
- Third-party scripts audited — remove or defer non-critical third-party JS
- Total page weight under 2MB on critical pages
General Performance
- Images compressed and in next-gen formats (WebP or AVIF)
- Browser caching enabled with appropriate cache-control headers
- CDN implemented for global audience sites
- Gzip or Brotli compression enabled on server
Structured Data and Schema Markup (Points 53–62)
Schema markup is non-negotiable for modern SEO. It enables rich results, improves AI citation eligibility, and communicates content meaning to search engines with precision.
- Organization schema on homepage with name, logo, social profiles, founding date
- Article schema on all blog/content pages with author, datePublished, dateModified
- Person schema for all named authors with credentials and sameAs links
- BreadcrumbList schema on all non-homepage pages
- FAQ schema on pages with Q&A sections (Google now shows FAQ rich results mainly for authoritative government and health sites, but the markup still aids machine understanding)
- HowTo schema on instructional/step-by-step content (HowTo rich results were deprecated by Google in 2023; the markup remains valid structured data)
- Product schema with price, availability, and review data on product pages
- LocalBusiness schema for local SEO sites with address and hours
- SiteLinksSearchBox schema for brand search on large sites (Google retired the sitelinks search box rich result in late 2024, so treat this as optional)
- No schema validation errors — test all schema in Google’s Rich Results Test
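Generating schema from structured data in your CMS, rather than hand-writing JSON-LD per page, is the easiest way to keep it valid. A minimal Article sketch (the field values are hypothetical; always confirm output in the Rich Results Test):

```python
import json

def article_schema(headline, author, published, modified):
    """Minimal Article JSON-LD; extend with image, publisher, etc. as needed."""
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": published,
        "dateModified": modified,
    }

snippet = json.dumps(article_schema(
    "Technical SEO Audit Checklist", "Jane Doe",
    "2025-01-15", "2025-03-02"), indent=2)
print(snippet)
```

The resulting JSON goes inside a `<script type="application/ld+json">` tag in the page head; generating it server-side means `dateModified` stays accurate automatically.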
Mobile and HTTPS (Points 63–70)
Mobile-first indexing means Google crawls and indexes your mobile version primarily. HTTPS is a baseline requirement. Both are table stakes that still fail on surprising numbers of sites.
- Mobile-responsive design confirmed — no horizontal scrolling, appropriately sized tap targets
- Text legible on mobile without zooming — font size minimum 16px for body text
- No mobile-specific interstitials blocking content (penalized by Google)
- Mobile page speed passing Core Web Vitals independently from desktop
- HTTPS implemented site-wide with valid SSL certificate
- SSL certificate not expiring within 30 days — set up auto-renewal
- No mixed content warnings — all resources loaded over HTTPS
- HSTS header implemented to prevent SSL stripping attacks
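The certificate-expiry check is a one-liner to automate once you have the certificate's `notAfter` value (available from `ssl.SSLSocket.getpeercert()` on a live connection). A sketch of the date math, with the current time passed in explicitly so the check is deterministic:

```python
from datetime import datetime, timezone

def days_until_expiry(not_after: str, now: datetime) -> int:
    """Days remaining on a cert given its notAfter string
    in OpenSSL format, e.g. 'Jun  1 12:00:00 2025 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - now).days

now = datetime(2025, 5, 12, tzinfo=timezone.utc)
print(days_until_expiry("Jun  1 12:00:00 2025 GMT", now))  # 20
```

Wire this into monitoring with an alert threshold of 30 days and the "expiring certificate" item audits itself.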
On-Page Technical Factors (Points 71–80)
These on-page technical elements directly impact how search engines understand and rank your content.
- Unique title tags on every page — 50–60 characters, primary keyword near the front
- Unique meta descriptions on every page — 150–160 characters, action-oriented
- Single H1 per page that matches search intent for target keyword
- Heading hierarchy logical — H2s for major sections, H3s for subsections
- Image alt text present and descriptive on all meaningful images
- No broken internal or external links — crawl with Screaming Frog quarterly
- 301 redirects for all changed URLs — no 302 redirects for permanent moves
- No redirect chains longer than 1 hop
- Open Graph and Twitter Card tags on all shareable content
- JavaScript rendering not blocking critical content — key content in HTML source, not JS-rendered
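Title length and H1 count are the most crawl-friendly of these checks. A sketch that extracts both from raw HTML source, which doubles as a test for the last item: if the title and H1 aren't in the HTML source, they're being JS-rendered:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Extract the <title> text and count <h1> elements in a page."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.title = ""
        self.h1_count = 0
    def handle_starttag(self, tag, attrs):
        self.stack.append(tag)
        if tag == "h1":
            self.h1_count += 1
    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
    def handle_data(self, data):
        # Only capture text while inside <title>.
        if self.stack and self.stack[-1] == "title":
            self.title += data

p = OnPageAudit()
p.feed("<html><head><title>Technical SEO Audit Checklist: 80 Points</title>"
       "</head><body><h1>Technical SEO Audit Checklist</h1></body></html>")
print(len(p.title), p.h1_count)  # 40 1 — inside the 50–60 target? Flag if not.
```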
Need a Professional Technical SEO Audit?
Our technical SEO team has run audits for hundreds of sites across every industry. We find the issues that free tools miss and deliver a prioritized fix list tied to revenue impact — not a 200-page report nobody reads.
How to Prioritize Your Technical SEO Fixes
Running through 80 audit points will produce a list of issues. Not all issues are equal. Use this prioritization framework to focus effort where it generates the most impact.
Tier 1: Revenue-Blocking Issues (Fix Immediately)
Crawlability failures, mass deindexation, HTTPS issues, and Core Web Vitals failures on high-traffic pages are Tier 1. These have direct, immediate impact on rankings and revenue. Drop everything else and fix these first.
Tier 2: Authority Leakage (Fix Within 30 Days)
Broken internal links, redirect chains, duplicate content issues, and missing canonical tags are Tier 2. They’re not immediately catastrophic but they compound over time, bleeding authority and creating confusion for search engines. Address these systematically after Tier 1 is resolved.
Tier 3: Opportunity Capture (Fix Within 90 Days)
Schema markup gaps, image optimization opportunities, open graph tags, and structured data enhancements are Tier 3. They’re not causing active harm but fixing them captures available ranking and visibility opportunities. Build these into your regular content workflow rather than treating them as emergency fixes.
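The tiering above can be encoded so audit output arrives pre-sorted. A sketch with a hypothetical issue-to-tier mapping (the issue names are illustrative; map your own crawler's finding types):

```python
# Hypothetical issue-to-tier mapping reflecting the framework above.
TIERS = {
    "crawl_block": 1, "deindexation": 1, "https_failure": 1, "cwv_failure": 1,
    "broken_links": 2, "redirect_chain": 2, "duplicate_content": 2, "missing_canonical": 2,
    "schema_gap": 3, "image_optimization": 3, "og_tags": 3,
}

def prioritize(found_issues: list) -> list:
    """Sort audit findings so Tier 1 (revenue-blocking) work surfaces first;
    unknown issue types default to Tier 3."""
    return sorted(found_issues, key=lambda issue: TIERS.get(issue, 3))

print(prioritize(["schema_gap", "redirect_chain", "crawl_block"]))
# ['crawl_block', 'redirect_chain', 'schema_gap']
```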
Tools Required for a Complete Technical SEO Audit
No single tool covers all 80 points. Here’s the toolkit that professional agencies use:
- Screaming Frog SEO Spider: Crawl analysis, redirect mapping, duplicate content detection, broken link identification
- Google Search Console: Index coverage, Core Web Vitals field data, mobile usability, rich results status
- Google PageSpeed Insights / Lighthouse: Lab-based Core Web Vitals measurement and performance recommendations
- Chrome DevTools: JavaScript rendering audit, network waterfall analysis, coverage report for unused CSS/JS
- Ahrefs or Semrush: Backlink audit, organic traffic data, keyword ranking correlation with technical issues
- Schema Markup Validator / Rich Results Test: Structured data validation and rich results eligibility testing
- GTmetrix or WebPageTest: Filmstrip view of page loading, waterfall chart for resource analysis
Frequently Asked Questions About Technical SEO Audits
How often should I run a technical SEO audit?
A comprehensive technical SEO audit should be performed at least quarterly for active websites, and immediately after any major site changes — CMS migration, redesign, URL restructuring, or hosting changes. Automated monitoring tools like ContentKing or Botify can provide continuous auditing between full manual reviews.
What’s the most important technical SEO factor?
Crawlability and indexability are the foundation — if Google can’t crawl and index your pages, nothing else matters. After that, Core Web Vitals (especially LCP and INP on mobile) have the most direct confirmed impact on rankings. Fix crawl and indexation issues first, then performance, then structured data and on-page factors.
How long does a technical SEO audit take?
For a professional agency, a thorough technical SEO audit typically takes 20-40 hours depending on site size and complexity. Small sites (under 1,000 pages) can be audited in 10-15 hours. Enterprise sites with 100,000+ pages require specialized tooling and may take 80+ hours. The 80-point checklist above can be worked through systematically with the right tools in place.
Can I run a technical SEO audit myself or do I need an agency?
You can perform a basic technical SEO audit yourself using Google Search Console, PageSpeed Insights, and a free Screaming Frog crawl (up to 500 URLs). However, interpreting crawl data, prioritizing fixes correctly, and catching advanced issues like JavaScript rendering problems, hreflang errors, or crawl budget waste typically requires experienced technical SEO expertise. The cost of getting it wrong — particularly on Tier 1 issues — usually exceeds the cost of professional help.
What’s the difference between a technical SEO audit and an SEO audit?
A technical SEO audit focuses exclusively on the technical infrastructure of a website: crawlability, indexation, site speed, schema markup, URL structure, and canonical management. A full SEO audit also includes content quality analysis, keyword strategy assessment, backlink profile evaluation, and competitive analysis. Technical SEO is a subset of overall SEO, but it’s the foundation everything else depends on.
How do I know if my technical SEO issues are causing ranking drops?
Correlate technical issue discovery with ranking and traffic data in Google Search Console and your analytics platform. Strong signals that technical issues are the cause include a sudden drop in indexed pages that matches a traffic drop on the same timeline, Core Web Vitals failures that coincide with ranking declines in the same period, and growing “Discovered — currently not indexed” counts in crawl coverage reports while organic traffic stagnates.