A technical SEO audit without a systematic checklist is a gut-feel exercise. You’ll catch obvious problems, miss the ones that actually cost you rankings, and deliver recommendations you cannot prioritize or defend. After running technical SEO audits across thousands of client sites, from SMBs to enterprise, this is the 80-point technical SEO audit checklist we actually use.
Everything here is actionable. Every item maps to a specific tool, command, or check. Priority levels (Critical / High / Medium) tell you what to fix first. Work through this systematically and you’ll have a complete technical health picture of any site.
Crawlability and Indexation (20 Points)
Robots.txt (5 Points)
- [Critical] Robots.txt accessible at domain.com/robots.txt
- [Critical] No critical pages or directories incorrectly disallowed
- [High] XML sitemap URL(s) declared in robots.txt
- [High] No conflicting directives for the same paths
- [Medium] Wildcard patterns used correctly (no regex errors)
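A quick way to verify the first two items is Python’s built-in robots.txt parser: feed it the live file plus a list of must-crawl URLs and confirm nothing important is blocked. This is a minimal sketch — the domain, paths, and `check_robots` helper are illustrative placeholders, not part of any standard tooling:

```python
from urllib.robotparser import RobotFileParser

def check_robots(robots_txt: str, must_be_crawlable: list) -> list:
    """Return the URLs from must_be_crawlable that Googlebot cannot fetch."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [url for url in must_be_crawlable
            if not parser.can_fetch("Googlebot", url)]

# hypothetical robots.txt content fetched from domain.com/robots.txt
robots = """User-agent: *
Disallow: /search/
Disallow: /cart/

Sitemap: https://example.com/sitemap.xml
"""

blocked = check_robots(robots, [
    "https://example.com/",           # homepage must stay crawlable
    "https://example.com/products/",  # key directory
    "https://example.com/search/q",   # internal search -- expected to be blocked
])
```

Running this against every template-level URL pattern after each robots.txt deploy catches accidental disallows before Googlebot does.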
XML Sitemap (5 Points)
- [Critical] XML sitemap submitted to Google Search Console and Bing Webmaster Tools
- [Critical] Sitemap contains only canonical, indexable URLs (no redirects, no noindex pages)
- [High] Sitemap `<lastmod>` dates are accurate and dynamic (update on content change)
- [High] Sitemap image and video extensions implemented where applicable
- [Medium] Sitemap files under 50MB uncompressed and 50,000 URLs each (split into a sitemap index if needed)
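The size limit and the missing-`lastmod` check above can be automated with the standard library’s XML parser. A sketch, assuming a standard urlset sitemap (the `audit_sitemap` helper is illustrative, not an existing tool):

```python
import xml.etree.ElementTree as ET

# Official sitemap namespace from the sitemaps.org protocol
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def audit_sitemap(xml_text: str, max_urls: int = 50_000):
    """Return (url_count, urls_missing_lastmod) for a urlset sitemap."""
    root = ET.fromstring(xml_text)
    entries = root.findall("sm:url", NS)
    if len(entries) > max_urls:
        raise ValueError("sitemap exceeds 50,000 URLs -- split it")
    missing = [e.findtext("sm:loc", namespaces=NS)
               for e in entries if e.find("sm:lastmod", NS) is None]
    return len(entries), missing

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2024-05-01</lastmod></url>
  <url><loc>https://example.com/blog/post</loc></url>
</urlset>"""

count, missing = audit_sitemap(sitemap)
```

Extend the same loop to HEAD-request each `<loc>` and flag redirects or noindexed URLs, which covers the Critical item above.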
Crawl Budget and Indexation (5 Points)
- [Critical] No critical pages returning noindex tag or header
- [Critical] Correct pages in index (Google Search Console Coverage report — no unexpected excluded/indexed URLs)
- [High] Infinite crawl traps eliminated (faceted navigation, session IDs, calendar archives)
- [High] Pagination implemented correctly (no orphaned page-2+ URLs without internal links)
- [Medium] Internal search result pages noindexed or excluded via robots.txt
Canonicalization (5 Points)
- [Critical] Canonical tag present on all pages pointing to preferred URL version
- [Critical] www vs. non-www resolved — one version returns 301 to the other
- [Critical] HTTP to HTTPS redirect in place; no mixed content on canonical HTTPS pages
- [High] Trailing slash consistency enforced across the site
- [High] Parameter URLs that do not change content handled via canonical tags or robots.txt rules (GSC’s URL Parameters tool was retired in 2022)
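The www/HTTPS/trailing-slash items above all collapse into a single normalization policy that every URL on the site should resolve to. A minimal sketch — the www preference and the trailing-slash rule are per-site policy choices, not universal requirements:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str, prefer_www: bool = True) -> str:
    """Normalize a URL to the site's single canonical form:
    https scheme, one host variant, consistent trailing slash,
    query parameters stripped."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    path = parts.path or "/"
    # policy choice: trailing slash on directory-style paths (no file extension)
    if not path.endswith("/") and "." not in path.rsplit("/", 1)[-1]:
        path += "/"
    return urlunsplit(("https", host, path, "", ""))

result = canonicalize("http://Example.com/blog?utm_source=x")
# -> "https://www.example.com/blog/"
```

During a crawl, any URL where `canonicalize(url) != url` should either 301 to its canonical form or carry a canonical tag pointing to it.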
Technical Infrastructure (15 Points)
HTTPS and Security (4 Points)
- [Critical] SSL certificate valid, not expired, covers all subdomains
- [Critical] No mixed content errors (HTTP resources on HTTPS pages)
- [High] HSTS header implemented
- [Medium] Security headers present (X-Content-Type-Options, X-Frame-Options, CSP)
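Checking the HSTS and security-header items is a simple set difference over the response headers. A sketch, assuming you already have the headers as a dict (from curl output, a crawler export, or an HTTP client):

```python
# Headers from the checklist above; header names are case-insensitive per HTTP
REQUIRED_HEADERS = {
    "strict-transport-security",   # HSTS
    "x-content-type-options",
    "x-frame-options",
    "content-security-policy",
}

def missing_security_headers(response_headers: dict) -> set:
    """Compare a response's headers against the security checklist."""
    present = {name.lower() for name in response_headers}
    return REQUIRED_HEADERS - present

missing = missing_security_headers({
    "Content-Type": "text/html",
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Content-Type-Options": "nosniff",
})
# missing -> CSP and X-Frame-Options
```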
Server Configuration (4 Points)
- [Critical] Time to first byte (TTFB) under 200ms
- [High] GZip/Brotli compression enabled on HTML, CSS, JS, SVG
- [High] Browser caching implemented for static assets (long cache TTLs)
- [Medium] CDN in place for global audience delivery
Redirect Chains and Errors (4 Points)
- [Critical] No redirect chains longer than one hop (A→B→C should be A→C)
- [Critical] 404 error pages returning correct 404/410 HTTP status (not 200 “soft 404”)
- [High] Custom 404 page with navigation to help users find content
- [High] All 301 redirects reviewed — no outdated redirects pointing to pages that have moved again
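Redirect chains are easiest to find by resolving a redirect map offline rather than re-requesting every URL. A sketch over a source→target map (the kind Screaming Frog exports); the `redirect_chains` helper is illustrative:

```python
def redirect_chains(redirect_map: dict) -> dict:
    """For each source URL, follow the redirect map and report the hop chain.
    Any chain longer than one hop should be collapsed to a direct 301."""
    chains = {}
    for src in redirect_map:
        hops, seen, cur = [], {src}, src
        while cur in redirect_map:
            cur = redirect_map[cur]
            if cur in seen:          # A -> B -> A: flag the loop and stop
                hops.append("LOOP")
                break
            seen.add(cur)
            hops.append(cur)
        if len(hops) > 1:
            chains[src] = hops
    return chains

# hypothetical redirect map exported from a crawl
chains = redirect_chains({
    "/old-page": "/interim-page",
    "/interim-page": "/final-page",
    "/legacy": "/final-page",
})
# only /old-page is a multi-hop chain (A -> B -> C)
```

The fix list falls straight out of the result: point each flagged source directly at the last hop in its chain.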
URL Structure (3 Points)
- [High] URLs are clean, readable, keyword-relevant — no session IDs or dynamic parameters in canonical URLs
- [High] URL depth shallow enough that important pages are within 3 clicks from homepage
- [Medium] Hyphens used as word separators (not underscores)
Core Web Vitals and Page Speed (15 Points)
Core Web Vitals are a confirmed Google ranking factor. Our technical SEO audits consistently find them to be the highest-impact performance improvements for competitive SERPs.
LCP — Largest Contentful Paint (4 Points)
- [Critical] LCP under 2.5 seconds on mobile (measured with field data in CrUX, not just lab data)
- [Critical] LCP element is a preloaded image or fast-rendering text (not a background-image)
- [High] Hero images served in modern formats (WebP/AVIF) with appropriate sizing
- [High] LCP resource has a fetchpriority="high" attribute
INP — Interaction to Next Paint (4 Points)
- [Critical] INP under 200ms (replaced FID as Core Web Vital in 2024)
- [High] Long tasks in main thread identified and broken up
- [High] Third-party scripts (chat, analytics, ads) not blocking interaction
- [Medium] Input handlers using passive event listeners where appropriate
CLS — Cumulative Layout Shift (4 Points)
- [Critical] CLS score under 0.1 on mobile
- [Critical] All images and embeds have explicit width/height attributes set
- [High] Ads, cookie banners, and dynamic insertions not causing layout shifts
- [Medium] Web fonts loaded with font-display: swap and preconnect for font origins
General Performance (3 Points)
- [High] JavaScript rendering not required for critical above-the-fold content
- [High] Render-blocking CSS and JS eliminated or deferred
- [Medium] Total page weight under 3MB for typical page (lower for mobile)
On-Page Technical Elements (10 Points)
Title Tags and Meta Descriptions (3 Points)
- [Critical] Every page has a unique title tag (50-60 characters) — check for duplicates in GSC
- [High] Meta descriptions unique per page (150-160 characters) — not duplicate or missing
- [High] No title/description truncation issues — verify with SERP simulator
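Duplicate and out-of-range titles can be flagged in bulk from any crawl export that maps URLs to title tags. A sketch with hypothetical URLs and a `title_issues` helper of our own naming:

```python
from collections import defaultdict

def title_issues(pages: dict, min_len: int = 30, max_len: int = 60):
    """pages maps URL -> title tag. Flags duplicates and length problems."""
    by_title = defaultdict(list)
    length_flags = {}
    for url, title in pages.items():
        title = title.strip()
        by_title[title].append(url)
        if not (min_len <= len(title) <= max_len):
            length_flags[url] = len(title)
    duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
    return duplicates, length_flags

dupes, lengths = title_issues({
    "/a": "Widgets | Acme",
    "/b": "Widgets | Acme",
    "/c": "Blue Widgets for Industrial Use | Acme Manufacturing",
})
# /a and /b share a title and are also too short; /c passes
```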
Heading Structure (3 Points)
- [High] Single H1 per page matching the primary content topic
- [High] H2-H6 hierarchy logical — no jumping from H2 to H4
- [Medium] Headings contain semantic keyword variations (not identical to title tag)
Image Optimization (4 Points)
- [High] All images have descriptive alt text (not keyword-stuffed, not empty on informational images)
- [High] Images served via responsive srcset for different screen sizes
- [High] Lazy loading implemented for below-the-fold images (loading="lazy")
- [Medium] Image file names are descriptive (not IMG_4521.jpg)
Structured Data / Schema Markup (10 Points)
- [Critical] Organization or LocalBusiness schema on homepage with sameAs properties
- [Critical] BreadcrumbList schema on all inner pages
- [High] Article/BlogPosting schema with author Person schema on all content pages
- [High] FAQPage schema on key informational pages
- [High] Product + Offer + AggregateRating schema on e-commerce product pages
- [High] VideoObject schema on all pages with embedded video
- [High] No schema validation errors in Google’s Rich Results Test
- [Medium] LocalBusiness schema for each location on multi-location sites
- [Medium] WebSite schema with potentialAction (note: Google retired the sitelinks search box rich result in late 2024)
- [Medium] Schema properties match visible page content (no hidden or misleading schema claims)
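Generating schema programmatically (rather than hand-writing JSON-LD per template) is the easiest way to keep it valid. A sketch for the BreadcrumbList item above, using only schema.org’s documented `BreadcrumbList`/`ListItem` types; the URLs and the `breadcrumb_jsonld` helper are illustrative:

```python
import json

def breadcrumb_jsonld(trail: list) -> str:
    """trail is a list of (name, url) tuples from homepage to current page.
    Emits a schema.org BreadcrumbList as a JSON-LD string."""
    data = {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }
    return json.dumps(data, indent=2)

snippet = breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Blog", "https://example.com/blog/"),
    ("Technical SEO Audit Checklist", "https://example.com/blog/audit/"),
])
```

Embed the output in a `<script type="application/ld+json">` tag, then confirm it in Google’s Rich Results Test — generation does not replace validation.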
Internal Linking Architecture (5 Points)
- [Critical] No orphaned pages (pages with zero internal links pointing to them)
- [High] Pillar pages receive disproportionately high internal link equity
- [High] Anchor text descriptive and keyword-relevant (not “click here”)
- [High] No broken internal links (crawl with Screaming Frog or Ahrefs)
- [Medium] Internal link depth — important pages within 3 clicks of homepage
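Both the orphan check and the click-depth check fall out of one breadth-first search over the internal link graph. A sketch over a crawler-exported adjacency map (the page URLs and `link_audit` helper are hypothetical):

```python
from collections import deque

def link_audit(link_graph: dict, homepage: str):
    """link_graph maps each URL to the URLs it links to.
    Returns (orphans, depth) where depth is clicks from the homepage."""
    all_pages = set(link_graph) | {t for ts in link_graph.values() for t in ts}
    linked_to = {t for ts in link_graph.values() for t in ts}
    orphans = all_pages - linked_to - {homepage}

    # BFS from the homepage gives minimum click depth per page
    depth, queue = {homepage: 0}, deque([homepage])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return orphans, depth

orphans, depth = link_audit({
    "/": ["/products/", "/blog/"],
    "/products/": ["/products/widget/"],
    "/blog/": [],
    "/old-landing/": [],   # no inbound links -> orphaned
}, homepage="/")
```

Pages with `depth > 3` (or missing from `depth` entirely, meaning unreachable from the homepage) are the ones to surface with new internal links.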
International and Multi-Language SEO (5 Points)
- [Critical] Hreflang tags implemented correctly for multi-language/multi-region sites
- [Critical] Each page carries a self-referencing hreflang annotation consistent with its canonical tag
- [High] Return tags implemented — every hreflang points back to every other language version
- [High] x-default tag present for the fallback page
- [Medium] Language/region alternate pages are indexed and accessible (not blocked by robots.txt)
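Return-tag and self-reference errors are the two most common hreflang failures, and both are checkable from a crawl export of each page’s annotations. A sketch — `hreflang_errors` and the example URLs are illustrative:

```python
def hreflang_errors(annotations: dict) -> list:
    """annotations maps each URL to its declared {lang: url} hreflang set
    (self-reference included). Every URL a page points to must point back."""
    errors = []
    for url, langs in annotations.items():
        if url not in langs.values():
            errors.append(f"{url}: missing self-referencing hreflang")
        for lang, target in langs.items():
            back = annotations.get(target, {})
            if url not in back.values():
                errors.append(f"{target}: no return tag for {url}")
    return errors

errors = hreflang_errors({
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "de": "https://example.com/de/"},
    # the de page omits its en alternate -> broken return tag
    "https://example.com/de/": {"de": "https://example.com/de/"},
})
```

Because Google ignores hreflang pairs without valid return tags, a single missing annotation silently disables the whole language pairing — which is why this check is worth automating across every locale.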
Mobile and Accessibility (5 Points)
- [Critical] Mobile usability verified via Lighthouse or Chrome DevTools device emulation (Google’s standalone Mobile-Friendly Test was retired in December 2023)
- [Critical] No intrusive interstitials on mobile (pop-ups blocking content on page load)
- [High] Touch targets minimum 48x48px with adequate spacing
- [High] Viewport meta tag correctly set
- [Medium] Accessible color contrast ratios (WCAG 2.1 AA minimum)
JavaScript SEO: The Technical SEO Audit Checklist Extension
Modern sites built on JavaScript frameworks (React, Vue, Angular, Next.js) require additional technical SEO audit checklist items beyond the standard 80 points. JavaScript SEO is a discipline of its own — and failures here can silently bury otherwise well-optimized sites.
The core issue: Googlebot renders JavaScript, but not instantly. There’s a delay between when Googlebot first crawls a page and when it renders the JavaScript. During this window, any content rendered by JavaScript is invisible. For sites where key SEO content (H1 tags, body copy, internal links, structured data) is injected by JavaScript rather than served in the initial HTML response, this creates indexation delays and ranking instability.
The audit question: load your page URLs with JavaScript disabled in your browser. What content is visible? If your H1 tag, main body content, and primary navigation links disappear, you have a JavaScript SEO problem. The fix is server-side rendering (SSR) or static generation for critical content — which is why modern frameworks like Next.js default to server rendering for exactly this reason.
Internal links generated by JavaScript are crawled but less reliably than HTML links. Navigation menus built as React components, infinite scroll pagination that loads links dynamically, and modal dialogs that reveal content are all crawlability risks. Audit these specifically — use a JavaScript-enabled crawler (Screaming Frog with Chrome rendering, or Botify) alongside a non-JS crawl and compare the link graphs. The differences reveal your JavaScript SEO exposure.
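The link-graph comparison can be sketched with the standard library’s HTML parser: extract anchors from the raw HTML response and from the rendered DOM, then diff the sets. Here both snapshots are inline strings for illustration; in practice the rendered HTML comes from a headless browser or a JavaScript-enabled crawler export:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.add(href)

def extract_links(html: str) -> set:
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# raw HTML response: the #app div is empty until JavaScript runs
raw_html = '<nav><a href="/about/">About</a></nav><div id="app"></div>'
# rendered DOM after JavaScript execution (e.g. from a headless browser)
rendered_html = ('<nav><a href="/about/">About</a></nav>'
                 '<div id="app"><a href="/products/">Products</a></div>')

js_only_links = extract_links(rendered_html) - extract_links(raw_html)
```

Every URL in `js_only_links` is reachable only through JavaScript rendering — that set is your crawlability exposure.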
Structured data injected by JavaScript is processed by Google but should be verified with the URL Inspection tool in Google Search Console, which shows the rendered DOM — what Google actually sees after rendering. Schema in the initial HTML response is always more reliable than schema injected after page load.
Log File Analysis: What Most Technical Audits Miss
Log file analysis is the most underutilized component of a technical SEO audit checklist. While most audits rely on crawlers and GSC data, server log files tell you the ground truth of how Googlebot is actually behaving on your site — what it is crawling, how frequently, and where it is getting stuck.
Access your server logs (Apache access logs, Nginx access logs, or CDN logs from Cloudflare, Fastly, etc.) and filter for Googlebot user agent. What you’re looking for: which URLs Googlebot visits most frequently, which pages it never visits, where it encounters errors, and whether crawl budget is being wasted on low-value URLs like faceted navigation, pagination, or parameter variations.
Common log file discoveries that change audit priorities: Googlebot spending 40% of crawl budget on filter URLs that were supposed to be noindexed but aren’t. Googlebot not visiting key product pages that were recently published because internal link architecture hasn’t surfaced them yet. Googlebot encountering consistent 503 errors during peak traffic hours, indicating server capacity issues invisible in standard performance testing.
According to Google’s crawl budget documentation, crawl budget management is a real consideration for large sites and directly affects how quickly new content gets indexed. Log file analysis is the only way to see your actual crawl budget allocation rather than estimating from GSC data.
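The filtering described above can be sketched with a regex over combined-format access logs (the Apache/Nginx default). The sample lines and the `googlebot_crawl_profile` helper are illustrative; note that production pipelines should verify Googlebot by reverse DNS, since the user-agent string is trivially spoofed:

```python
import re
from collections import Counter

# Apache/Nginx combined log format: ip ident user [time] "req" status bytes "ref" "ua"
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+) [^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"'
)

def googlebot_crawl_profile(log_lines):
    """Count Googlebot hits per URL and per status code."""
    urls, statuses = Counter(), Counter()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group(3):   # group 3 = user agent
            urls[m.group(1)] += 1             # group 1 = request path
            statuses[m.group(2)] += 1         # group 2 = status code
    return urls, statuses

sample = [
    '66.249.66.1 - - [10/May/2024:06:25:01 +0000] "GET /products/?color=red HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [10/May/2024:06:25:09 +0000] "GET /products/?color=red HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [10/May/2024:06:25:14 +0000] "GET /products/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

urls, statuses = googlebot_crawl_profile(sample)
```

Sort `urls` by count and compare the top entries against the pages you actually want crawled — heavy crawl activity on parameter or filter URLs is the crawl-budget waste this section describes.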
How to Run This Technical SEO Audit Checklist
The tools for a complete technical SEO audit checklist execution:
- Screaming Frog SEO Spider: Crawl all URL-level checks (title tags, meta, H1, canonicals, redirects, response codes, internal links)
- Google Search Console: Coverage, Core Web Vitals (CrUX field data), Manual Actions, Enhancements (schema)
- PageSpeed Insights: CWV lab and field data per URL
- Ahrefs or Semrush Site Audit: Broken links, orphaned pages, redirect chains
- Google Rich Results Test: Schema validation per page
- Chrome DevTools Lighthouse: Accessibility, performance, best practices per page
According to Moz’s Technical SEO Guide, systematic technical auditing is the foundation that separates high-performing sites from those left behind in competitive SERPs. For large sites (10,000+ pages), a full crawl needs to be run with JavaScript rendering enabled in Screaming Frog to catch issues with React/Vue/Angular frameworks. Many technical SEO issues on modern JavaScript-heavy sites only appear in rendered crawls.
According to Google’s JavaScript SEO documentation, Googlebot renders JavaScript but on a delayed basis — meaning JS-rendered content may lag in indexation compared to server-rendered content. This is a systemic risk for any site built on client-side rendering.
Prioritizing Technical SEO Audit Findings
Not all findings are equal. After completing this technical SEO audit checklist, prioritize by:
- Indexation blockers: Anything preventing pages from being crawled or indexed (robots.txt errors, noindex on key pages, canonicalization loops)
- Core Web Vitals failures: CWV below threshold are ranking-factor issues with direct fix paths
- Structured data errors: Rich result eligibility is binary — errors mean no eligibility
- Redirect chains: Link equity loss, user experience degradation
- Internal link gaps: Orphaned pages, poor link equity distribution
- On-page technical elements: Duplicate titles, missing meta descriptions, image optimization
If you need a professional technical SEO audit with implementation support, our SEO audit service covers all 80 of these points plus entity optimization, content gap analysis, and a prioritized implementation roadmap. For site-specific assessment, the fastest path is our qualification form.
The difference between a site that dominates its competitive space and one that underperforms despite good content is almost always technical. Get the infrastructure right first — everything else compounds from there.
Ready to Dominate AI Search Results?
Over The Top SEO has helped 2,000+ clients generate $89M+ in revenue through search. Let’s build your AI visibility strategy.
Frequently Asked Questions
How often should I run a technical SEO audit?
Minimum quarterly for active sites, monthly for large or frequently updated sites (e-commerce, news). Always run a full technical audit after major site changes: CMS migrations, redesigns, URL structure changes, significant template changes. Set up continuous monitoring with Google Search Console alerts for critical issues (coverage errors, Core Web Vitals failures, manual actions) between full audits.
What are the most critical technical SEO issues to fix first?
Indexation blockers come first: robots.txt errors, noindex on important pages, canonical misconfigurations. Second priority: Core Web Vitals failures (direct ranking factor). Third: structured data errors affecting rich result eligibility. After those three, the priority order depends on your site type — e-commerce sites benefit most from product schema optimization and crawl efficiency; content sites from internal link architecture and canonical management.
What tools do I need to run a technical SEO audit?
Essential: Screaming Frog SEO Spider (free tier covers up to 500 URLs, paid covers full sites), Google Search Console (free, essential for GSC-specific data), PageSpeed Insights (free). For comprehensive audits: Ahrefs or Semrush for backlink and on-page data, Chrome DevTools for rendering and performance debugging. Enterprise options: Botify, Lumar (DeepCrawl), or ContentKing for large-scale crawling and continuous monitoring.
Does technical SEO matter if my content is high quality?
Yes — always. High-quality content on a technically broken site fails to rank because Google cannot efficiently crawl, understand, or index it. I’ve seen excellent content buried on page 5 due to canonical errors, and mediocre content on technically clean sites outranking it. Technical SEO is the foundation — content quality is the advantage that compounds once the foundation is solid. Both matter; technical comes first.
What is the difference between a technical SEO audit and an SEO audit?
A technical SEO audit focuses exclusively on the infrastructure layer: crawlability, indexation, site speed, Core Web Vitals, structured data, URL structure, redirects, and server configuration. A full SEO audit includes technical SEO plus content analysis (keyword targeting, topical authority, content quality), backlink profile analysis, competitor gap analysis, and entity optimization. Technical SEO audit is a component of a comprehensive SEO audit — but it is often the most actionable starting point because technical issues have direct, fixable causes.

