Technical SEO Audit Checklist: The Complete 2026 Guide
Spent a week debugging why a client's rankings tanked. Turned out to be a single line in robots.txt. Here's the checklist I use now.

Last year I spent an entire week debugging why a client's rankings had cratered. Traffic down 40%. Revenue following. The usual panic.
Checked the content. Fine. Backlinks? Still there. No manual penalty in Search Console. I was pulling my hair out.
Turned out to be one line in robots.txt: Disallow: /
Someone on their team had pushed it during a staging migration and forgotten to remove it. A single directive. Blocked their entire site from Google for two weeks.
That's the thing about technical SEO—it's usually something stupid. A missing canonical tag. A redirect loop. JavaScript that breaks on mobile. Small stuff that compounds until your rankings are underwater.
I built this checklist after that disaster. Every audit starts here now.
Why technical audits matter more than you think
Look, technical SEO isn't glamorous. Nobody's writing LinkedIn posts about their XML sitemap structure. But it's the foundation everything else sits on.
You can write incredible content. Build backlinks from every major publication. Optimize every meta description. But if search engines can't crawl your site? Can't index your pages? None of that matters.
Here's what really drives this home: research from Google shows that when page load time goes from one to three seconds, bounce probability jumps 32%. That's not a small thing. That's a third of your visitors leaving before they even see what you're offering.
And it gets worse. Ahrefs ran a massive study on link rot and found that 66.5% of backlinks eventually become dead—many because of technical issues on the destination site. All those backlinks you worked hard to build? They're decaying. Every broken redirect, every 404 page, every server timeout is costing you link equity you'll never get back.
I've seen sites lose years of domain authority because someone forgot to set up redirects after a URL restructure. It's painful to watch.
The good news: most technical problems are fixable once you actually know they exist. That's the whole point of auditing.
The checklist (how I actually do this)
Seven categories. I work through them roughly in order because each one builds on the previous. But if you know something specific is broken, jump to that section.
1. Crawlability: can Google even see your site?
This is where I always start because nothing else matters if search engines can't access your pages. You could have the best content in the world sitting behind a misconfigured server.
The robots.txt check (yes, this one's personal)
After that disaster I mentioned, I'm borderline paranoid about robots.txt. Here's what I check:
Visit yoursite.com/robots.txt and make sure it exists. Then read it carefully. Look for any Disallow rules that might be blocking important pages. Check that your XML sitemap URL is referenced. And critically—validate any changes before deploying them to production. Note that Search Console's old robots.txt Tester was retired in late 2023; the robots.txt report (under Settings) now shows what Google last fetched and when, so check it after any change goes live.
I've seen so many variations of robots.txt problems. Staging environments that accidentally block everything. Leftover rules from migrations. Typos that block entire subdirectories. It's always something.
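One way to make this check repeatable: Python's standard library ships a robots.txt parser, so you can feed it a proposed file and a list of must-index URLs before anything deploys. This is a minimal sketch; the `blocked_pages` helper, URLs, and rules are mine, not from any particular tool.

```python
# Sketch: parse a robots.txt body and flag key pages it would block
# for Googlebot. The helper name, URLs, and rules are hypothetical.
from urllib.robotparser import RobotFileParser

def blocked_pages(robots_txt: str, urls: list[str], agent: str = "Googlebot") -> list[str]:
    """Return the subset of urls that this robots.txt would block."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [u for u in urls if not parser.can_fetch(agent, u)]

# The classic staging leftover: a blanket Disallow.
rules = "User-agent: *\nDisallow: /"
pages = ["https://example.com/", "https://example.com/blog/seo-guide"]
print(blocked_pages(rules, pages))  # every page comes back blocked
```

Wire a check like this into CI and the staging-leftover disaster becomes a failed build instead of a two-week outage.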
Testing actual crawl access
Use Google Search Console's URL Inspection tool on your key pages. Don't just check that Googlebot can access them—verify it can actually render them. JavaScript-heavy sites sometimes pass the access test but fail rendering.
Look specifically for "Crawled - currently not indexed" warnings. That usually means Google found the page but decided not to include it for some reason. Could be thin content, could be a technical issue, could be that Google just doesn't think it's important enough. Worth investigating either way.
I usually test 5-10 of my most important pages manually, then let the automated tools catch the rest.
Common crawl errors
Server errors (5xx codes) are the worst because they completely block crawling. DNS errors prevent Google from even reaching your server. Pages timing out during crawl mean Google gives up partway through. And redirect chains longer than 3 hops often don't get followed completely—each hop adds latency and increases the chance something breaks.
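Redirect chains are easy to audit once you have a crawl export. Here's a rough sketch that walks a redirect map and flags chains past the 3-hop mark or outright loops; in a real audit the map would come from your crawler, and the `redirect_chain` helper is my own illustration.

```python
# Sketch: follow a redirect map and surface long chains or loops.
# The map below is hypothetical; a crawler export would feed the real one.
def redirect_chain(redirects: dict[str, str], url: str, max_hops: int = 10) -> list[str]:
    """Return the full path from url to its final destination."""
    path = [url]
    seen = {url}
    while path[-1] in redirects:
        nxt = redirects[path[-1]]
        if nxt in seen or len(path) > max_hops:
            raise ValueError(f"redirect loop or runaway chain at {nxt}")
        path.append(nxt)
        seen.add(nxt)
    return path

redirects = {"/old": "/interim", "/interim": "/newer", "/newer": "/final"}
chain = redirect_chain(redirects, "/old")
print(len(chain) - 1)  # 3 hops: right at the limit where Google may give up
```

Anything over a couple of hops, repoint the first URL straight at the final destination.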
Internal linking structure
Every important page should have at least 2-3 internal links pointing to it. How backlinks work applies to internal links too—they pass authority and help search engines discover pages.
Find and fix orphan pages (pages with no internal links). Make sure your main navigation is in HTML that Googlebot can read. Yes, Googlebot handles JavaScript better than it used to, but why risk it?
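Finding orphans is just set arithmetic over your crawl data. A sketch, assuming you can export a page list and an internal-link edge list from your crawler (the page names and the `link_report` helper here are illustrative):

```python
# Sketch: from a page list and internal-link graph, find orphans
# (zero inbound links) and pages under the 2-3 inbound-link threshold.
from collections import Counter

def link_report(pages: set[str], links: list[tuple[str, str]]) -> tuple[set[str], set[str]]:
    """Return (orphans, under_linked) based on inbound internal links."""
    inbound = Counter(dst for _, dst in links)
    orphans = {p for p in pages if inbound[p] == 0}
    under_linked = {p for p in pages if 0 < inbound[p] < 2}
    return orphans, under_linked

pages = {"/", "/blog", "/blog/guide", "/landing"}
links = [("/", "/blog"), ("/blog", "/blog/guide"), ("/", "/blog/guide"), ("/blog", "/")]
orphans, weak = link_report(pages, links)
print(orphans)  # {'/landing'}: no internal links point here
```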
2. Indexability: crawled doesn't mean indexed
This trips people up all the time. Google crawling your page and Google indexing your page are two different things. Plenty of sites get crawled but never make it into the index.
Check what's actually indexed
Run site:yoursite.com in Google. That number is roughly how many pages Google has indexed from your site. Compare it to your total page count. If there's a 30% or larger gap, something's wrong.
I had a client once with 5,000 pages on their site and only 800 indexed. Turned out they had a JavaScript-based navigation that Googlebot wasn't executing properly. Fixed the nav, indexed pages jumped to 4,500 within a month.
Hunt for noindex tags
Search your codebase for noindex. In bash: grep -r "noindex" /path/to/site. Check your HTTP headers too—X-Robots-Tag can sneak in there. Confirm your important pages don't have accidental noindex tags.
This happens more than you'd think. Someone adds noindex during development and forgets to remove it. A plugin adds it automatically based on some setting nobody remembers enabling. Content management systems sometimes have weird defaults.
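Beyond grepping source files, you can check rendered pages and their HTTP headers together. A sketch (the `is_noindexed` helper is mine, and the regex assumes the `name` attribute comes before `content`, which covers most templates but not all):

```python
# Sketch: flag a noindex directive in either a page's meta robots tag
# or its X-Robots-Tag HTTP header. HTML and headers are hypothetical.
import re

def is_noindexed(html: str, headers: dict[str, str]) -> bool:
    """True if the page carries noindex in meta robots or headers."""
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    # Assumes name= precedes content=, the common attribute order.
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    return bool(meta and "noindex" in meta.group(1).lower())

leftover = '<meta name="robots" content="noindex, nofollow">'
print(is_noindexed(leftover, {}))  # True: a dev-time tag that shipped
print(is_noindexed("<p>hi</p>", {"X-Robots-Tag": "noindex"}))  # True
```

The header case is the one people forget; a CDN rule or server config can noindex pages whose HTML looks perfectly clean.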
Fixing indexing blockers
In Search Console, look for pages marked "Discovered - currently not indexed." These are pages Google knows about but hasn't bothered to index. Usually means Google doesn't think they're valuable enough—could be thin content, could be that they're too similar to other pages, could be they're just buried too deep in the site.
Add internal links to pages you want indexed. Improve thin content or consolidate duplicate pages. Sometimes merging three mediocre pages into one good page gets it indexed when none of the three would alone.
Canonical tags
Every page needs a canonical tag pointing to itself (or to the preferred version if there are duplicates). Make sure they point to the right version—I've seen sites where every page canonicalized to the homepage. That's... not ideal.
Watch out for canonical chains where A points to B and B points to C. Google might not follow the full chain.
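Both failure modes (chains, and every page canonicalizing to the homepage) fall out of one pass over a page-to-canonical map. A sketch under the assumption you can export that map from a crawl; the `canonical_issues` helper and thresholds are my own:

```python
# Sketch: flag canonical chains (A -> B -> C) and the
# everything-points-to-homepage template bug. Map is hypothetical.
def canonical_issues(canonicals: dict[str, str]) -> list[str]:
    issues = []
    for page, target in canonicals.items():
        if target != page and canonicals.get(target, target) != target:
            issues.append(f"chain: {page} -> {target} -> {canonicals[target]}")
    targets = [t for p, t in canonicals.items() if t != p]
    if targets and len(set(targets)) == 1 and len(targets) > 2:
        issues.append(f"many pages canonicalize to {targets[0]} (template bug?)")
    return issues

canonicals = {"/a": "/b", "/b": "/c", "/c": "/c"}
print(canonical_issues(canonicals))  # flags the /a -> /b -> /c chain
```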
3. Core Web Vitals and page speed
Speed is both a ranking factor and a user experience factor. Slow sites lose on both fronts. And with Google's emphasis on user experience signals, this matters more than ever.
Getting started with speed testing
Run your homepage through PageSpeed Insights. Then test 5-10 of your most important landing pages. Test mobile and desktop separately—mobile almost always performs worse and that's what matters more.
Target scores of 90+, but don't obsess over the last few points. Going from 85 to 95 probably won't move the needle on rankings. Going from 45 to 75 absolutely will.
Understanding Core Web Vitals in 2026
Here's something important that a lot of older guides miss: INP (Interaction to Next Paint) fully replaced FID back in September 2024. If you're still seeing FID mentioned in tools or documentation, that's outdated. INP is what matters now.
The three Core Web Vitals you need to know:
LCP (Largest Contentful Paint) measures how long it takes for the main content to appear. Target: under 2.5 seconds. This is usually about image optimization, server response time, and render-blocking resources.
INP (Interaction to Next Paint) measures responsiveness—how quickly your site responds when someone clicks or taps something. Target: under 200ms. This replaced FID because it measures the entire interaction lifecycle, not just the first input. Heavy JavaScript is usually the culprit when INP is bad.
CLS (Cumulative Layout Shift) measures visual stability—how much stuff jumps around while the page loads. Target: under 0.1. Those annoying pages where you go to click something and it moves at the last second? That's CLS. Usually caused by images without dimensions, ads loading late, or fonts swapping.
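The thresholds above can be encoded directly, which is handy when you're pulling field data in bulk and want to bucket pages the way Google does. The "needs improvement" upper bounds follow Google's published bands (LCP 4s, INP 500ms, CLS 0.25); the `rate` helper is my own naming.

```python
# Sketch: classify metrics against the Core Web Vitals bands quoted
# above. GOOD holds the "good" ceilings, POOR the "poor" floors.
GOOD = {"lcp_s": 2.5, "inp_ms": 200, "cls": 0.1}
POOR = {"lcp_s": 4.0, "inp_ms": 500, "cls": 0.25}

def rate(metric: str, value: float) -> str:
    if value <= GOOD[metric]:
        return "good"
    return "needs improvement" if value <= POOR[metric] else "poor"

print(rate("lcp_s", 2.1))   # good
print(rate("inp_ms", 350))  # needs improvement
print(rate("cls", 0.3))     # poor
```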
Check the Core Web Vitals report in Search Console for field data from real users. Lab data (like PageSpeed Insights) is useful for debugging, but field data tells you what actual visitors experience.
Quick wins for speed
Compress images and serve them in WebP format. Enable browser caching so returning visitors load faster. Minify CSS and JavaScript. Enable Gzip or Brotli compression on your server. Lazy load images below the fold. Use a CDN for static assets.
These are table stakes in 2026. If you're not doing all of these, start there before worrying about anything else.
For the overachievers
Once you've got the basics, you can get into server response time (TTFB under 200ms is the goal), HTTP/2 or HTTP/3 implementation, auditing third-party scripts (analytics, chat widgets, ad pixels—they add up fast), deferring non-critical JavaScript, and preloading critical resources.
4. Mobile optimization: your mobile site IS your site
Google completed the mobile-first indexing transition in July 2024. This isn't a future thing anymore—it's done. Google primarily uses the mobile version of your site for indexing and ranking. Desktop is secondary.
This means if something works on desktop but breaks on mobile, it might as well not work at all.
Testing mobile experience
Google retired the standalone Mobile-Friendly Test and Search Console's Mobile Usability report back in late 2023, so lean on Lighthouse's mobile emulation and the mobile scores in PageSpeed Insights instead. But also—and this is important—actually test on real phones. Not just Chrome DevTools mobile simulation. Real phones. Borrow a friend's Android if you only have an iPhone. The experience can be surprisingly different.
What to look for
Is text readable without zooming? Are tap targets big enough? (48x48 pixels minimum—this trips up a lot of sites with small links in footers.) Is there horizontal scrolling? Do forms work properly on mobile? Is the navigation accessible?
I worked on a site once where the hamburger menu icon was only 24x24 pixels. Technically visible, practically untappable on some phones. Mobile traffic converted at half the rate of desktop for months before anyone figured out why.
Mobile performance considerations
Test mobile page speed separately. Mobile connections are often slower and less stable than desktop. Optimize images specifically for mobile dimensions. Consider that mobile users might be on cellular data with per-megabyte costs.
Check mobile-specific Core Web Vitals. INP in particular tends to be worse on mobile because phone processors are slower than desktop.
5. Security: HTTPS is non-negotiable
HTTPS has been a ranking signal since 2014. Users don't trust sites without the padlock. Browsers actively warn people away from HTTP sites. This isn't even a question anymore—if your site isn't on HTTPS, fix that before anything else.
HTTPS basics
Your entire site should be on HTTPS. Not just the checkout page, not just the login page, everything. Check for mixed content warnings (HTTP resources loaded on HTTPS pages). All internal links should use HTTPS. Canonical tags should point to HTTPS versions. Set up 301 redirects from HTTP to HTTPS for all pages.
SSL certificate health
Confirm your certificate is valid and not expired. I've seen sites go down because someone forgot to renew the certificate. Set calendar reminders 30 days before expiration.
Make sure the certificate is trusted (not self-signed). Verify it covers all subdomains you use. Test your SSL configuration with SSL Labs—aim for an A rating.
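Those 30-day reminders can be automated with nothing but the standard library: `ssl.cert_time_to_seconds` parses the `notAfter` date a certificate reports. A sketch with a hypothetical date and a fixed "today" so the arithmetic is visible:

```python
# Sketch: days until a certificate's notAfter date, for a monitor that
# warns 30 days out. Dates below are hypothetical.
import ssl, time

def days_until_expiry(not_after, now=None):
    """not_after uses OpenSSL's format, e.g. 'Jun  1 12:00:00 2026 GMT'."""
    expires = ssl.cert_time_to_seconds(not_after)
    return int((expires - (now if now is not None else time.time())) // 86400)

# Pin "today" to May 2, 2026 so the example is deterministic.
today = ssl.cert_time_to_seconds("May  2 12:00:00 2026 GMT")
print(days_until_expiry("Jun  1 12:00:00 2026 GMT", now=today))  # 30
```

In production you'd pull `notAfter` from the live certificate (for example via `ssl.SSLSocket.getpeercert()`) and alert when the count drops below 30.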
Security headers (bonus points)
If you want to go further, implement HSTS to force HTTPS connections. Set up a Content Security Policy (CSP) to prevent XSS attacks. Add X-Frame-Options to prevent clickjacking. Configure X-Content-Type-Options to prevent MIME sniffing.
These don't directly affect rankings, but they do affect user trust and could prevent security incidents that indirectly hurt your SEO.
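Auditing these across a site is a header-presence check. A sketch (the `missing_headers` helper and the sample response are mine; real values come from your HTTP client of choice):

```python
# Sketch: report which hardening headers from the list above are
# absent from a response. The sample response is hypothetical.
REQUIRED = {
    "Strict-Transport-Security": "HSTS, forces HTTPS",
    "Content-Security-Policy": "CSP, mitigates XSS",
    "X-Frame-Options": "blocks clickjacking",
    "X-Content-Type-Options": "blocks MIME sniffing",
}

def missing_headers(headers: dict[str, str]) -> list[str]:
    present = {k.lower() for k in headers}  # header names are case-insensitive
    return [f"{name} ({why})" for name, why in REQUIRED.items()
            if name.lower() not in present]

resp = {"Strict-Transport-Security": "max-age=31536000",
        "X-Content-Type-Options": "nosniff"}
print(missing_headers(resp))  # CSP and X-Frame-Options are missing
```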
6. Structured data: help Google understand your content
Schema markup helps search engines understand what your content is about. It can also get you rich results in search—those fancy snippets with stars, FAQs, images, and other enhanced features that stand out in the SERP.
Testing what you have
Use Google's Rich Results Test to see what structured data Google can read on your pages. Check the Enhancements section in Search Console for errors. Validate your JSON-LD syntax.
Schema types worth implementing
Organization schema on your homepage establishes who you are. Article schema on blog posts helps Google understand your content. BreadcrumbList schema shows your site hierarchy. FAQ schema can get you those expandable FAQ sections in search results. Product schema is essential if you sell anything. LocalBusiness schema matters if you have physical locations.
The Schema.org documentation is the authoritative source, but it's dense. Google's developer guides for structured data are more practical.
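To make the Article case concrete, here's a minimal JSON-LD payload plus a sanity check for the fields Google's Article documentation covers. The field list, values, and `validate_article` helper are illustrative, not an official validator; use the Rich Results Test for the real verdict.

```python
# Sketch: emit a minimal Article JSON-LD block and check for the
# fields we expect. All values here are hypothetical.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Technical SEO Audit Checklist",
    "datePublished": "2026-01-15",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

def validate_article(data: dict) -> list[str]:
    """Return any missing keys from our expected Article field list."""
    wanted = ["@context", "@type", "headline", "datePublished", "author"]
    return [k for k in wanted if k not in data]

snippet = f'<script type="application/ld+json">{json.dumps(article)}</script>'
print(validate_article(article))  # []: nothing missing
```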
Important: structured data deprecations
Google has deprecated several structured data types in 2025-2026. If you're using any of these, they won't generate rich results anymore:
- Practice problems and course info
- Estimated salary
- Learning video
- Special announcement
- Vehicle listing
- Dataset
Check Google's developer changelog for the latest changes. This stuff shifts more often than you'd think.
7. Site architecture: how everything fits together
How your site is organized affects crawlability, user experience, and how Google understands your content hierarchy. Bad architecture compounds every other problem.
URL structure
Keep URLs clean and readable. /blog/technical-seo-guide is better than /page?id=12847. Keep them under 100 characters. Use hyphens to separate words, not underscores. Avoid unnecessary parameters and session IDs. Make sure URL structure reflects site hierarchy.
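Those rules lint nicely. A sketch with `urllib.parse`; the `url_problems` helper, the session-parameter list, and the example URLs are my own and deliberately not exhaustive:

```python
# Sketch: lint URLs against the rules above (length, hyphens not
# underscores, no session parameters). Illustrative, not exhaustive.
from urllib.parse import urlparse, parse_qs

def url_problems(url: str) -> list[str]:
    parsed = urlparse(url)
    problems = []
    if len(url) > 100:
        problems.append("over 100 characters")
    if "_" in parsed.path:
        problems.append("underscores in path (use hyphens)")
    params = parse_qs(parsed.query)
    if any(k.lower() in {"sessionid", "sid", "phpsessid"} for k in params):
        problems.append("session ID in query string")
    return problems

print(url_problems("https://example.com/blog/technical-seo-guide"))  # []
print(url_problems("https://example.com/blog/seo_guide?sid=abc123"))
```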
XML sitemap
Create one and submit it to Search Console. Include all important pages. Exclude noindex pages and redirects—Google doesn't need a list of pages you don't want indexed. Keep it under 50MB and 50,000 URLs (split into multiple sitemaps if larger). Update it when you add or remove pages.
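The sitemap format itself is tiny. Your CMS or crawler normally generates it, but a sketch helps show there's no magic; the URLs and dates below are hypothetical, and the structure follows the sitemaps.org 0.9 schema:

```python
# Sketch: build a minimal XML sitemap with the standard library.
# Real sitemaps come from your CMS, but the structure is exactly this.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls: list[tuple[str, str]]) -> str:
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([("https://example.com/", "2026-01-15")])
print("<loc>https://example.com/</loc>" in xml)  # True
```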
Navigation and site depth
Main navigation should be in HTML, not JavaScript-only. Implement breadcrumbs for better hierarchy signals. Keep important pages within 3 clicks of the homepage. Add internal links to key content throughout your site.
The deeper a page is buried, the less important Google thinks it is. If your best content is 8 clicks from the homepage, that's a problem.
Duplicate content
Identify pages with similar content. Set canonical tags to preferred versions. 301 redirect duplicate URLs to the canonical. Handle www vs non-www consistently. Handle trailing slashes consistently. These seem like small things but they fragment your authority across multiple URLs.
Pagination
For paginated content (like blog archives), link pages sequentially and give each page a self-referencing canonical. Note that Google stopped using rel="next" and rel="prev" as an indexing signal back in 2019, so don't rely on those attributes alone. A "View All" page with canonical tags pointing to it is another option. Don't noindex paginated pages—they're legitimate content.
New Google Search Console features (late 2025)
GSC added some useful features late in 2025 that make auditing easier. If you haven't poked around the interface recently, check these out:
Branded queries filter (November 2025) lets you separate brand traffic from non-brand in Performance reports. Finally. This has been requested for years. Now you can see how your non-branded rankings are actually doing without your brand name inflating the numbers.
Custom chart annotations (November 2025) let you mark algorithm updates, site changes, and other events directly on your performance charts. Makes it way easier to correlate traffic changes with things you did (or things Google did).
Weekly and monthly views (December 2025) provide smoother trend analysis than the day-by-day noise. Good for spotting actual trends versus random fluctuation.
For a deeper dive, check out our complete Google Search Console guide.
Tools I actually use
Not going to lie—you can do a lot with free tools if you know what you're looking for.
Free tools
Google Search Console is essential. Crawl errors, indexing issues, Core Web Vitals, security issues—it's all there. Every site should have this set up.
PageSpeed Insights tests page speed and Core Web Vitals for mobile and desktop. Lighthouse (built into Chrome DevTools) audits performance, accessibility, and SEO basics. Rich Results Test validates structured data. Screaming Frog in its free version crawls up to 500 URLs and finds broken links, duplicate content, missing tags, and a ton of other issues.
We also have a free backlink checker tool if you want to see what's linking to you.
Paid tools
Ahrefs Site Audit is my go-to for ongoing monitoring. Comprehensive checks, good prioritization, catches things I'd miss manually.
Semrush Site Audit is similar to Ahrefs with arguably better reporting.
Screaming Frog paid version gives you unlimited crawling and deep technical analysis.
Sitebulb has the best visual reporting I've seen. Good for presenting findings to clients or stakeholders who aren't technical.
Pick one paid tool. Use it monthly. Supplement with Google's free stuff.
How to prioritize fixes
You can't fix everything at once. Here's roughly how I think about prioritization.
Fix immediately
Server errors (5xx codes) blocking crawling. Site not indexed at all (check robots.txt, check server access). No HTTPS. Severe mobile usability errors. Broken homepage or key landing pages.
These are emergencies. Drop everything else until they're fixed.
Fix this week
Page speed scores under 50. Broken backlinks losing link equity. Pages accidentally noindexed. Redirect chains creating latency and losing authority. Missing or duplicate title tags.
For the backlink issues specifically, our guide on link reclamation covers how to find and fix these systematically.
Fix this month
Missing structured data opportunities. Thin content pages that need improvement or consolidation. Orphan pages needing internal links. Page speed scores in the 50-70 range. Missing image alt tags.
Ongoing maintenance
Minor speed optimizations. Additional schema types. URL cleanup. Internal linking optimization. Increasing domain authority over time.
How often to audit
Quarterly at minimum. Monthly is better.
Technical issues compound. You add pages. Update content. Change your CMS. Install new plugins. Each change can introduce new problems that slowly erode rankings.
Monthly audits catch issues before they hurt. Set up automated monitoring in Ahrefs or Semrush to alert you when new issues appear. Review those alerts weekly even if you only do full audits monthly.
Also run a full audit after:
- Site migrations or redesigns (always)
- Major platform or CMS updates
- Unexpected ranking drops
- Before big marketing campaigns
- After major algorithm updates
Speaking of things that can hurt rankings, check out our list of common SEO mistakes that kill rankings. Overlap with technical issues is significant.
The mistakes I see over and over
The robots.txt disaster. Already told you that story. Test before deploying. Always. Check Search Console's robots.txt report after changes go live. Have someone else review changes before they deploy. This one mistake can undo years of SEO work overnight.
Ignoring mobile. I still see teams obsessing over desktop PageSpeed scores while their mobile site is broken. Google indexes mobile first. Test mobile first. If you have to choose where to spend your optimization time, choose mobile.
Redirect chain hell. A redirects to B. B redirects to C. C redirects to D. Each hop adds latency. Each hop increases the chance Google gives up and doesn't follow the chain. And while Google has said 301s pass full PageRank these days, long chains are still a risk you don't need to take. Audit your redirects and point everything directly at the final destination.
Waiting too long between audits. An issue affecting 5 pages now might affect 50 pages next quarter. Technical debt compounds. The longer you wait, the more work it becomes.
Fixing the wrong stuff. Optimizing alt tags while you have server errors? You're rearranging deck chairs. Critical issues first. Get the foundation right before worrying about the details.
Technical SEO and backlinks: they're not separate
Here's something most people miss: technical SEO and link building aren't separate disciplines. They're deeply connected.
If you have broken pages with backlinks pointing to them, you're wasting link equity. If your site is slow, people won't link to it—and the few who do might remove those links when their readers complain about load times. If pages aren't indexed, backlinks to those pages do nothing.
That's why link reclamation belongs in every technical audit. Finding and fixing broken backlinks should be a regular part of your maintenance routine.
And here's where Revised fits into all this. We acquire expired domains with quality backlinks from authoritative sources—Wikipedia, Reddit, news sites, educational institutions. Then we redirect those links to your site.
But here's the thing: if your site has technical issues, those redirects might not pass full authority. Google might not crawl the redirect. The linked page might return errors. The authority transfer gets diminished or lost entirely.
Fix the foundation first. Then build on top of it. See how the whole process works.
Getting started
Print this checklist. Or bookmark it. Whatever works for you.
Start with crawlability. If Google can't access your site, nothing else matters. Then indexability. Then speed and mobile. Work your way down.
Set up monthly reminders to review your audit reports. Most technical issues are totally fixable—you just have to know they exist. The sites that rank well aren't necessarily doing anything fancy. They just don't have the technical problems holding them back.
And if you want to skip the manual work of building backlinks while you sort out your technical debt, see how Revised automates backlink building with contextual links from authoritative sources.
Your site's probably in better shape than you think. It just needs the foundation work to show it.


