About Site Audit / Crawler
Crawl your entire website and get a comprehensive technical SEO audit. Find broken links, missing meta tags, duplicate content, redirect chains, orphan pages, and 40+ technical issues.
Frequently Asked Questions
What does a site audit check?
A comprehensive site audit analyzes technical SEO (crawlability, indexability, page speed), on-page SEO (titles, meta descriptions, headings, content), link health (internal and external), mobile-friendliness, Core Web Vitals, and security issues.
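The on-page portion of those checks can be sketched in a few lines. Below is a minimal, hypothetical checker built on Python's standard `html.parser`; the issue names and the 60-character title threshold are illustrative, not the output of any particular audit tool:

```python
from html.parser import HTMLParser

class OnPageCheck(HTMLParser):
    """Collects the title, meta description, and h1 count from one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = None
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "h1":
            self.h1_count += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_page(html):
    """Return a list of on-page issues for a single HTML document."""
    checker = OnPageCheck()
    checker.feed(html)
    issues = []
    if not checker.title.strip():
        issues.append("missing title")
    elif len(checker.title) > 60:
        issues.append("title over 60 characters")
    if not checker.meta_description:
        issues.append("missing meta description")
    if checker.h1_count != 1:
        issues.append(f"expected 1 h1, found {checker.h1_count}")
    return issues
```

A full audit runs checks like these across every crawled page and aggregates the results by issue type.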
How often should I run a site audit?
Run a full audit quarterly and after major site changes (redesigns, migrations, CMS updates); after a migration or redesign, run daily audits for the first two weeks. Set up monthly monitoring for critical metrics, and audit new sites weekly during their first three months to catch issues early.
What should I fix first after a site audit?
Priority order: 1) Server errors (5xx), 2) Indexation blocks (accidental noindex/robots.txt), 3) Broken pages with backlinks, 4) Core Web Vitals failures, 5) Missing meta tags on high-traffic pages, 6) Duplicate content issues.
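That fix order is easy to encode as a triage step over a crawler's findings. This is a sketch under assumptions — the issue-type names below are illustrative labels, not any specific tool's output format:

```python
# Hypothetical severity ranking mirroring the fix order above.
FIX_PRIORITY = [
    "server_error_5xx",
    "indexation_block",
    "broken_page_with_backlinks",
    "core_web_vitals_failure",
    "missing_meta_high_traffic",
    "duplicate_content",
]

def triage(issues):
    """Sort audit findings so the highest-impact fixes come first.

    Issue types not in FIX_PRIORITY sort to the end.
    """
    rank = {name: i for i, name in enumerate(FIX_PRIORITY)}
    return sorted(issues, key=lambda issue: rank.get(issue, len(FIX_PRIORITY)))
```

Sorting the raw issue list this way turns an overwhelming report into an ordered work queue.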
How deep should I crawl my site?
For most sites, 3 levels deep covers all important pages. Large sites (10,000+ pages) should start with 2 levels for speed, then deep-crawl priority sections. Crawling 5 levels ensures comprehensive coverage but takes longer.
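Crawl depth is just the number of link hops from the start URL, which a breadth-first traversal captures naturally. Here is a minimal sketch: `get_links` is a caller-supplied fetcher (hypothetical; a real crawler would fetch and parse each page's HTML), and only same-host pages are followed:

```python
from collections import deque
from urllib.parse import urljoin, urlparse

def crawl(start_url, get_links, max_depth=3):
    """Breadth-first crawl up to max_depth link levels from start_url.

    Returns {url: depth} for every same-host page discovered.
    """
    host = urlparse(start_url).netloc
    seen = {start_url: 0}
    queue = deque([start_url])
    while queue:
        url = queue.popleft()
        depth = seen[url]
        if depth >= max_depth:
            continue  # discovered, but its links are beyond the depth limit
        for href in get_links(url):
            link = urljoin(url, href)  # resolve relative hrefs
            if urlparse(link).netloc == host and link not in seen:
                seen[link] = depth + 1
                queue.append(link)
    return seen
```

Because the traversal is breadth-first, raising `max_depth` only extends the frontier; everything found at shallower depths is identical.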
What's the difference between critical and warning issues?
Critical issues directly prevent indexing or cause significant ranking damage (broken pages, noindex on important content, missing titles). Warnings are optimization opportunities that affect performance but won't break functionality (long titles, missing alt text).
How to Use Site Audit / Crawler
1. Enter your website URL and set crawl depth (1-5 levels)
2. Configure which page types to include in the audit
3. Click 'Start Audit' to begin the comprehensive crawl
4. Monitor crawl progress in real time on the dashboard
5. Review categorized issues: Critical, Warnings, and Notices
6. Export the full audit report for implementation
Why Use Site Audit / Crawler?
Technical SEO issues are invisible but devastating — a single misconfigured robots.txt can deindex your entire site. Our Site Audit crawls your website like Googlebot, identifying broken links, missing meta tags, duplicate content, slow pages, and crawl errors that suppress your rankings. Find and fix issues before Google does.
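The robots.txt risk is easy to check for yourself. A minimal sketch using Python's standard `urllib.robotparser` (parsing rules locally; a real audit would fetch the site's `/robots.txt` first, and the URLs below are placeholders):

```python
from urllib.robotparser import RobotFileParser

def check_robots(robots_txt, urls, user_agent="Googlebot"):
    """Flag URLs the given robots.txt rules would block for a user agent."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [url for url in urls if not parser.can_fetch(user_agent, url)]

# A single stray "Disallow: /" blocks every crawler from the whole site:
blocked = check_robots(
    "User-agent: *\nDisallow: /",
    ["https://example.com/", "https://example.com/blog/post"],
)
```

Running a check like this against your key landing pages after every robots.txt change catches accidental site-wide blocks before search engines do.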
Key Features
- Multi-level site crawl (up to 5 levels deep)
- Broken link and redirect chain detection
- Duplicate title and description finder
- Missing alt text and heading issues
- Page speed and Core Web Vitals per page
- Canonical and hreflang validation
- Exportable audit report with fix priorities
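To illustrate one of the features above, the core of a duplicate-title finder is a simple grouping pass over crawled pages. A minimal sketch, assuming `pages` maps URL to the page's title string (already extracted during the crawl):

```python
from collections import Counter

def find_duplicate_titles(pages):
    """Group URLs that share a <title>, compared case-insensitively.

    Returns {normalized_title: [urls]} for titles used on more than one page.
    """
    counts = Counter(title.strip().lower() for title in pages.values())
    return {
        title: [url for url, t in pages.items() if t.strip().lower() == title]
        for title, n in counts.items() if n > 1
    }
```

The same grouping approach works for duplicate meta descriptions or duplicate h1s.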
Related Tools
Robots.txt Analyzer & Tester
Analyze and test robots.txt rules
XML Sitemap Generator
Generate XML sitemaps from your site
Schema Markup Generator (JSON-LD)
Generate JSON-LD structured data markup
Page Speed Test & Core Web Vitals
Test speed and Core Web Vitals scores
Broken Link Checker
Find broken links and 404 errors
Redirect Checker & Chain Analyzer
Trace redirect chains and status codes