Robots.txt Analyzer
Parses and validates robots.txt files. Checks for crawl issues, blocked resources, sitemap declarations, and common mistakes.
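The script's internals aren't shown here, but a minimal sketch of the core idea, fetching a site's robots.txt and pulling out its rules and sitemap declarations with the standard library's urllib.robotparser, could look like this (the function names are illustrative, not the script's actual API):

```python
# Minimal sketch only; the shipped script's internals may differ.
import requests
from urllib.robotparser import RobotFileParser

def fetch_and_parse(site_url: str) -> tuple[RobotFileParser, str]:
    """Download robots.txt for a site and load it into a parser."""
    robots_url = site_url.rstrip("/") + "/robots.txt"
    response = requests.get(robots_url, timeout=10)
    response.raise_for_status()

    parser = RobotFileParser()
    parser.parse(response.text.splitlines())
    return parser, response.text

def find_sitemaps(robots_text: str) -> list[str]:
    """Collect Sitemap: declarations (the directive is case-insensitive)."""
    return [
        line.split(":", 1)[1].strip()
        for line in robots_text.splitlines()
        if line.lower().startswith("sitemap:")
    ]

parser, text = fetch_and_parse("https://example.com")
print(find_sitemaps(text))
print(parser.can_fetch("*", "/private/"))  # False if /private/ is disallowed
```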
Install
pip install -r requirements.txt

Run

python robots_txt_analyzer.py --url https://example.com
python robots_txt_analyzer.py --file robots.txt
python robots_txt_analyzer.py --urls https://a.com https://b.com --output robots_audit.xlsx

Export
Add --output report.xlsx to save results as a spreadsheet.
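The exact spreadsheet layout isn't documented here, but writing findings to XLSX is typically a short pandas step; a hedged sketch (the column names and results structure are assumptions):

```python
# Sketch only; column names and the results structure are assumptions.
import pandas as pd

findings = [
    {"url": "https://example.com", "issue": "Disallow: / blocks all crawlers", "severity": "high"},
    {"url": "https://example.com", "issue": "No Sitemap declaration", "severity": "medium"},
]

pd.DataFrame(findings).to_excel("report.xlsx", index=False)
```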
| Flag | Description |
|---|---|
| --url | Single site URL to analyze |
| --urls | Multiple site URLs (space-separated) |
| --file | Path to a local robots.txt file |
| --test-path | Test whether a given path is blocked |
| --output | Save results as an XLSX spreadsheet |
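A hedged sketch of what the --test-path check could boil down to, using urllib.robotparser's can_fetch (the script's actual logic may differ):

```python
# Sketch of a --test-path style check; not the script's actual implementation.
from urllib.robotparser import RobotFileParser

def is_path_blocked(robots_text: str, path: str, user_agent: str = "*") -> bool:
    """Return True if the given path is disallowed for the user agent."""
    parser = RobotFileParser()
    parser.parse(robots_text.splitlines())
    return not parser.can_fetch(user_agent, path)

robots_text = "User-agent: *\nDisallow: /private/"
print(is_path_blocked(robots_text, "/private/page.html"))  # True
print(is_path_blocked(robots_text, "/blog/post.html"))     # False
```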
python robots_txt_analyzer.py --help

Run as part of a full site audit. Export issues to XLSX, prioritize by severity, and create a fix roadmap for the dev team.
Run before launching a new site or after a migration. Catch technical issues before Google crawls the new version.
Schedule regular checks and compare outputs over time. Catch regressions early.
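One way to compare outputs over time (an illustrative sketch, not part of the tool; the snapshot file names are hypothetical) is to keep dated copies of each robots.txt and diff them:

```python
# Illustrative sketch: diff two saved robots.txt snapshots to spot regressions.
import difflib
from pathlib import Path

old = Path("robots_2024-01-01.txt").read_text().splitlines()
new = Path("robots_2024-02-01.txt").read_text().splitlines()

for line in difflib.unified_diff(old, new, lineterm=""):
    print(line)
```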
Combine with other tools in the pack for a complete workflow.
Requires: pandas, requests. Both included in requirements.txt.
Get all 154 Python SEO tools — $49
One-time payment. Lifetime access. No monthly fees.
Learn 25 tools and get 25% back. Earn from client work and get 50% back.
AAIO Inc — aaioinc.com/tools/robots_txt_analyzer/