TL;DR
- Allow essential content, block admin and system paths.
- Always include your sitemap location.
- Avoid blocking CSS/JS needed for rendering.
What robots.txt Does (and Doesn’t)
robots.txt is a crawler directive file, not a security tool. It tells well-behaved bots which paths to skip, but it doesn’t prevent access by humans or malicious crawlers.
Safe Rules for WordPress
Use the robots.txt Generator to build a safe baseline.
- Disallow /wp-admin/ and allow /wp-admin/admin-ajax.php.
- Allow /wp-content/uploads/.
- Add your sitemap URL.
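Put together, those rules form a minimal baseline like the one below (example.com is a placeholder; swap in your site’s real sitemap URL):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/

Sitemap: https://example.com/sitemap.xml
```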
Common Mistakes to Avoid
- Blocking /wp-content/ entirely.
- Forgetting to update the sitemap URL after migrations.
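The first mistake is easy to demonstrate with Python's standard-library robots.txt parser: a blanket Disallow on /wp-content/ also catches uploads and theme assets, including the CSS and JS crawlers need for rendering. (example.com and the file paths below are illustrative, not from a real site.)

```python
from urllib.robotparser import RobotFileParser

# The mistaken rule: blocking all of /wp-content/.
rules = """\
User-agent: *
Disallow: /wp-content/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Uploads and theme stylesheets are caught by the same broad rule:
print(parser.can_fetch("*", "https://example.com/wp-content/uploads/photo.jpg"))     # False
print(parser.can_fetch("*", "https://example.com/wp-content/themes/site/style.css")) # False
```

Scoping the rule to specific subdirectories (or allowing /wp-content/uploads/ explicitly) avoids this.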
- Using robots.txt to hide sensitive content (use authentication instead).
Frequently Asked Questions
Does robots.txt improve rankings?
It helps guide crawlers, but it doesn’t directly boost rankings. Use it to cut wasted crawl budget on low-value URLs.
Should I block wp-admin?
Yes, but allow admin-ajax.php so front-end features still work.
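You can sanity-check this behavior with Python's standard-library robots.txt parser. Note that this parser applies rules in file order (unlike Google's longest-match rule), so the Allow line is listed first in this sketch; example.com is a placeholder domain.

```python
from urllib.robotparser import RobotFileParser

# Minimal ruleset from the answer above. Python's parser matches rules in
# file order, so the narrow Allow precedes the broader Disallow.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The admin area is blocked, but admin-ajax.php stays fetchable:
print(parser.can_fetch("*", "https://example.com/wp-admin/"))               # False
print(parser.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php")) # True
```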
Can robots.txt hide private content?
No. Use authentication or remove content instead.
Key Takeaways
- Use robots.txt to guide crawlers, not to secure content.
- Always include your sitemap.
- Don’t block assets needed for rendering.
Generate a clean robots.txt
Use the robots.txt Generator to build a safe file quickly.