
@CameronWhiteside (Contributor)

Summary

This PR introduces documentation for the AI Crawl Control Robots.txt feature. It adds a new changelog entry and a dedicated feature page (track-robots-txt.mdx), and updates the main AI Crawl Control index and the analyze-ai-traffic.mdx page to cover the new functionality and refresh existing metrics.

The new Robots.txt tab allows users to:

  • Monitor robots.txt file availability and health status (e.g., 200 OK, 404 Not Found); a sketch of such a check appears after this list.
  • Track the total number of requests to the robots.txt file and identify high-traffic hostnames.
  • Identify crawlers violating robots.txt directives and view details like the violated path and specific directive.
  • See at a glance whether robots.txt files contain the Cloudflare-recommended Content Signals for AI usage; an illustrative snippet follows this list.
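
For a concrete picture of the items above: a robots.txt file carrying the Cloudflare-recommended Content Signals pairs its usual crawl directives with a `Content-Signal` line declaring how collected content may be used. The snippet below is a minimal illustrative sketch, not Cloudflare's actual managed file; the signal values and the `/private/` path are placeholders:

```txt
# Content Signals state how content collected from this site may be used.
Content-Signal: search=yes, ai-input=yes, ai-train=no

User-agent: *
Allow: /
Disallow: /private/
```

The availability and health check in the first bullet amounts to fetching `/robots.txt` and recording the HTTP status. Here is an outside-in sketch in Python; this is not how the dashboard gathers the metric, and `example.com` is a placeholder hostname:

```python
import urllib.error
import urllib.request

def robots_txt_status(hostname: str) -> int:
    """Fetch https://<hostname>/robots.txt and return its HTTP status code."""
    url = f"https://{hostname}/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status  # 200 OK: the file is present and healthy
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 404 Not Found: the file is missing

if __name__ == "__main__":
    for host in ["example.com", "www.example.com"]:
        print(host, robots_txt_status(host))
```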

Updates were also made to the existing Analyze AI Traffic documentation to refine and clarify the metrics available.

Documentation checklist

  • Is there a changelog entry (guidelines)? If you don't add one for something awesome and new (however small), how will our customers find out? Changelogs are automatically posted to RSS feeds, the Discord, and X.
  • The change adheres to the documentation style guide.
  • If this is a larger change, such as adding a new page, an issue has been opened for any incorrect or out-of-date information that this PR fixes.
  • Files which have changed name or location have been allocated redirects.

@CameronWhiteside CameronWhiteside merged commit fcf9128 into cloudflare:production Oct 24, 2025