feat: Improve llms.txt and add Accept: text/markdown content negotiation #805
Merged
Conversation
Cursor Bugbot has reviewed your changes and found 1 potential issue.
Expand llms.txt from a bare server address to comprehensive setup docs covering endpoint scoping (org/project), query parameters, and copy-paste setup commands for Claude Code, Cursor, and VSCode. Add Accept: text/markdown content negotiation on GET / so AI agents requesting the homepage get the same useful markdown instead of SPA HTML. Extract getBaseUrl() helper to deduplicate origin derivation across route handlers. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
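A minimal sketch of that content negotiation as a Hono route, assuming the SPA is served through a Cloudflare ASSETS binding. The helpers getBaseUrl and buildLlmsMarkdown are illustrative stand-ins, not the PR's actual code:

```ts
import { Hono } from "hono";

// Minimal bindings type for the sketch; the real project defines its own.
type Env = { ASSETS: { fetch(req: Request): Promise<Response> } };

// Illustrative stand-ins for the PR's getBaseUrl() helper and llms.txt body.
const getBaseUrl = (url: string) => new URL(url).origin;
const buildLlmsMarkdown = (base: string) =>
  `# Sentry MCP\n\nThe MCP server address is: ${base}/mcp\n`;

const app = new Hono<{ Bindings: Env }>();

app.get("/", (c) => {
  // Agents sending Accept: text/markdown get the setup docs;
  // everyone else falls through to the SPA served from static assets.
  if ((c.req.header("Accept") ?? "").includes("text/markdown")) {
    return c.text(buildLlmsMarkdown(getBaseUrl(c.req.url)), 200, {
      "Content-Type": "text/markdown; charset=utf-8",
    });
  }
  return c.env.ASSETS.fetch(c.req.raw);
});

export default app;
```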
Cloudflare's static asset serving intercepts GET / before the Hono app runs, so the Accept: text/markdown handler never fires. Move the check to the outer worker fetch handler in index.ts where it runs before asset routing. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
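A sketch of this commit's approach, assuming a standard module-worker entry point; app and buildLlmsMarkdown are the hypothetical pieces from the sketch above, re-declared here so the block stands alone:

```ts
// index.ts (sketch): intercept GET / with Accept: text/markdown before
// asset routing gets a chance to serve the SPA shell.
type Env = { ASSETS: { fetch(req: Request): Promise<Response> } };
declare const app: {
  fetch(req: Request, env: Env, ctx: ExecutionContext): Promise<Response>;
};
declare function buildLlmsMarkdown(base: string): string;

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const url = new URL(request.url);
    const accept = request.headers.get("Accept") ?? "";
    if (request.method === "GET" && url.pathname === "/" && accept.includes("text/markdown")) {
      return new Response(buildLlmsMarkdown(url.origin), {
        headers: { "Content-Type": "text/markdown; charset=utf-8" },
      });
    }
    // Everything else goes to the Hono app (and from there to asset routing).
    return app.fetch(request, env, ctx);
  },
};
```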
Cloudflare's asset serving intercepts GET / before the worker runs, preventing Accept: text/markdown content negotiation. Enable run_worker_first so the worker handles all requests first, with an explicit fallback to env.ASSETS.fetch() for unmatched non-MCP routes. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
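run_worker_first is a real Wrangler assets option; with it enabled, the explicit fallback described above could look roughly like this (a sketch reusing the hypothetical app from the earlier blocks):

```ts
// Assumes run_worker_first: true under "assets" in the wrangler config,
// so every request reaches this handler before static asset serving.
type Env = { ASSETS: { fetch(req: Request): Promise<Response> } };
declare const app: {
  fetch(req: Request, env: Env, ctx: ExecutionContext): Promise<Response>;
};

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const res = await app.fetch(request, env, ctx);
    const { pathname } = new URL(request.url);
    // Unmatched non-MCP routes fall through to the SPA's static assets;
    // MCP routes keep the worker's own response.
    if (res.status === 404 && !pathname.startsWith("/mcp")) {
      return env.ASSETS.fetch(request);
    }
    return res;
  },
};
```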
Ensure /api, /oauth, and /.mcp paths return proper error responses instead of falling through to static asset (SPA) serving. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
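A sketch of such a guard; the prefix list comes from the comment above, everything else is illustrative:

```ts
// Paths the worker owns: a 404 here should be a real error response,
// not the SPA's index.html.
const WORKER_ONLY_PREFIXES = ["/api", "/oauth", "/.mcp"];

function isWorkerOnlyPath(pathname: string): boolean {
  return WORKER_ONLY_PREFIXES.some(
    (prefix) => pathname === prefix || pathname.startsWith(`${prefix}/`),
  );
}

// Usage inside the fetch handler's fallback branch (sketch):
// if (res.status === 404 && isWorkerOnlyPath(pathname)) {
//   return Response.json({ error: "Not found" }, { status: 404 });
// }
```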
All unmatched routes should fall through to static assets since run_worker_first is only needed for homepage content negotiation. Any route the worker handles will return a non-404 response. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Content negotiation on GET / required run_worker_first, a 404 fallback to env.ASSETS.fetch(), and changes to wrangler routing — too much complexity for one endpoint. The comprehensive setup docs are already served at /llms.txt which works without any routing changes. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Force-pushed from 7353760 to cf41928
Expand llms.txt from a bare server address to comprehensive setup documentation
so AI agents can actually help users configure the MCP server correctly. Previously
the endpoint only said "The MCP's server address is: https://mcp.sentry.dev/mcp"
with no mention of org/project scoping, query parameters, or setup commands — leading
to repeated failed attempts when agents tried to help users connect.
The new content covers:

- Endpoint scoping (org- and project-level URLs)
- Query parameters (?experimental=1, ?agent=1)
- Copy-paste setup commands for Claude Code, Cursor, and VSCode

Also adds Accept: text/markdown content negotiation on GET / so AI agents requesting the homepage get the same useful markdown instead of the raw SPA HTML.

Extracts a getBaseUrl() helper to deduplicate origin derivation across route handlers.
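A plausible shape for that helper, assuming it derives the origin from the incoming request URL; the PR's actual implementation may differ:

```ts
// Derive the request origin once instead of repeating the
// URL-parsing logic in every route handler.
function getBaseUrl(requestUrl: string): string {
  return new URL(requestUrl).origin;
}

// e.g. getBaseUrl("https://mcp.sentry.dev/llms.txt") === "https://mcp.sentry.dev"
```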