AI Discovery
Keep the owner workflow simple: start with the Workspace-generated production snippet, let Rover publish the minimized seed/presence as the visible Rover cue, then add the well-known artifacts only if you want stronger machine-readable discovery beyond the runtime.
Use the Workspace-generated install path first
Install the production snippet from Workspace just before the closing `</body>` tag.
Publish the generated well-known discovery files
If you want more than the snippet, publish the Workspace-generated `rover-site.json` and `agent-card.json` files at:
/.well-known/rover-site.json
/.well-known/agent-card.json

`rover-site.json` is the authoritative Rover-native capability profile. `agent-card.json` is the broad interop card that generic agents and `service-desc` pointers can consume. Both stay generated from Workspace instead of asking site owners to invent schema by hand.
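For illustration only — the real schema is generated by Workspace, and every field name below is hypothetical — an interop card is a small JSON document that tells a visiting agent what the site is and what it exposes:

```json
{
  "name": "Example Site",
  "description": "Short, agent-readable summary of what this site offers.",
  "url": "https://example.com",
  "capabilities": []
}
```

Treat this as a sketch of the shape, not the contract; always publish the files Workspace emits rather than hand-authoring them.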
Add `service-desc` only if you control head tags or headers
Workspace also generates pointer surfaces that tell generic agents where the interop card lives. Use one of these if you control HTML head output or server/CDN headers:
HTTP `Link` header
```http
Link: </.well-known/agent-card.json>; rel="service-desc"; type="application/json"
```

This belongs in app server, CDN, or reverse-proxy header config.
HTML head tag
```html
<link rel="service-desc" href="/.well-known/agent-card.json" type="application/json" />
```

This belongs in `<head>`, not in the Rover boot snippet.
Treat `llms.txt` as a supplement
Workspace can generate an `llms.txt` addendum, but Rover should already be discoverable from the production snippet plus the well-known card/site files. Use it only if you specifically want that extra read-only machine-readable surface.
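For context, the `llms.txt` proposal is plain Markdown: an H1 title, a blockquote summary, then sections of annotated links. A hypothetical addendum pointing at the well-known files (names and URL are placeholders, not Workspace output) might look like:

```markdown
# Example Site

> Short summary of the site for LLM and agent consumers.

## Discovery

- [Agent card](https://example.com/.well-known/agent-card.json): broad interop card
- [Rover site profile](https://example.com/.well-known/rover-site.json): Rover-native capability profile
```

Since Workspace generates the addendum for you, this is only a reference for what the extra surface contains.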