I'd like to be able to see the requests made by Cursor in an observability tool, for example Helicone.ai.
Ideally, that means all requests to the model would go through Helicone instead of being sent directly to the LLM provider.
That can be achieved by overriding the host URL and adding a few additional headers, as described in Helicone's gateway docs:
https://docs.helicone.ai/getting-started/integration-method/gateway
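
To illustrate the mechanism, here is a minimal sketch using the OpenAI Python SDK: point the client's base URL at Helicone's proxy and attach a `Helicone-Auth` header. The base URL and header name below are taken from the Helicone docs as I understand them; the model name and key values are placeholders, so double-check against the link above.

```python
import os
from openai import OpenAI

# Route requests through Helicone instead of sending them directly to the provider.
# Base URL and header name follow Helicone's proxy/gateway docs; keys are placeholders.
client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://oai.helicone.ai/v1",  # Helicone proxy in front of OpenAI
    default_headers={
        "Helicone-Auth": f"Bearer {os.environ['HELICONE_API_KEY']}",
    },
)

# A normal completion call; the request should now appear in the Helicone dashboard.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello via Helicone"}],
)
print(response.choices[0].message.content)
```

For Cursor the idea would be the same: the existing base URL override plus some way to attach those extra headers to its requests, which is essentially what this request is asking for.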