feat(proxy-auth): implement optional proxy authentication with API keys #14
hiandy24 wants to merge 1 commit into hankchiutw:main
Conversation
Could you elaborate more on your use cases?
Sorry, I'm not quite clear on how your authorization implements authentication. Suppose I want to use this http_proxy with an LLM client; typically an API key is required. So this modification mainly implements authentication from the client to the http_proxy, in order to prevent client abuse and avoid theft of the Copilot API. This feature can be disabled.
I see. In that case, you may set I think the default token approach is more secure, since the real token doesn't transfer between the LLM client and http_proxy. One possible improvement is to make the dummy token
If the forwarding is running on a server instead of locally, e.g. sharing my Copilot subscription with friends or using it with some LLM clients: if I set a default token, then anyone who knows this API port can use my Copilot proxy (assuming I map it to the public network), which is not secure. If I don't set a default token, my friends or the LLM client can see my Copilot key, which may lead to its leakage.
- Added proxy-level authentication via the `PROXY_API_KEYS` env var; clients must send `Authorization: Bearer <proxy_key>` when enabled.
- Introduced `src/shared/api/ensure-proxy-auth.ts` and wired it into `/api/*` to enforce auth before proxying.
- Updated token resolution: when proxy auth is enabled, the upstream GitHub OAuth token comes from `X-Copilot-Token`; otherwise the selected token is used. `_` still means "use default".
- Reserved `Authorization` for proxy auth only (when enabled) to stay OpenAI-compatible.
- Added config constants: `PROXY_API_KEYS` and `COPILOT_TOKEN_OVERRIDE_HEADER` (`x-copilot-token`).
- Updated README with configuration, examples, and behavior notes; no behavior change when `PROXY_API_KEYS` is unset.
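For readers skimming the PR, the two pieces described above (the proxy-key check and the upstream token resolution) can be sketched roughly as follows. This is a minimal illustration, not the PR's actual code; the function names `parseProxyApiKeys`, `ensureProxyAuth`, and `resolveUpstreamToken` are hypothetical:

```typescript
// Hypothetical sketch of the proxy-auth flow; names are illustrative,
// not the PR's actual exports.

// Parse PROXY_API_KEYS as a comma-separated list of accepted proxy keys.
function parseProxyApiKeys(raw: string | undefined): Set<string> {
  return new Set(
    (raw ?? "")
      .split(",")
      .map((k) => k.trim())
      .filter((k) => k.length > 0),
  );
}

// Proxy auth: when no keys are configured, auth is disabled and every
// request passes; otherwise require "Authorization: Bearer <proxy_key>".
function ensureProxyAuth(
  authHeader: string | undefined,
  keys: Set<string>,
): boolean {
  if (keys.size === 0) return true;
  if (!authHeader || !authHeader.startsWith("Bearer ")) return false;
  return keys.has(authHeader.slice("Bearer ".length));
}

// Upstream token resolution: with proxy auth enabled, the GitHub OAuth
// token may arrive in X-Copilot-Token; "_" (or absence) keeps the default.
function resolveUpstreamToken(
  copilotTokenHeader: string | undefined,
  defaultToken: string,
): string {
  if (!copilotTokenHeader || copilotTokenHeader === "_") return defaultToken;
  return copilotTokenHeader;
}
```

In this shape, `Authorization` stays reserved for the proxy key (so OpenAI-style clients work unchanged), while an overriding Copilot token travels in the separate `X-Copilot-Token` header and the real token never has to be shared with clients.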