feat(proxy-auth): implement optional proxy authentication with API keys #14

Open

hiandy24 wants to merge 1 commit into hankchiutw:main from hiandy24:dev

Conversation

@hiandy24

- Added proxy-level authentication via the PROXY_API_KEYS env var; clients must send Authorization: Bearer <proxy_key> when enabled.
- Introduced src/shared/api/ensure-proxy-auth.ts (sketched below) and wired it into /api/* to enforce auth before proxying.
- Updated token resolution: when proxy auth is enabled, the upstream GitHub OAuth token comes from X-Copilot-Token; otherwise the selected token is used. _ still means "use default".
- Reserved Authorization for proxy auth only (when enabled) to stay OpenAI-compatible.
- Added config constants: PROXY_API_KEYS and COPILOT_TOKEN_OVERRIDE_HEADER (x-copilot-token).
- Updated README with configuration, examples, and behavior notes; no behavior change when PROXY_API_KEYS is unset.
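
For reference, a minimal sketch of what the check in ensure-proxy-auth.ts could look like. The actual file isn't reproduced in this description, so the handler shape, helper names, and the comma-separated key format below are assumptions:

```ts
// Hypothetical sketch of src/shared/api/ensure-proxy-auth.ts; the real
// implementation may differ. Uses plain Node http types for illustration.
import type { IncomingMessage, ServerResponse } from "node:http";

// Assumption: PROXY_API_KEYS is a comma-separated list of accepted keys
// (the PR text does not specify the delimiter).
const proxyApiKeys = (process.env.PROXY_API_KEYS ?? "")
  .split(",")
  .map((key) => key.trim())
  .filter(Boolean);

// Returns true if the request may proceed; otherwise responds 401.
export function ensureProxyAuth(req: IncomingMessage, res: ServerResponse): boolean {
  // Auth is optional: with PROXY_API_KEYS unset, behavior is unchanged.
  if (proxyApiKeys.length === 0) return true;

  // When enabled, Authorization is reserved for proxy auth:
  // clients must send "Authorization: Bearer <proxy_key>".
  const [scheme, key] = (req.headers.authorization ?? "").split(" ");
  if (scheme === "Bearer" && key && proxyApiKeys.includes(key)) return true;

  res.statusCode = 401;
  res.end("Unauthorized");
  return false;
}
```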

@hankchiutw
Owner

Could you elaborate more on your use cases?
If authorization is required (rather than authorizing with the default token), you can unset the default token. I don't see the benefit of wrapping GitHub tokens with PROXY_API_KEYS.

@hiandy24
Author

Sorry, I'm not quite clear on how your authorization implements authentication. Suppose I want to use this http_proxy with an LLM client; typically an API key is required. So this modification mainly implements authentication from the client to the http_proxy, to prevent client abuse and theft of the Copilot API. The feature can be disabled.
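
To illustrate the intended flow, a client request against the proxy might look like the following. The host, endpoint path, model name, and credentials are all placeholders, not part of this PR:

```ts
// Illustrative client call; host, path, and credentials are placeholders.
async function main() {
  const res = await fetch("http://localhost:3000/api/chat/completions", {
    method: "POST",
    headers: {
      // Proxy-level auth (only checked when PROXY_API_KEYS is set server-side).
      Authorization: "Bearer my-proxy-key",
      // Per the PR, "_" still means "use the proxy's default token";
      // whether omitting the header behaves the same is an assumption.
      "X-Copilot-Token": "_",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o",
      messages: [{ role: "user", content: "Hello" }],
    }),
  });
  console.log(await res.json());
}

main();
```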

@hankchiutw
Owner

hankchiutw commented Sep 15, 2025

Suppose I want to use this http_proxy with an LLM client; typically an API key is required

I see. In that case, you may set _ as the API key in your LLM client, after setting a default token in the admin page.
If you don't want a default token, you need to set the GitHub token in your LLM client (instead of _).

I think the default token approach is more secure, since the real token isn't transferred between the LLM client and the http_proxy.

One possible improvement is to make the dummy token _ configurable. That way the default token could be revoked if needed.
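
As a rough sketch of that idea, assuming an env var overrides the sentinel (the name DEFAULT_TOKEN_ALIAS is made up for illustration):

```ts
// Hypothetical: make the "use default token" sentinel configurable so it can
// be rotated like a credential. "_" stays the default for compatibility.
const defaultTokenAlias = process.env.DEFAULT_TOKEN_ALIAS ?? "_";

// Resolve the token to use upstream from what the client supplied.
function resolveUpstreamToken(clientToken: string, defaultToken?: string): string {
  if (clientToken === defaultTokenAlias) {
    // Client asked for the proxy's default token.
    if (!defaultToken) throw new Error("No default token configured");
    return defaultToken;
  }
  // Otherwise treat the client-supplied value as a real GitHub token.
  return clientToken;
}
```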

@hiandy24
Author

Suppose the forwarding is running on a server instead of locally, e.g. to share my Copilot subscription with friends or to use it with some LLM clients. If I set a default token, then anyone who knows this API endpoint can use my Copilot proxy (assuming I expose it to the public network), which is not secure. If I don't set a default token, my friends or the LLM client can see my Copilot key, which may lead to it leaking.
