README-GEMINI-MCP.md
Lines changed: 14 additions & 0 deletions
@@ -66,10 +66,14 @@ You can configure the server's behavior via command-line arguments and environment variables.

- `--port=<number>`: Specifies the port for the server to listen on.
  - **Default**: `8765`
  - **Note**: Can also be set via the `GEMINI_MCP_PORT` environment variable. Command-line argument takes precedence.
- `--debug`: Enables detailed debug logging to the console.
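The precedence described above can be illustrated with a minimal TypeScript sketch, assuming a Node-style entry point; `resolvePort` is a hypothetical helper for illustration, not part of the server's actual code:

```ts
// Hypothetical sketch (not the server's source): resolve the listening port
// with the documented precedence -- `--port` beats GEMINI_MCP_PORT, which
// beats the default of 8765.
function resolvePort(argv: string[], env: NodeJS.ProcessEnv): number {
  const flag = argv.find((a) => a.startsWith("--port="));
  if (flag) return Number(flag.split("=")[1]);                // CLI argument wins
  if (env.GEMINI_MCP_PORT) return Number(env.GEMINI_MCP_PORT); // then the env var
  return 8765;                                                 // documented default
}

const debugEnabled = process.argv.includes("--debug");
const port = resolvePort(process.argv.slice(2), process.env);
if (debugEnabled) console.debug(`[gemini-mcp] listening on port ${port}`);
```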
### Environment Variables
- `GEMINI_MCP_PORT`: Specifies the port for the server to listen on.
  - **Default**: `8765`
  - **Note**: Command-line argument `--port` takes precedence over this environment variable.
- `GEMINI_TOOLS_DEFAULT_MODEL`: Sets a default LLM model specifically for tools hosted by the server (like `google_web_search`).
  - **Purpose**: When a tool needs to invoke an LLM during its execution (e.g., to summarize search results), it will use the model specified by this variable. This allows you to use a different (potentially faster or cheaper) model for tool execution than for the main chat.
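A minimal sketch of that fallback, again assuming a Node/TypeScript server; `modelForToolCall` and the example chat model name are illustrative, not the project's API:

```ts
// Illustrative only: how a server-hosted tool might pick its model.
function modelForToolCall(env: NodeJS.ProcessEnv, chatModel: string): string {
  // Prefer the tools-specific override so tool execution (e.g. summarizing
  // web search results) can use a faster or cheaper model than the main chat.
  return env.GEMINI_TOOLS_DEFAULT_MODEL ?? chatModel;
}

// Example: the main conversation keeps its chat model, while google_web_search
// summarization uses whatever GEMINI_TOOLS_DEFAULT_MODEL names (if set).
const toolModel = modelForToolCall(process.env, "gemini-1.5-pro");
console.log(`google_web_search will summarize with: ${toolModel}`);
```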