Commit 295b4d0

Merge pull request #251 from oracle/248-mcp-langchain-export
fix LangChain MCP export
2 parents 039d0f4 + 66a93d4 commit 295b4d0

File tree

7 files changed: +213 additions, -154 deletions


src/client/content/config/tabs/settings.py

Lines changed: 1 addition & 1 deletion
@@ -346,7 +346,7 @@ def display_settings():
     if spring_ai_conf == "hybrid":
         st.markdown(f"""
             The current configuration combination of embedding and language models
-            is currently **not supported** for SpringAI.
+            is currently **not supported** for Spring AI and LangChain MCP templates.
            - Language Model: **{ll_config.get("model", "Unset")}**
            - Embedding Model: **{embed_config.get("model", "Unset")}**
            """)
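The hunk above only changes the warning text shown for the "hybrid" combination. As a hedged illustration of the check the UI performs, here is a minimal sketch with a hypothetical `hybrid_warning` helper (the real code renders via `st.markdown`; the dicts here are stand-ins):

```python
# Hypothetical helper mirroring the hybrid-configuration warning above.
# Returns the markdown text the UI would render, or None when not hybrid.
def hybrid_warning(spring_ai_conf, ll_config, embed_config):
    if spring_ai_conf != "hybrid":
        return None
    return (
        "The current configuration combination of embedding and language models "
        "is currently **not supported** for Spring AI and LangChain MCP templates.\n"
        f"- Language Model: **{ll_config.get('model', 'Unset')}**\n"
        f"- Embedding Model: **{embed_config.get('model', 'Unset')}**"
    )

print(hybrid_warning("hybrid", {"model": "gpt-4o-mini"}, {}))
```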

src/client/mcp/rag/README.md

Lines changed: 98 additions & 84 deletions
@@ -16,7 +16,7 @@ You need:
 - Claude Desktop free
 
 ## Setup
-With **[`uv`](https://docs.astral.sh/uv/getting-started/installation/)** installed, run the following commands in your current project directory `<PROJECT_DIR>/src/client/mcp/rag/`:
+With **[`uv`](https://docs.astral.sh/uv/getting-started/installation/)** installed, run the following commands in your current project directory `<PROJECT_DIR>`:
 
 ```bash
 uv init --python=3.11 --no-workspace
@@ -26,61 +26,52 @@ uv add mcp langchain-core==0.3.52 oracledb~=3.1 langchain-community==0.3.21 lang
 ```
 
 ## Export config
-In the **AI Optimizer & Toolkit** web interface, after tested a configuration, in `Settings/Client Settings`:
+In the **AI Optimizer & Toolkit** web interface, after testing a configuration, go to `Configuration/Settings/Client Settings`:
 
 ![Client Settings](./images/export.png)
 
-* select the checkbox `Include Sensitive Settings`
-* press button `Download Settings` to download configuration in the project directory: `src/client/mcp/rag` as `optimizer_settings.json`.
-* in `<PROJECT_DIR>/src/client/mcp/rag/rag_base_optimizer_config_mcp.py` change filepath with the absolute path of your `optimizer_settings.json` file.
+* select the checkbox `Include Sensitive Settings`.
+* press the `Download LangchainMCP` button to download a Vector Search MCP agent built on the current configuration.
+* unzip the file into a `<PROJECT_DIR>` directory.
 
 
 ## Standalone client
-There is a client that you can run without MCP via commandline to test it:
+There is a client that you can run without MCP via the command line to test it:
 
 ```bash
 uv run rag_base_optimizer_config.py "[YOUR_QUESTION]"
 ```
+In `rag_base_optimizer_config_mcp.py`:
 
-## Quick test via MCP "inspector"
+## Claude Desktop setup
 
-* Run the inspector:
+The free version of Claude Desktop does not allow connecting to a remote server. For testing purposes only, you can work around this with a proxy library called `mcp-remote`. These are the options.
+If you have already installed Node.js v20.17.0+, it should work.
 
-```bash
-npx @modelcontextprotocol/inspector uv run rag_base_optimizer_config_mcp.py
-```
+* In the **Claude Desktop** application, under `Settings/Developer/Edit Config`, open `claude_desktop_config.json`, then:
 
-* connect to the port `http://localhost:6274/` with your browser on the link printed, like in the following example:
-```bash
-..
-Open inspector with token pre-filled:
-http://localhost:6274/?MCP_PROXY_AUTH_TOKEN=cb2ef7521aaf2050ad9620bfb5e5df42dc958889e6e99ce4e9b18003eb93fffd
-..
-```
-
-* setup the `Inspector Proxy Address` with `http://127.0.0.1:6277`
-* test the tool developed.
+* Set **remote sse** execution: add the references to the local MCP server for RAG in the `<PROJECT_DIR>`:
+```json
+{
+   "mcpServers": {
+       ...
+       ,
+       "rag":{
+           "command": "npx",
+           "args": [
+               "mcp-remote",
+               "http://127.0.0.1:9090/sse"
+           ]
+       }
+   }
+}
+```
 
 
-## Claude Desktop setup
-
-* In **Claude Desktop** application, in `Settings/Developer/Edit Config`, get the `claude_desktop_config.json` to add the references to the local MCP server for RAG in the `<PROJECT_DIR>/src/client/mcp/rag/`:
-```json
-{
-   "mcpServers": {
-       ...
-       ,
-       "rag":{
-           "command":"bash",
-           "args":[
-               "-c",
-               "source <PROJECT_DIR>/src/client/mcp/rag/.venv/bin/activate && uv run <PROJECT_DIR>/src/client/mcp/rag/rag_base_optimizer_config_mcp.py"
-           ]
-       }
-   }
-}
-```
 * In **Claude Desktop** application, in `Settings/General/Claude Settings/Configure`, under `Profile` tab, update fields like:
+
 - `Full Name`
 - `What should we call you`
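The remote `mcpServers` entry shown in the README hunk above can also be merged into an existing `claude_desktop_config.json` programmatically. A stdlib-only sketch; the `add_rag_server` helper and the `other` server entry are hypothetical, only the `rag` entry's shape comes from the README:

```python
import json

# Hypothetical helper: insert the README's "rag" server entry into an
# existing claude_desktop_config.json structure, leaving other servers intact.
def add_rag_server(config: dict, sse_url: str = "http://127.0.0.1:9090/sse") -> dict:
    config.setdefault("mcpServers", {})["rag"] = {
        "command": "npx",
        "args": ["mcp-remote", sse_url],
    }
    return config

existing = {"mcpServers": {"other": {"command": "uvx", "args": ["some-server"]}}}
merged = add_rag_server(existing)
print(json.dumps(merged, indent=2))
```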
@@ -94,8 +85,10 @@ Show the rag_tool message as-is, without modification.
 ```
 This will impose the usage of `rag_tool` in any case.
 
-**NOTICE**: If you prefer, in this agent dashboard or any other, you could setup a message in the conversation with the same content of `Instruction` to enforce the LLM to use the rag tool as well.
-
+* Start the MCP server in another shell in `<PROJECT_DIR>` with:
+```bash
+uv run rag_base_optimizer_config_mcp.py
+```
 * Restart **Claude Desktop**.
 
 * You will see two warnings on the rag_tool configuration: they will disappear and will not cause any issue in activating the tool.
@@ -106,38 +99,45 @@ This will impose the usage of `rag_tool` in any case.
 
 If the question is related to the knowledge base content stored in the vector store, you will have an answer based on that information. Otherwise, it will try to answer using the information the LLM was trained on, or other tools configured in the same Claude Desktop.
 
+* **Optional**: for a **local stdio** execution, without launching the MCP Server:
+
+   * Add the references to the local MCP server for RAG in the `<PROJECT_DIR>`:
+   ```json
+   {
+       "mcpServers": {
+           ...
+           ,
+           "rag":{
+               "command":"bash",
+               "args":[
+                   "-c",
+                   "source <PROJECT_DIR>/.venv/bin/activate && uv run <PROJECT_DIR>/rag_base_optimizer_config_mcp.py"
+               ]
+           }
+       }
+   }
+   ```
+   * Use the `Local` line instead of the `Remote client` line in `<PROJECT_DIR>/rag_base_optimizer_config_mcp.py`:
 
-## Make a remote MCP server the RAG Tool
-
-In `rag_base_optimizer_config_mcp.py`:
-
-* Update the absolute path of your `optimizer_settings.json`. Example:
-
-```python
-rag.set_optimizer_settings_path("/Users/cdebari/Documents/GitHub/ai-optimizer-mcp-export/src/client/mcp/rag/optimizer_settings.json")
-```
+   ```python
+   #mcp = FastMCP("rag", port=9090) #Remote client
+   mcp = FastMCP("rag") #Local
+   ```
 
-* Substitute `Local` with `Remote client` line:
+   * Use the `stdio` transport (leave the `sse` line commented):
+   ```python
+   mcp.run(transport='stdio')
+   #mcp.run(transport='sse')
+   ```
 
-```python
-#mcp = FastMCP("rag", port=9001) #Remote client
-mcp = FastMCP("rag") #Local
-```
 
-* Substitute `stdio` with `sse` line of code:
-```python
-mcp.run(transport='stdio')
-#mcp.run(transport='sse')
-```
+## Alternative way for a quick test: MCP "inspector"
 
-* Start MCP server in another shell with:
+* Start the MCP server in another shell in `<PROJECT_DIR>` with:
 ```bash
 uv run rag_base_optimizer_config_mcp.py
 ```
 
-
-## Quick test
-
 * Run the inspector:
 
 ```bash
@@ -148,38 +148,52 @@ npx @modelcontextprotocol/inspector
 
 * set the Transport Type to `SSE`
 
-* set the `URL` to `http://localhost:9001/sse`
+* set the `URL` to `http://localhost:9090/sse`
 
 * test the tool developed.
 
 
-## Claude Desktop setup for remote/local server
-Claude Desktop, in free version, not allows to connect remote server. You can overcome, for testing purpose only, with a proxy library called `mcp-remote`. These are the options.
-If you have already installed Node.js v20.17.0+, it should work:
+**Optional:** run with the local **stdio** protocol
+* Set the protocol to run locally, as shown before, in `<PROJECT_DIR>/rag_base_optimizer_config_mcp.py`:
 
-* replace `rag` mcpServer, setting in `claude_desktop_config.json`:
-```json
-{
-   "mcpServers": {
-       "remote": {
-           "command": "npx",
-           "args": [
-               "mcp-remote",
-               "http://127.0.0.1:9001/sse"
-           ]
-       }
-   }
-}
-```
-* restart Claude Desktop.
+* Use the `Local` line instead of the `Remote client` line:
+
+```python
+#mcp = FastMCP("rag", port=9090) #Remote client
+mcp = FastMCP("rag") #Local
+```
+
+* Use the `stdio` transport (leave the `sse` line commented):
+```python
+mcp.run(transport='stdio')
+#mcp.run(transport='sse')
+```
+
+* Run the inspector:
+
+```bash
+npx @modelcontextprotocol/inspector uv run rag_base_optimizer_config_mcp.py
+```
+
+* connect with your browser to `http://localhost:6274/` via the link printed, like in the following example:
+```bash
+..
+Open inspector with token pre-filled:
+http://localhost:6274/?MCP_PROXY_AUTH_TOKEN=cb2ef7521aaf2050ad9620bfb5e5df42dc958889e6e99ce4e9b18003eb93fffd
+..
+```
+
+* setup the `Inspector Proxy Address` with `http://127.0.0.1:6277`
+* test the tool developed.
+
 
 **NOTICE**: If you have any problem running, check the logs to see if it is related to an old npx/nodejs version used with the `mcp-remote` library. Check with:
 ```bash
 nvm -list
 ```
 whether you have other versions available than the default. It can happen that Claude Desktop uses the older one. Try to remove any other nvm versions to force the use of the only one available, at minimum v20.17.0+.
 
-* restart and test as remote server
-
-
+* restart and test as remote server
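The README's two switches above (Local vs Remote client, stdio vs sse transport) are flipped by commenting lines in `rag_base_optimizer_config_mcp.py`. As a hedged illustration only, the same choice can be expressed as data; the `mcp_run_options` helper is hypothetical and not part of the project:

```python
# Hypothetical helper expressing the README's toggle as a single flag:
# remote=True corresponds to FastMCP("rag", port=9090) + run(transport='sse'),
# remote=False corresponds to FastMCP("rag") + run(transport='stdio').
def mcp_run_options(remote: bool) -> dict:
    if remote:
        return {"port": 9090, "transport": "sse"}
    return {"port": None, "transport": "stdio"}

print(mcp_run_options(remote=True))
print(mcp_run_options(remote=False))
```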

src/client/mcp/rag/optimizer_utils/config.py

Lines changed: 55 additions & 29 deletions
@@ -16,65 +16,91 @@
 import oracledb
 
 import logging
-
-logging.basicConfig(level=logging.INFO)
-
+logger = logging.getLogger(__name__)
+logging.basicConfig(
+    level=logging.INFO,
+    format="%(name)s - %(levelname)s - %(message)s"
+)
 
 def get_llm(data):
-    logging.info("llm data:")
-    logging.info(data["user_settings"]["ll_model"]["model"])
+    logger.info("llm data:")
+    logger.info(data["client_settings"]["ll_model"]["model"])
     llm = {}
-    llm_config = data["ll_model_config"][data["user_settings"]["ll_model"]["model"]]
+    models_by_id = {m["id"]: m for m in data.get("model_configs", [])}
+    llm_config = models_by_id.get(data["client_settings"]["ll_model"]["model"])
+    logger.info(llm_config)
     provider = llm_config["provider"]
     url = llm_config["url"]
     api_key = llm_config["api_key"]
-    model = data["user_settings"]["ll_model"]["model"]
-    logging.info(f"CHAT_MODEL: {model} {provider} {url} {api_key}")
+    model = data["client_settings"]["ll_model"]["model"]
+    logger.info(f"CHAT_MODEL: {model} {provider} {url} {api_key}")
     if provider == "ollama":
         # Initialize the LLM
         llm = OllamaLLM(model=model, base_url=url)
+        logger.info("Ollama LLM created")
     elif provider == "openai":
         llm = ChatOpenAI(model=model, api_key=api_key)
+        logger.info("OpenAI LLM created")
     return llm
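The refactor above replaces dicts keyed by model name (`ll_model_config`, `embed_model_config`) with a lookup built from the exported `model_configs` list. A self-contained sketch of that pattern; the settings values below are hypothetical:

```python
# Sketch of the lookup pattern introduced above: model settings arrive as a
# list of dicts and are re-indexed by "id" before resolving the active model.
# All data values here are hypothetical examples.
data = {
    "client_settings": {"ll_model": {"model": "gpt-4o-mini"}},
    "model_configs": [
        {"id": "gpt-4o-mini", "provider": "openai",
         "url": "https://api.openai.com/v1", "api_key": "sk-..."},
        {"id": "llama3.1", "provider": "ollama",
         "url": "http://localhost:11434", "api_key": ""},
    ],
}

models_by_id = {m["id"]: m for m in data.get("model_configs", [])}
llm_config = models_by_id.get(data["client_settings"]["ll_model"]["model"])
print(llm_config["provider"])  # → openai
```

Note that `dict.get` returns `None` for an unknown model id, so callers that index into `llm_config` should expect a `TypeError` if the exported settings and model list disagree.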
 
 
 def get_embeddings(data):
     embeddings = {}
-    model = data["user_settings"]["vector_search"]["model"]
-    provider = data["embed_model_config"][model]["provider"]
-    url = data["embed_model_config"][model]["url"]
-    api_key = data["embed_model_config"][model]["api_key"]
-    logging.info(f"EMBEDDINGS: {model} {provider} {url} {api_key}")
+    logger.info("getting embeddings..")
+    model = data["client_settings"]["vector_search"]["model"]
+    logger.info(f"embedding model: {model}")
+    models_by_id = {m["id"]: m for m in data.get("model_configs", [])}
+    model_params = models_by_id.get(model)
+    provider = model_params["provider"]
+    url = model_params["url"]
+    api_key = model_params["api_key"]
+
+    logger.info(f"Embeddings Model: {model} {provider} {url} {api_key}")
     embeddings = {}
     if provider == "ollama":
         embeddings = OllamaEmbeddings(model=model, base_url=url)
-    elif provider == "openai":
-        logging.info("BEFORE create embbedding")
+        logger.info("Ollama Embeddings connection successful")
+    elif provider in ("openai", "openai_compatible"):
         embeddings = OpenAIEmbeddings(model=model, api_key=api_key)
-        logging.info("AFTER create emebdding")
+        logger.info("OpenAI embeddings connection successful")
     return embeddings
 
 
 def get_vectorstore(data, embeddings):
-    config = data["database_config"][data["user_settings"]["database"]["alias"]]
-    logging.info(config)
+    db_alias = data["client_settings"]["database"]["alias"]
 
-    conn23c = oracledb.connect(user=config["user"], password=config["password"], dsn=config["dsn"])
+    db_by_name = {m["name"]: m for m in data.get("database_configs", [])}
+    db_config = db_by_name.get(db_alias)
+
+    table_alias = data["client_settings"]["vector_search"]["alias"]
+    model = data["client_settings"]["vector_search"]["model"]
+    chunk_size = str(data["client_settings"]["vector_search"]["chunk_size"])
+    chunk_overlap = str(data["client_settings"]["vector_search"]["chunk_overlap"])
+    distance_metric = data["client_settings"]["vector_search"]["distance_metric"]
+    index_type = data["client_settings"]["vector_search"]["index_type"]
 
-    logging.info("DB Connection successful!")
-    metric = data["user_settings"]["vector_search"]["distance_metric"]
+    db_table = (table_alias + "_" + model + "_" + chunk_size + "_" + chunk_overlap + "_" + distance_metric + "_" + index_type).upper().replace("-", "_")
+    logger.info(f"db_table: {db_table}")
+
+    user = db_config["user"]
+    password = db_config["password"]
+    dsn = db_config["dsn"]
+
+    logger.info(f"{db_table}: {user}/{password} - {dsn}")
+    conn23c = oracledb.connect(user=user, password=password, dsn=dsn)
+
+    logger.info("DB Connection successful!")
+    metric = data["client_settings"]["vector_search"]["distance_metric"]
 
     dist_strategy = DistanceStrategy.COSINE
     if metric == "COSINE":
         dist_strategy = DistanceStrategy.COSINE
     elif metric == "EUCLIDEAN":
         dist_strategy = DistanceStrategy.EUCLIDEAN
-
-    a = data["user_settings"]["vector_search"]["vector_store"]
-    logging.info(f"{a}")
-    logging.info(f"BEFORE KNOWLEDGE BASE")
-    logging.info(embeddings)
+
+    logger.info(embeddings)
     knowledge_base = OracleVS(client=conn23c, table_name=db_table, embedding_function=embeddings, distance_strategy=dist_strategy)
+
     return knowledge_base
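The `db_table` name built in `get_vectorstore` above concatenates the vector-search settings with underscores, uppercases the result, and replaces `-` with `_`. A standalone sketch of that naming scheme; the `build_table_name` helper and the example values are hypothetical:

```python
# Sketch of the db_table naming scheme used in get_vectorstore above:
# join the vector-search settings with "_", uppercase, and normalize "-".
def build_table_name(alias, model, chunk_size, chunk_overlap, distance_metric, index_type):
    parts = [alias, model, str(chunk_size), str(chunk_overlap), distance_metric, index_type]
    return "_".join(parts).upper().replace("-", "_")

name = build_table_name("mydocs", "text-embedding-3-small", 512, 64, "COSINE", "HNSW")
print(name)  # → MYDOCS_TEXT_EMBEDDING_3_SMALL_512_64_COSINE_HNSW
```

The `.replace("-", "_")` step matters because embedding model ids often contain hyphens, which are awkward in unquoted Oracle identifiers.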
