
Commit 0fa780b

jssmith authored and tconley1428 committed

OpenAI Agents SDK - MCP Samples (#250)

* file system mcp server example
* typo
* stateful and stateless demos
* wip
* update to provider
* provider
* title
* add more examples
* Update SDK version to 1.18
* Update openai breaking change
* cleanup + formatting
* cleanup
* WIP sequential thinking
* memory stateful mcp
* cleanup
* interface updates
* minimize dependency version changes
* naming consistency
* openai agents upgrade

Co-authored-by: Tim Conley <timothy.conley@temporal.io>

1 parent 4cbd468 commit 0fa780b

24 files changed: +1159 −152 lines

openai_agents/README.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -30,9 +30,9 @@ Each directory contains a complete example with its own README for detailed instructions
 - **[Tools](./tools/README.md)** - Demonstrates available tools such as file search, image generation, and others.
 - **[Handoffs](./handoffs/README.md)** - Agents collaborating via handoffs.
 - **[Hosted MCP](./hosted_mcp/README.md)** - Using the MCP client functionality of the OpenAI Responses API.
+- **[MCP](./mcp/README.md)** - Local MCP servers (filesystem/stdio, streamable HTTP, SSE, prompt server) integrated with Temporal workflows.
 - **[Model Providers](./model_providers/README.md)** - Using custom LLM providers (e.g., Anthropic via LiteLLM).
 - **[Research Bot](./research_bot/README.md)** - Multi-agent research system with specialized roles: a planner agent, search agent, and writer agent working together to conduct comprehensive research.
 - **[Customer Service](./customer_service/README.md)** - Interactive customer service agent with escalation capabilities, demonstrating conversational workflows.
 - **[Reasoning Content](./reasoning_content/README.md)** - Example of how to retrieve the thought process of reasoning models.
 - **[Financial Research Agent](./financial_research_agent/README.md)** - Multi-agent financial research system with planner, search, analyst, writer, and verifier agents collaborating.
-
```

openai_agents/mcp/README.md

Lines changed: 91 additions & 0 deletions
# MCP Examples

Integration with MCP (Model Context Protocol) servers using OpenAI agents in Temporal workflows.

*Adapted from [OpenAI Agents SDK MCP examples](https://github.com/openai/openai-agents-python/tree/main/examples/mcp)*

Before running these examples, be sure to review the [prerequisites and background on the integration](../README.md).

## Running the Examples

### Stdio MCP

First, start the worker:

```bash
uv run openai_agents/mcp/run_file_system_worker.py
```

Run the workflow:

```bash
uv run openai_agents/mcp/run_file_system_workflow.py
```

This sample assumes that the worker and `run_file_system_workflow.py` are on the same machine.
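The same-machine requirement follows from the transport: with stdio MCP, the worker spawns the server as a local child process and exchanges messages over its stdin/stdout. A toy sketch of that request/response-over-pipes pattern in plain Python (JSON lines standing in for the real MCP protocol, which this is not):

```python
import json
import subprocess
import sys

# Child process: a stand-in "server" that answers one JSON message per line.
CHILD = """\
import sys, json
while True:
    line = sys.stdin.readline()
    if not line:
        break
    req = json.loads(line)
    print(json.dumps({"echo": req["msg"]}), flush=True)
"""

proc = subprocess.Popen(
    [sys.executable, "-c", CHILD],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# One round trip over the child's stdin/stdout, like a stdio MCP tool call.
proc.stdin.write(json.dumps({"msg": "list_tools"}) + "\n")
proc.stdin.flush()
reply = json.loads(proc.stdout.readline())

proc.stdin.close()
proc.wait()
```

Because the "server" lives and dies with the process that spawned it, stdio transports cannot span machines; the HTTP-based transports below remove that constraint.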
### Streamable HTTP MCP

First, start the MCP server:

```bash
uv run openai_agents/mcp/servers/tools_server.py --transport=streamable-http
```

Then start the worker:

```bash
uv run openai_agents/mcp/run_streamable_http_worker.py
```

Finally, run the workflow:

```bash
uv run openai_agents/mcp/run_streamable_http_workflow.py
```
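In contrast with stdio, the streamable HTTP transport reaches the tool server at a URL, so server and worker need not share a machine. A toy illustration of that shape in plain Python (a bare HTTP round trip with an invented `add` tool, not the MCP protocol):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen


class ToolHandler(BaseHTTPRequestHandler):
    """Stand-in tool server: answers POSTed JSON with a computed result."""

    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        result = json.dumps({"result": body["a"] + body["b"]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(result)))
        self.end_headers()
        self.wfile.write(result)

    def log_message(self, *args):  # keep the demo quiet
        pass


server = HTTPServer(("localhost", 0), ToolHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "worker" side: one tool call over HTTP to the server's URL.
req = Request(
    f"http://localhost:{server.server_port}/mcp",
    data=json.dumps({"a": 2, "b": 3}).encode(),
    headers={"Content-Type": "application/json"},
)
reply = json.loads(urlopen(req).read())
server.shutdown()
```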
### SSE MCP

First, start the MCP server:

```bash
uv run openai_agents/mcp/servers/tools_server.py --transport=sse
```

Then start the worker:

```bash
uv run openai_agents/mcp/run_sse_worker.py
```

Finally, run the workflow:

```bash
uv run openai_agents/mcp/run_sse_workflow.py
```
### Prompt Server MCP

First, start the MCP server:

```bash
uv run openai_agents/mcp/servers/prompt_server.py
```

Then start the worker:

```bash
uv run openai_agents/mcp/run_prompt_server_worker.py
```

Finally, run the workflow:

```bash
uv run openai_agents/mcp/run_prompt_server_workflow.py
```
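Unlike the tool servers above, a prompt server exposes reusable prompt templates rather than callable tools. A conceptual sketch in plain Python (the template names and shapes are assumptions, not the sample server's actual prompts):

```python
# Named, parameterized prompt templates, as an MCP prompt server might expose.
PROMPTS = {
    "code_review": "Review the following {language} code for correctness:\n{code}",
    "summarize": "Summarize in {n_sentences} sentences:\n{text}",
}


def get_prompt(name: str, **params: str) -> str:
    """Return the named template with its parameters substituted."""
    return PROMPTS[name].format(**params)
```

The agent asks the server for a prompt by name, fills in the parameters, and uses the result as its instructions, rather than invoking a tool for a computed answer.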
### Memory MCP (Research Scratchpad)

Demonstrates durable note-taking with the Memory MCP server: write seed notes, query by tags, synthesize a brief with citations, then update and delete notes.

Start the worker:

```bash
uv run openai_agents/mcp/run_memory_research_scratchpad_worker.py
```

Run the research scratchpad workflow:

```bash
uv run openai_agents/mcp/run_memory_research_scratchpad_workflow.py
```
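The note-taking flow can be pictured with a plain-Python stand-in for the memory server's store (method names here are assumptions, not the server's actual API): notes carry tags, queries filter by tag, and the same store is mutated across the write/update/delete steps.

```python
from dataclasses import dataclass, field


@dataclass
class Note:
    text: str
    tags: set[str] = field(default_factory=set)


class Scratchpad:
    """In-memory stand-in for the Memory MCP server's note store."""

    def __init__(self) -> None:
        self._notes: dict[int, Note] = {}
        self._next_id = 0

    def write(self, text: str, tags: set[str]) -> int:
        note_id = self._next_id
        self._notes[note_id] = Note(text, set(tags))
        self._next_id += 1
        return note_id

    def query(self, tag: str) -> list[str]:
        return [n.text for n in self._notes.values() if tag in n.tags]

    def update(self, note_id: int, text: str) -> None:
        self._notes[note_id].text = text

    def delete(self, note_id: int) -> None:
        del self._notes[note_id]
```

In the sample, persistence comes from the stateful MCP server living alongside the worker, while Temporal makes the workflow steps around it durable.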
openai_agents/mcp/run_file_system_worker.py

Lines changed: 62 additions & 0 deletions
1+
from __future__ import annotations
2+
3+
import asyncio
4+
import logging
5+
import os
6+
from datetime import timedelta
7+
8+
from agents.mcp import MCPServerStdio
9+
from temporalio.client import Client
10+
from temporalio.contrib.openai_agents import (
11+
ModelActivityParameters,
12+
OpenAIAgentsPlugin,
13+
StatelessMCPServerProvider,
14+
)
15+
from temporalio.worker import Worker
16+
17+
from openai_agents.mcp.workflows.file_system_workflow import FileSystemWorkflow
18+
19+
20+
async def main():
21+
logging.basicConfig(level=logging.INFO)
22+
current_dir = os.path.dirname(os.path.abspath(__file__))
23+
samples_dir = os.path.join(current_dir, "sample_files")
24+
25+
file_system_server = StatelessMCPServerProvider(
26+
lambda: MCPServerStdio(
27+
name="FileSystemServer",
28+
params={
29+
"command": "npx",
30+
"args": ["-y", "@modelcontextprotocol/server-filesystem", samples_dir],
31+
},
32+
)
33+
)
34+
35+
# Create client connected to server at the given address
36+
client = await Client.connect(
37+
"localhost:7233",
38+
plugins=[
39+
OpenAIAgentsPlugin(
40+
model_params=ModelActivityParameters(
41+
start_to_close_timeout=timedelta(seconds=60)
42+
),
43+
mcp_server_providers=[file_system_server],
44+
),
45+
],
46+
)
47+
48+
worker = Worker(
49+
client,
50+
task_queue=f"openai-agents-mcp-filesystem-task-queue",
51+
workflows=[
52+
FileSystemWorkflow,
53+
],
54+
activities=[
55+
# No custom activities needed for these workflows
56+
],
57+
)
58+
await worker.run()
59+
60+
61+
if __name__ == "__main__":
62+
asyncio.run(main())
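The worker wraps a factory lambda in `StatelessMCPServerProvider`; the memory example below instead uses `StatefulMCPServerProvider`, which keeps one server alive so session state can survive across calls. A plain-Python sketch of that provider distinction, under the assumption that this is the intended semantics (it is not the temporalio implementation):

```python
from typing import Callable, Generic, Optional, TypeVar

T = TypeVar("T")


class StatelessProvider(Generic[T]):
    """Calls the factory every time: a fresh server per request, no state."""

    def __init__(self, factory: Callable[[], T]) -> None:
        self._factory = factory

    def get(self) -> T:
        return self._factory()


class StatefulProvider(Generic[T]):
    """Creates the server once and reuses it, preserving session state."""

    def __init__(self, factory: Callable[[], T]) -> None:
        self._factory = factory
        self._instance: Optional[T] = None

    def get(self) -> T:
        if self._instance is None:
            self._instance = self._factory()
        return self._instance
```

The filesystem server needs no memory between tool calls, so a throwaway instance per use is fine; the memory scratchpad only works if every call sees the same server.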
openai_agents/mcp/run_file_system_workflow.py

Lines changed: 29 additions & 0 deletions
1+
import asyncio
2+
3+
from temporalio.client import Client
4+
from temporalio.contrib.openai_agents import OpenAIAgentsPlugin
5+
6+
from openai_agents.mcp.workflows.file_system_workflow import FileSystemWorkflow
7+
8+
9+
async def main():
10+
# Create client connected to server at the given address
11+
client = await Client.connect(
12+
"localhost:7233",
13+
plugins=[
14+
OpenAIAgentsPlugin(),
15+
],
16+
)
17+
18+
# Execute a workflow
19+
result = await client.execute_workflow(
20+
FileSystemWorkflow.run,
21+
id="file-system-workflow",
22+
task_queue="openai-agents-mcp-filesystem-task-queue",
23+
)
24+
25+
print(f"Result: {result}")
26+
27+
28+
if __name__ == "__main__":
29+
asyncio.run(main())
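The starter hardcodes the workflow id, so a second invocation can collide with a still-open earlier execution. A hypothetical variation (not part of the sample) that suffixes a unique id instead:

```python
import uuid

# Hypothetical: give each invocation a unique workflow id so repeated runs
# don't conflict with a previous, still-running execution of the same id.
workflow_id = f"file-system-workflow-{uuid.uuid4().hex[:8]}"
```

A fixed id is deliberate in demos, though: it makes the run easy to find in the Temporal UI and guarantees at most one concurrent execution.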
openai_agents/mcp/run_memory_research_scratchpad_worker.py

Lines changed: 59 additions & 0 deletions
1+
from __future__ import annotations
2+
3+
import asyncio
4+
import logging
5+
from datetime import timedelta
6+
7+
from agents.mcp import MCPServerStdio
8+
from temporalio.client import Client
9+
from temporalio.contrib.openai_agents import (
10+
ModelActivityParameters,
11+
OpenAIAgentsPlugin,
12+
StatefulMCPServerProvider,
13+
)
14+
from temporalio.worker import Worker
15+
16+
from openai_agents.mcp.workflows.memory_research_scratchpad_workflow import (
17+
MemoryResearchScratchpadWorkflow,
18+
)
19+
20+
21+
async def main():
22+
logging.basicConfig(level=logging.INFO)
23+
24+
memory_server_provider = StatefulMCPServerProvider(
25+
lambda: MCPServerStdio(
26+
name="MemoryServer",
27+
params={
28+
"command": "npx",
29+
"args": ["-y", "@modelcontextprotocol/server-memory"],
30+
},
31+
)
32+
)
33+
34+
# Create client connected to server at the given address
35+
client = await Client.connect(
36+
"localhost:7233",
37+
plugins=[
38+
OpenAIAgentsPlugin(
39+
model_params=ModelActivityParameters(
40+
start_to_close_timeout=timedelta(seconds=60)
41+
),
42+
mcp_server_providers=[memory_server_provider],
43+
),
44+
],
45+
)
46+
47+
worker = Worker(
48+
client,
49+
task_queue="openai-agents-mcp-memory-task-queue",
50+
workflows=[
51+
MemoryResearchScratchpadWorkflow,
52+
],
53+
activities=[],
54+
)
55+
await worker.run()
56+
57+
58+
if __name__ == "__main__":
59+
asyncio.run(main())
openai_agents/mcp/run_memory_research_scratchpad_workflow.py

Lines changed: 29 additions & 0 deletions
1+
from __future__ import annotations
2+
3+
import asyncio
4+
5+
from temporalio.client import Client
6+
from temporalio.contrib.openai_agents import OpenAIAgentsPlugin
7+
8+
from openai_agents.mcp.workflows.memory_research_scratchpad_workflow import (
9+
MemoryResearchScratchpadWorkflow,
10+
)
11+
12+
13+
async def main():
14+
client = await Client.connect(
15+
"localhost:7233",
16+
plugins=[OpenAIAgentsPlugin()],
17+
)
18+
19+
result = await client.execute_workflow(
20+
MemoryResearchScratchpadWorkflow.run,
21+
id="memory-research-scratchpad-workflow",
22+
task_queue="openai-agents-mcp-memory-task-queue",
23+
)
24+
25+
print(f"Result:\n{result}")
26+
27+
28+
if __name__ == "__main__":
29+
asyncio.run(main())
openai_agents/mcp/run_prompt_server_worker.py

Lines changed: 64 additions & 0 deletions
1+
from __future__ import annotations
2+
3+
import asyncio
4+
import logging
5+
from datetime import timedelta
6+
7+
from agents.mcp import MCPServerStreamableHttp
8+
from temporalio.client import Client
9+
from temporalio.contrib.openai_agents import (
10+
ModelActivityParameters,
11+
OpenAIAgentsPlugin,
12+
StatelessMCPServerProvider,
13+
)
14+
from temporalio.worker import Worker
15+
16+
from openai_agents.mcp.workflows.prompt_server_workflow import PromptServerWorkflow
17+
18+
19+
async def main():
20+
logging.basicConfig(level=logging.INFO)
21+
22+
print("Setting up worker...\n")
23+
24+
try:
25+
prompt_server_provider = StatelessMCPServerProvider(
26+
lambda: MCPServerStreamableHttp(
27+
name="PromptServer",
28+
params={
29+
"url": "http://localhost:8000/mcp",
30+
},
31+
)
32+
)
33+
34+
# Create client connected to server at the given address
35+
client = await Client.connect(
36+
"localhost:7233",
37+
plugins=[
38+
OpenAIAgentsPlugin(
39+
model_params=ModelActivityParameters(
40+
start_to_close_timeout=timedelta(seconds=120)
41+
),
42+
mcp_server_providers=[prompt_server_provider],
43+
),
44+
],
45+
)
46+
47+
worker = Worker(
48+
client,
49+
task_queue="openai-agents-mcp-prompt-task-queue",
50+
workflows=[
51+
PromptServerWorkflow,
52+
],
53+
activities=[
54+
# No custom activities needed for these workflows
55+
],
56+
)
57+
await worker.run()
58+
except Exception as e:
59+
print(f"Worker failed: {e}")
60+
raise
61+
62+
63+
if __name__ == "__main__":
64+
asyncio.run(main())
openai_agents/mcp/run_prompt_server_workflow.py

Lines changed: 29 additions & 0 deletions
1+
from __future__ import annotations
2+
3+
import asyncio
4+
5+
from temporalio.client import Client
6+
from temporalio.contrib.openai_agents import OpenAIAgentsPlugin
7+
8+
from openai_agents.mcp.workflows.prompt_server_workflow import PromptServerWorkflow
9+
10+
11+
async def main():
12+
# Create client connected to server at the given address
13+
client = await Client.connect(
14+
"localhost:7233",
15+
plugins=[OpenAIAgentsPlugin()],
16+
)
17+
18+
# Execute a workflow
19+
result = await client.execute_workflow(
20+
PromptServerWorkflow.run,
21+
id="prompt-server-workflow",
22+
task_queue="openai-agents-mcp-prompt-task-queue",
23+
)
24+
25+
print(f"Result: {result}")
26+
27+
28+
if __name__ == "__main__":
29+
asyncio.run(main())
