
Conversation

@jwm4 (Contributor) commented on Dec 11, 2025

What does this PR do?

Sample notebook shows how to migrate from the old Agents API to the newer Responses API.
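For readers skimming the thread, here is a rough before/after sketch of the shape of that migration. The imports, endpoint, model id, and method names below are assumptions about the llama-stack-client Python SDK, not code copied from the notebook:

```python
# Hedged sketch only -- the real, tested migration steps are in the notebook.
from llama_stack_client import LlamaStackClient
from llama_stack_client.lib.agents.agent import Agent  # assumed import path

client = LlamaStackClient(base_url="http://localhost:8321")  # assumed local server
MODEL = "meta-llama/Llama-3.2-3B-Instruct"  # placeholder model id

# Before: the Agents API -- create an agent, open a session, run a turn.
agent = Agent(client, model=MODEL, instructions="You are a helpful assistant.")
session_id = agent.create_session("migration-demo")
turn = agent.create_turn(
    session_id=session_id,
    messages=[{"role": "user", "content": "Hello!"}],
    stream=False,
)

# After: the OpenAI-compatible Responses API -- a single call, no explicit session.
response = client.responses.create(model=MODEL, input="Hello!")
print(response.output_text)  # output_text is assumed, mirroring the OpenAI SDK
```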

Closes #3542 ("Document Agents -> Responses migration story")

Test Plan

I ran the notebook. The outputs are included in the notebook itself.

The meta-cla bot added the CLA Signed label (managed by the Meta Open Source bot) on Dec 11, 2025
@raghotham (Member) left a comment


Is it possible to run the MCP server as part of the notebook as well?

@jwm4 (Contributor, Author) commented on Dec 12, 2025

@raghotham, I tried this out in my local copy, and it seems to work. It seemed like a good idea at first, but in trying it out I am finding that it adds a lot of complexity. Here are the two issues I am running into:

  1. The NPS server has its own Python dependencies, which are not a subset of the Llama Stack dependencies. Some users might not want the virtual environment where they run the notebook to also be the one where they install the server. That is handled now by just pointing users at the server repo and telling them to install and run it; then they can decide whether to keep everything in one environment or set up separate environments for each.
  2. Since the server is a long-running process, the notebook needs to kick off a separate subprocess and direct its logs to a file. Looking at the logs is a little clunkier (you have to open a file instead of watching a console), but that's fine. The bigger issue is managing the lifecycle of the server. If I want to stop and restart the NPS server in a terminal window, I just hit ^C and then !!, but when it is a subprocess of the notebook, I need to either terminate the notebook kernel or find the process via ps and kill that process ID. That's OK too, I guess, but it's clunky; a rough sketch of the plumbing involved follows this list.
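For concreteness, here is a minimal sketch of the kind of subprocess management this would need inside the notebook. The server start command (`python -m nps_mcp_server`) and the log file name are placeholders, not the real NPS server invocation:

```python
import subprocess

# Launch the MCP server as a background subprocess and send its logs to a file.
# NOTE: "python -m nps_mcp_server" is a hypothetical command, not the real
# entry point for the NPS server.
log_file = open("mcp_server.log", "w")
server_proc = subprocess.Popen(
    ["python", "-m", "nps_mcp_server"],
    stdout=log_file,
    stderr=subprocess.STDOUT,
)

# ... run the rest of the notebook against the server ...

# Stopping or restarting the server means holding on to the process object
# (or hunting it down with ps/kill if the kernel has been restarted).
server_proc.terminate()
server_proc.wait(timeout=10)
log_file.close()
```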

If these are not disqualifying, I think there are two different approaches we could take, each with its own pros and cons:

Option 1: Continue to point to the AI Alliance repo as the place to get the NPS server. Tell users to download the server from there and install its dependencies themselves (in the environment where the notebook is running, since the notebook kicks off the server). The main drawback of this approach is that we haven't gotten much closer to making the notebook "just work" out of the box: you still need to do most of the same setup; we just save users the one command to start the server at the end.
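In notebook form, Option 1 would amount to a setup cell roughly like the one below. The clone URL and directory name are hypothetical placeholders for the AI Alliance server repo, not real paths:

```python
# Hypothetical setup cell for Option 1; replace the placeholders with the
# actual AI Alliance NPS MCP server repo URL and directory name.
!git clone https://github.com/<ai-alliance-org>/<nps-mcp-server>.git
%pip install -r <nps-mcp-server>/requirements.txt
```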

Option 2: Make a fork of the NPS server that lives in this directory, and have the notebook install all of its dependencies for you (via a requirements.txt file or similar). The main drawbacks are that it adds clutter here (the server code plus its requirements.txt) and that there would then be two separate forks of the server that are likely to drift out of sync with each other.

What do you think? Should we go ahead anyway, and if so, which option do you prefer?

@jwm4 (Contributor, Author) commented on Jan 1, 2026

@raghotham, I wanted to follow up on this since it is still open. Do you have a view on which path seems better? I lean toward leaving it as is because of the added complexity of managing an MCP server from within a notebook, but I don't feel strongly about it.

