[DOC] simple mcp client with sampling capability #1436
base: main
Conversation
8c6d0e3 to 4e14428
@maxisbey @felixweinberger here's the first attempt, please have a look.
@felixweinberger @maxisbey updated with the
22fcf19 to 4e14428
4e14428 to 3b0b7c8
@felixweinberger @maxisbey while waiting for the review, can you please help me understand why CI is failing? The failures are with respect to missing files or in files under
Looks like you have a bunch of typing failures:
Are you using any features only available in Python 3.13+? Please make sure any examples fit the repo, i.e. they need to be compatible with 3.10+.
Hi @yarnabrina thank you for this contribution.
Could you attach a screen recording or gif to your PR description demonstrating the example? That would make it easier to evaluate it.
Also for examples we aim to not introduce any dependencies on specific model providers like OpenAI or Anthropic - examples should be constructed in such a way that users can easily replace the relevant API calls with whatever provider they need or have access to.
@@ -0,0 +1 @@
OPENAI_API_KEY=YOUR_OPENAI_API_KEY
Can we rename this to LLM_API_KEY so as not to call out any specific model provider?
See the simple chatbot example for inspiration on how to provide an LLMClient, to illustrate that any provider can be used:
class LLMClient:
]
dependencies = [
    "mcp>=1.16.0,<2",
    "openai>=2.1.0,<3",
Please don't import specific provider dependencies, examples should ideally be such that users can easily substitute their preferred APIs as needed.
Hi @felixweinberger, I'll check on the openai-to-generic part, but I'm really not sure what I can do on the type checking part. This PR only adds new files under Does [I'll update openai to generic calls using requests in a day or two.]
Motivation and Context
There is no documented MCP client example showing how to support MCP servers that use the sampling capability. This PR attempts to add a basic example.
How Has This Been Tested?
These changes are a subset of my personal repository. They are not tested in a production system, but have been thoroughly tested locally.
Breaking Changes
No
Types of changes
Checklist
Additional context
Related #1205