Python: Introducing support for using a MCP server as a plugin #11334
Merged
Changes from all commits (19 commits):
- 1097527 Python: Add initial MCP Connector Version (#10778) (nmoeller)
- 56934a7 WIP MCP (eavanvalkenburg)
- bc240ea updated MCP code and sample (eavanvalkenburg)
- f10b7fd updated docstring (eavanvalkenburg)
- 194c168 test updates (eavanvalkenburg)
- ee54dc5 updated tests (eavanvalkenburg)
- 9d758e9 removed old test (eavanvalkenburg)
- 9622a53 fix tests (eavanvalkenburg)
- 09ff0e6 improved parsing of schema and tests (eavanvalkenburg)
- 47a43da added tests for funcs without params (eavanvalkenburg)
- ce121a5 added prompts (eavanvalkenburg)
- 80ba6c0 rebuilt as plugin (eavanvalkenburg)
- b4e6801 add readme (eavanvalkenburg)
- 5288476 updated numbering (eavanvalkenburg)
- 18ae9f8 fixed assert (eavanvalkenburg)
- 9a55c21 updated redis (eavanvalkenburg)
- ae33ebf added agent sample (eavanvalkenburg)
- 480e6a4 added websocket support (eavanvalkenburg)
- a3b233c comment fixes (eavanvalkenburg)
README.md
@@ -0,0 +1,46 @@

# Model Context Protocol

The Model Context Protocol is a standard created by Anthropic to allow models to share context with each other. See the [official documentation](https://modelcontextprotocol.io/introduction) for more information.

It consists of clients and servers; servers can be hosted locally or exposed as an online API.

Our goal is for Semantic Kernel to act as both a client and a server.

This folder demonstrates the client side: it takes the definition of a server and uses that to create a Semantic Kernel plugin, which exposes the tools and prompts of the server as functions in the kernel.

Those functions can then be used with function calling in a chat or with an agent, as in the minimal sketch below.
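A minimal sketch of that flow, mirroring the samples in this folder (the GitHub server, the `npx` runner, and the plugin name are taken from those samples):

```python
# Minimal sketch: turn a locally run MCP server into a Semantic Kernel plugin.
# The GitHub server and the npx runner match the samples in this folder;
# any other Stdio-based MCP server would work the same way.
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.mcp import MCPStdioPlugin


async def main() -> None:
    async with MCPStdioPlugin(
        name="Github",
        description="Github Plugin",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-github"],
    ) as github_plugin:
        kernel = Kernel()
        # The server's tools and prompts now show up as kernel functions
        # under the "Github" plugin name and can be used with function calling.
        kernel.add_plugin(github_plugin)


if __name__ == "__main__":
    asyncio.run(main())
```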
## Server types

There are two types of servers: Stdio-based and SSE-based. The sample shows how to use a Stdio-based server, which gets run locally, in this case by using [npx](https://docs.npmjs.com/cli/v8/commands/npx).

Some other common runners are [uvx](https://docs.astral.sh/uv/guides/tools/) for Python servers and [docker](https://www.docker.com/) for containerized servers.

The code shown works the same for an SSE server; the only difference is that an MCPSsePlugin needs to be used instead of the MCPStdioPlugin, as sketched below.
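For example, connecting to a hosted SSE server might look like the following. This is a sketch only: it assumes MCPSsePlugin is importable from the same module as MCPStdioPlugin and accepts the server endpoint via a `url` argument, and the URL shown is a placeholder, not a real server.

```python
# Sketch: use an SSE-based MCP server instead of a Stdio-based one.
# The URL below is a placeholder; point it at a real MCP SSE endpoint.
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.mcp import MCPSsePlugin


async def main() -> None:
    async with MCPSsePlugin(
        name="Github",
        description="Github Plugin",
        url="http://localhost:8000/sse",  # placeholder endpoint (assumption)
    ) as github_plugin:
        kernel = Kernel()
        kernel.add_plugin(github_plugin)


if __name__ == "__main__":
    asyncio.run(main())
```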
The reverse, using Semantic Kernel as a server, is not yet implemented, but will be in the future.

## Running the sample

1. Make sure you have [Node.js](https://nodejs.org/en/download/) installed.
2. Make sure you have [npx](https://docs.npmjs.com/cli/v8/commands/npx) available in your PATH.
3. The Github MCP Server uses a Github Personal Access Token (PAT) to authenticate; see [the documentation](https://github.com/modelcontextprotocol/servers/tree/main/src/github) on how to create one and make it available to the server (one possible way to pass it is sketched after these steps).
4. Install Semantic Kernel with the mcp extra:

    ```bash
    pip install semantic-kernel[mcp]
    ```

5. Run the sample:

    ```bash
    cd python/samples/concepts/mcp
    python mcp_as_plugin.py
    ```

    or:

    ```bash
    cd python/samples/concepts/mcp
    python agent_with_mcp_plugin.py
    ```
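One possible way to pass the PAT from step 3, as a sketch only: it assumes MCPStdioPlugin forwards an `env` mapping to the server process it spawns, and that the server reads the `GITHUB_PERSONAL_ACCESS_TOKEN` environment variable as described in its documentation. Verify the connector's signature before relying on this.

```python
# Sketch: hand the Github PAT to the MCP server process.
# Assumption: MCPStdioPlugin accepts an `env` mapping for the spawned server.
import os

from semantic_kernel.connectors.mcp import MCPStdioPlugin

github_plugin = MCPStdioPlugin(
    name="Github",
    description="Github Plugin",
    command="npx",
    args=["-y", "@modelcontextprotocol/server-github"],
    env={"GITHUB_PERSONAL_ACCESS_TOKEN": os.environ["GITHUB_PERSONAL_ACCESS_TOKEN"]},
)
```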
agent_with_mcp_plugin.py
@@ -0,0 +1,119 @@
# Copyright (c) Microsoft. All rights reserved.

import asyncio

from semantic_kernel.agents import ChatCompletionAgent, ChatHistoryAgentThread
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.connectors.mcp import MCPStdioPlugin

"""
The following sample demonstrates how to create a chat completion agent that
answers questions about Github using a Semantic Kernel Plugin from an MCP server.
The Chat Completion Service is passed directly via the ChatCompletionAgent constructor.
Additionally, the plugin is supplied via the constructor.
"""


# Simulate a conversation with the agent
USER_INPUTS = [
    "What are the latest 5 python issues in Microsoft/semantic-kernel?",
    "Are there any untriaged python issues?",
    "What is the status of issue #10785?",
]


async def main():
    # 1. Create the agent
    async with MCPStdioPlugin(
        name="Github",
        description="Github Plugin",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-github"],
    ) as github_plugin:
        agent = ChatCompletionAgent(
            service=AzureChatCompletion(),
            name="IssueAgent",
            instructions="Answer questions about the Microsoft semantic-kernel github project.",
            plugins=[github_plugin],
        )

        for user_input in USER_INPUTS:
            # 2. Create a thread to hold the conversation
            # If no thread is provided, a new thread will be
            # created and returned with the initial response
            thread: ChatHistoryAgentThread = None

            print(f"# User: {user_input}")
            # 3. Invoke the agent for a response
            response = await agent.get_response(messages=user_input, thread=thread)
            print(f"# {response.name}: {response} ")
            thread = response.thread

        # 4. Cleanup: Clear the thread
        await thread.delete() if thread else None

    """
    Sample output:
    GitHub MCP Server running on stdio
    # User: What are the latest 5 python issues in Microsoft/semantic-kernel?
    # IssueAgent: Here are the latest 5 Python issues in the
    [Microsoft/semantic-kernel](https://github.com/microsoft/semantic-kernel) repository:

    1. **[Issue #11358](https://github.com/microsoft/semantic-kernel/pull/11358)**
    **Title:** Python: Bump Python version to 1.27.0 for a release.
    **Created by:** [moonbox3](https://github.com/moonbox3)
    **Created at:** April 3, 2025
    **State:** Open
    **Comments:** 1
    **Description:** Bump Python version to 1.27.0 for a release.

    2. **[Issue #11357](https://github.com/microsoft/semantic-kernel/pull/11357)**
    **Title:** .Net: Version 1.45.0
    **Created by:** [markwallace-microsoft](https://github.com/markwallace-microsoft)
    **Created at:** April 3, 2025
    **State:** Open
    **Comments:** 0
    **Description:** Version bump for release 1.45.0.

    3. **[Issue #11356](https://github.com/microsoft/semantic-kernel/pull/11356)**
    **Title:** .Net: Fix bug in sqlite filter logic
    **Created by:** [westey-m](https://github.com/westey-m)
    **Created at:** April 3, 2025
    **State:** Open
    **Comments:** 0
    **Description:** Fix bug in sqlite filter logic.

    4. **[Issue #11355](https://github.com/microsoft/semantic-kernel/issues/11355)**
    **Title:** .Net: [MEVD] Validate that the collection generic key parameter corresponds to the model
    **Created by:** [roji](https://github.com/roji)
    **Created at:** April 3, 2025
    **State:** Open
    **Comments:** 0
    **Description:** We currently have validation for the TKey generic type parameter passed to the collection type,
    and we have validation for the key property type on the model.

    5. **[Issue #11354](https://github.com/microsoft/semantic-kernel/issues/11354)**
    **Title:** .Net: How to add custom JsonSerializer on a builder level
    **Created by:** [PawelStadnicki](https://github.com/PawelStadnicki)
    **Created at:** April 3, 2025
    **State:** Open
    **Comments:** 0
    **Description:** Inquiry about adding a custom JsonSerializer for handling F# types within the SDK.

    If you need more details about a specific issue, let me know!
    # User: Are there any untriaged python issues?
    # IssueAgent: There are no untriaged Python issues in the Microsoft semantic-kernel repository.
    # User: What is the status of issue #10785?
    # IssueAgent: The status of issue #10785 in the Microsoft Semantic Kernel repository is **open**.

    - **Title**: Port dotnet feature: Create MCP Sample
    - **Created at**: March 4, 2025
    - **Comments**: 0
    - **Labels**: python

    You can view the issue [here](https://github.com/microsoft/semantic-kernel/issues/10785).
    """


if __name__ == "__main__":
    asyncio.run(main())
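The sample above manages the plugin with an async context manager; as noted in the comments of mcp_as_plugin.py below, the plugin can also be connected and closed explicitly. A rough sketch of the same agent setup using `connect()`/`close()` instead of the `async with` block (the agent, service, and plugin configuration are copied from the sample above):

```python
# Sketch: manage the MCP plugin's lifetime explicitly instead of `async with`,
# using the connect()/close() calls mentioned in mcp_as_plugin.py.
import asyncio

from semantic_kernel.agents import ChatCompletionAgent
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.connectors.mcp import MCPStdioPlugin


async def main() -> None:
    github_plugin = MCPStdioPlugin(
        name="Github",
        description="Github Plugin",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-github"],
    )
    await github_plugin.connect()
    try:
        agent = ChatCompletionAgent(
            service=AzureChatCompletion(),
            name="IssueAgent",
            instructions="Answer questions about the Microsoft semantic-kernel github project.",
            plugins=[github_plugin],
        )
        response = await agent.get_response(messages="What is the status of issue #10785?")
        print(f"# {response.name}: {response}")
    finally:
        # Always close the plugin so the server process is shut down.
        await github_plugin.close()


if __name__ == "__main__":
    asyncio.run(main())
```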
mcp_as_plugin.py
@@ -0,0 +1,115 @@
# Copyright (c) Microsoft. All rights reserved.

import asyncio

from samples.concepts.setup.chat_completion_services import Services, get_chat_completion_service_and_request_settings
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.mcp import MCPStdioPlugin
from semantic_kernel.contents import ChatHistory

"""
This sample demonstrates how to build a conversational chatbot
using Semantic Kernel:
it creates a plugin from an MCP server config and adds it to the kernel.
The chatbot is designed to interact with the user, call MCP tools
as needed, and return responses.

To run this sample, make sure to run:
`pip install semantic-kernel[mcp]`

or install the mcp package manually.

In addition, different MCP Stdio servers need different commands to run.
For example, the Github plugin requires `npx`, others use `uvx` or `docker`.

Make sure those are available in your PATH.
"""

# System message defining the behavior and persona of the chat bot.
system_message = """
You are a chat bot, and you help users interact with Github.
You are especially good at answering questions about the Microsoft semantic-kernel project.
You can call functions to get the information you need.
"""

# Create and configure the kernel.
kernel = Kernel()

# You can select from the following chat completion services that support function calling:
# - Services.OPENAI
# - Services.AZURE_OPENAI
# - Services.AZURE_AI_INFERENCE
# - Services.ANTHROPIC
# - Services.BEDROCK
# - Services.GOOGLE_AI
# - Services.MISTRAL_AI
# - Services.OLLAMA
# - Services.ONNX
# - Services.VERTEX_AI
# - Services.DEEPSEEK
# Please make sure you have configured your environment correctly for the selected chat completion service.
chat_service, settings = get_chat_completion_service_and_request_settings(Services.OPENAI)

# Configure the function choice behavior. Here, we set it to Auto, where auto_invoke=True by default.
# With `auto_invoke=True`, the model will automatically choose and call functions as needed.
settings.function_choice_behavior = FunctionChoiceBehavior.Auto()

kernel.add_service(chat_service)

# Create a chat history to store the system message, initial messages, and the conversation.
history = ChatHistory()
history.add_system_message(system_message)


async def chat() -> bool:
    """
    Continuously prompt the user for input and show the assistant's response.
    Type 'exit' to exit.
    """
    try:
        user_input = input("User:> ")
    except (KeyboardInterrupt, EOFError):
        print("\n\nExiting chat...")
        return False
    if user_input.lower().strip() == "exit":
        print("\n\nExiting chat...")
        return False

    history.add_user_message(user_input)
    result = await chat_service.get_chat_message_content(history, settings, kernel=kernel)
    if result:
        print(f"Mosscap:> {result}")
        history.add_message(result)

    return True


async def main() -> None:
    # Create a plugin from the MCP server config and add it to the kernel.
    # The MCP server plugin is defined using the MCPStdioPlugin class.
    # The command and args are specific to the MCP server you want to run.
    # For example, the Github MCP Server uses `npx` to run the server.
    # There is also an MCPSsePlugin, which takes a URL.
    async with MCPStdioPlugin(
        name="Github",
        description="Github Plugin",
        command="npx",
        args=["-y", "@modelcontextprotocol/server-github"],
    ) as github_plugin:
        # instead of using this async context manager, you can also use:
        # await github_plugin.connect()
        # and then await github_plugin.close() at the end of the program.

        # Add the plugin to the kernel.
        kernel.add_plugin(github_plugin)

        # Start the chat loop.
        print("Welcome to the chat bot!\n Type 'exit' to exit.\n")
        chatting = True
        while chatting:
            chatting = await chat()


if __name__ == "__main__":
    asyncio.run(main())