feat: Tools integrations for other SDKs [WIP] by renvins · Pull Request #284 · mcp-use/mcp-use · GitHub
Conversation

@renvins (Contributor) commented Sep 30, 2025

Pull Request Description

This pull request introduces the first of our new integration adapters, allowing mcp-use to be used directly with other SDKs. This marks a significant step towards making mcp-use a universal tool provider that can plug into various LLM ecosystems beyond LangChain.

The initial part of this change is the OpenAIMCPAdapter, which handles the conversion of MCP tools into the format expected by OpenAI's tool-calling API. This allows developers to leverage the power of MCP servers while working directly with OpenAI's native tools.

Key Changes:

  • New OpenAIMCPAdapter: Added mcp_use/adapters/openai.py to manage the conversion of MCP tools to the OpenAI format.
  • Usage Example: Included examples/openai_integration_example.py to demonstrate how to use the new adapter.
  • Initial Documentation: Created a basic documentation page at docs/integration/openai.mdx.
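
The core of the adapter is the conversion from an MCP tool definition to OpenAI's function-calling schema. A minimal sketch of that mapping is below; the helper name is hypothetical and the PR's actual `OpenAIMCPAdapter` implementation may differ:

```python
# Hypothetical sketch of the conversion step; not the PR's actual code.
def mcp_tool_to_openai(name: str, description: str, input_schema: dict) -> dict:
    """Convert one MCP tool definition into OpenAI's tool-calling format."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            # MCP's inputSchema is already JSON Schema, which is what
            # OpenAI expects under "parameters".
            "parameters": input_schema,
        },
    }

tool = mcp_tool_to_openai(
    "airbnb_search",
    "Search Airbnb listings",
    {"type": "object", "properties": {"location": {"type": "string"}}, "required": ["location"]},
)
print(tool["function"]["name"])  # airbnb_search
```

Because MCP already describes tool inputs as JSON Schema, the conversion is mostly a re-nesting of fields rather than a translation.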
Usage Example (`examples/openai_integration_example.py`)
import asyncio

from dotenv import load_dotenv
from openai import OpenAI

from mcp_use import MCPClient
from mcp_use.adapters import OpenAIMCPAdapter

# This example demonstrates how to use our integration
# adapters to convert MCP tools to the right format.
# In particular, this example uses the OpenAIMCPAdapter.

load_dotenv()


async def main():
    config = {
        "mcpServers": {
            "airbnb": {"command": "npx", "args": ["-y", "@openbnb/mcp-server-airbnb", "--ignore-robots-txt"]}
        }
    }

    try:
        client = MCPClient(config=config)

        # Create the adapter for OpenAI's format
        adapter = OpenAIMCPAdapter()

        # Convert tools from active connectors to OpenAI's format
        openai_tools = await adapter.create_tools(client)

        # Use the tools directly with OpenAI's SDK (no agent in this case)
        openai = OpenAI()
        input_list = [{"role": "user", "content": "Search on Airbnb the cheapest hotel in Trapani for two nights."}]
        response = openai.chat.completions.create(model="gpt-4o", messages=input_list, tools=openai_tools)

        response_message = response.choices[0].message
        input_list.append(response_message)
        if not response_message.tool_calls:
            print("No tool call requested by the model")
            print(response_message.content)
            return

        import json

        for tool_call in response_message.tool_calls:
            function_name = tool_call.function.name
            arguments = json.loads(tool_call.function.arguments)

            # Use the adapter's map to get the correct connector
            connector = adapter.tool_to_connector_map[function_name]

            print(f"Executing tool: {function_name}({arguments})")
            tool_result = await connector.call_tool(name=function_name, arguments=arguments)

            # Handle the result for each tool call
            if getattr(tool_result, "isError", False):
                print(f"Error: {tool_result.content}")
                return

            input_list.append(
                {"tool_call_id": tool_call.id, "role": "tool", "name": function_name, "content": tool_result.content}
            )

        # Send the tool result back to the model
        second_response = openai.chat.completions.create(model="gpt-4o", messages=input_list, tools=openai_tools)
        final_message = second_response.choices[0].message
        print("\n--- Final response from the model ---")
        print(final_message.content)

    except Exception as e:
        print(f"Error: {e}")


if __name__ == "__main__":
    asyncio.run(main())

Future Integrations

This PR establishes a pattern that we will follow to create adapters for other major LLM providers. Work will begin shortly on similar integrations for:

  • Anthropic
  • Groq
  • Google Gemini

By providing these adapters, we aim to offer developers maximum flexibility in how they build their agents and applications with mcp-use.

- Move common functions to base adapter
- Create initial documentation for OpenAI integration
- Develop OpenAIMCPAdapter and related example

# Create LangChain tools using the adapter with connectors
self._tools = await self.adapter._create_tools_from_connectors(connectors_to_use)
await self.adapter._create_tools_from_connectors(connectors_to_use)
Member:
Is there a way we can have these be called inside the functions, so that adapter.tools is always callable and fresh?
That way there is no implicit ordering of operations: someone might not know that create_tools_from_connectors must be called first, try .tools directly, and be disappointed because the tools won't be there.
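
One way to address this, sketched below under the assumption that `create_tools` stays async (the class and names are illustrative, not the PR's code), is to cache the converted tools and fail loudly from a `tools` property when they have not been built yet:

```python
# Hypothetical sketch of a guarded, cached `tools` property; not the PR's code.
import asyncio

class Adapter:
    def __init__(self):
        self._tools = None

    async def create_tools(self, connectors):
        # Build and cache the converted tools (real conversion elided).
        self._tools = [f"converted:{c}" for c in connectors]
        return self._tools

    @property
    def tools(self):
        # Fail loudly with a pointer to the required call,
        # instead of silently returning nothing.
        if self._tools is None:
            raise RuntimeError("Call create_tools() before accessing .tools")
        return self._tools

adapter = Adapter()
asyncio.run(adapter.create_tools(["airbnb"]))
print(adapter.tools)  # ['converted:airbnb']
```

A sync property cannot await the MCP connectors itself, so it can only guard or cache; making `.tools` truly "always fresh" would require an async accessor or an eager-building factory.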

renvins (Contributor, PR author):

@pietrozullo you mean that when you call adapter.tools the operations are already done?

@pietrozullo (Member):

@renvins I think here we could also add LangChain and LangGraph, even though it is quite obvious if one looks at MCPAgent. Got this from https://github.com/orgs/mcp-use/discussions/139
What do you think ?

@renvins (Contributor, PR author) commented Oct 7, 2025

@pietrozullo Don’t we already have the LangChain one? We can do the one for LangGraph!

@pietrozullo (Member):

I think it is okay to be redundant and show how to integrate with both; in some sense our agent is not an integration, it is a custom integration. In other words, it is okay to show how to integrate with LangChain more generally as well, i.e. how to use the adapter rather than the agent.

@renvins (Contributor, PR author) commented Oct 7, 2025

> I think it is okay to be redundant and show how to integrate with both, in some sense our agent is not an integration, is a custom integration. It is okay to show how to integrate with langchain more in general as well in other words, like how to use the adapter and not the agent.

Understood, maybe we can use the existing adapter to build an example. @pietrozullo

@pietrozullo (Member):

I was looking into this; if I were a user of mcp_use integrations, I would like this to be:

from mcp_use.adapters import OpenAIAdapter
from openai import OpenAI

adapter = OpenAIAdapter(client)

openai = OpenAI()

messages = [{"role": "user", "content": "Please tell me the cheapest pisci spada for two people in Levanzo."}]
response = openai.chat.completions.create(model="gpt-4o", messages=messages, tools=adapter.tools)

What do you think? Can we modify the API?
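
Since the underlying tool conversion is async, one possible shape for this proposed API is an async factory that does the conversion up front, so `.tools` is ready as soon as the adapter exists. This is a sketch of the idea only; the class, method names, and fake conversion below are all illustrative, not mcp-use's actual API:

```python
# Hypothetical sketch of the proposed constructor-style API; names are
# illustrative and the fake conversion stands in for real MCP tool discovery.
import asyncio

class OpenAIAdapter:
    def __init__(self, client, tools):
        self._client = client
        self.tools = tools  # ready to pass to openai.chat.completions.create

    @classmethod
    async def create(cls, client):
        # A real adapter would list the client's MCP tools and convert them;
        # here we fake that step with two fixed tool names.
        tools = [{"type": "function", "function": {"name": n}} for n in ("search", "book")]
        return cls(client, tools)

async def main():
    adapter = await OpenAIAdapter.create(client=object())
    print([t["function"]["name"] for t in adapter.tools])

asyncio.run(main())  # prints ['search', 'book']
```

The async factory sidesteps the tension between a sync constructor and async I/O: the constructor itself never does network work, so `adapter.tools` can never be observed half-built.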

@renvins (Contributor, PR author) commented Oct 10, 2025

@pietrozullo I will have a look into this. I think that if we want to apply this to the base adapter, we need to refactor MCPAgent. Please have a look at the part where the agent is initialized from connectors ONLY, because we would have to force passing a client to the LangChain adapter. Let me know!
