This is a **sample project** demonstrating how to connect an OpenAI Agent, or any other LLM with tool-calling capability, to an MCP server through the LangChain or OpenAI Agents framework for enhanced paper search and chat functionality.
It includes:
- Creating and running an MCP server
- Connecting an LLM (local or remote) to the MCP server
- Running an interactive chatbot that can search arXiv papers and store results
Below are usage examples for each script.
It uses the `arxiv` library to search for papers and extract information based on the user's query. Two tool functions are exposed: one for searching and one for extracting paper information.
poetry run python tool_function.py
An example of creating a local MCP server with a stdio connection.
npx @modelcontextprotocol/inspector run research_server.py
You can run this project with a local vLLM for faster inference.
For GPT-OSS-20B, see GPT-OSS instructions.
Important: The LLM you choose must have tool-calling capability to work with the MCP server.
Open a new terminal, then run:
vllm serve openai/gpt-oss-20b
Then update `init_agent()` to point to your vLLM endpoint (default: `http://localhost:8000/v1`).
python chatbot_mcp.py
Initializes an agent and connects to the MCP server. Each agent maintains its own session.