# febeeh/AI-Assistant-OpenAI-MCP-Setup

A minimal AI assistant setup with OpenAI and MCP (Model Context Protocol) server tool support and a Gradio chat UI. Built with FastAPI, FastMCP, and Gradio.
Platform-specific configuration:

```json
{
  "mcpServers": {
    "AI-Assistant-OpenAI-MCP-Setup": {
      "command": "npx",
      "args": [
        "-y",
        "AI-Assistant-OpenAI-MCP-Setup"
      ]
    }
  }
}
```

Add the config above to `.claude/settings.json` under the `mcpServers` key.
This setup is fully customizable; follow the instructions below to adapt it to your own environment.
Architecture:

User ↔ Gradio Chat UI ↔ OpenAI API ↔ MCP Server (FastAPI + FastMCP)
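The flow above can be sketched as a single round trip. This is a hedged illustration only: `call_model` is a stub standing in for the OpenAI API, and `MCP_TOOLS` stands in for the tools exposed by the FastMCP server; none of these names come from the repo.

```python
# Illustrative round trip through the architecture above.
# All names here are hypothetical, not the repo's actual API.

MCP_TOOLS = {"get_time": lambda: "12:00"}  # stand-in for the MCP server's tools

def call_model(messages):
    """Stub for the OpenAI chat call: request a tool once, then answer."""
    if any(m["role"] == "tool" for m in messages):
        return {"content": f"The time is {messages[-1]['content']}."}
    return {"tool_call": {"name": "get_time", "args": {}}}

def chat(user_message: str) -> str:
    """One user turn: the model may call an MCP tool before answering."""
    messages = [{"role": "user", "content": user_message}]
    reply = call_model(messages)
    while "tool_call" in reply:                        # model wants a tool
        call = reply["tool_call"]
        result = MCP_TOOLS[call["name"]](**call["args"])  # MCP server runs it
        messages.append({"role": "tool", "content": result})
        reply = call_model(messages)                   # feed result back
    return reply["content"]
```

With these stubs, `chat("what time is it?")` returns `"The time is 12:00."` — the real project replaces the stub with an actual OpenAI call and the dict with FastMCP tools.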
Endpoints:

- `/chat`: Gradio chat UI
- `/mcp`: MCP endpoint using FastMCP

Installation:

```shell
git clone https://github.com/febeeh/AI-Assistant-OpenAI-MCP-Setup
cd AI-Assistant-OpenAI-MCP-Setup
python -m venv venv
source venv/bin/activate   # Linux/Mac
venv\Scripts\activate      # Windows
pip install -r requirements.txt
```

Create a `.env` file in the project root:

```
OPENAI_API_KEY=sk-your-key-here
MODEL=gpt-4o-mini
HOST=localhost
PORT=8000
```

Run the server:

```shell
python main.py
```

Then open `http://localhost:8000/chat` for the chat UI; the MCP endpoint is served at `http://localhost:8000/mcp`.

Edit `assistant.md` to update the system prompt.
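For reference, the four `.env` settings above can be read in Python roughly like this. This is a minimal standard-library sketch with the README's defaults; the actual project may load the file differently (e.g. via python-dotenv).

```python
import os

def load_settings() -> dict:
    """Read assistant settings from the environment, using the README's defaults."""
    return {
        "api_key": os.getenv("OPENAI_API_KEY"),  # required: no safe default
        "model": os.getenv("MODEL", "gpt-4o-mini"),
        "host": os.getenv("HOST", "localhost"),
        "port": int(os.getenv("PORT", "8000")),  # port is numeric
    }
```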
Add new routes in `api/mcp/router.py`. Each FastAPI endpoint under the router automatically becomes an MCP tool:

```python
# api/mcp/router.py
@router.get("/tools/my_new_tool")
def my_new_tool(param: str):
    """Description of what this tool does (shown to the LLM)."""
    return {"result": "..."}
```

The function name becomes the tool name, and the docstring becomes the tool description that OpenAI sees. Parameters are extracted from the function signature.
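To illustrate how a tool description falls out of the function signature, here is a rough sketch of the kind of introspection involved. This is a hypothetical simplification for explanation, not FastMCP's actual code:

```python
import inspect
from typing import get_type_hints

def my_new_tool(param: str):
    """Description of what this tool does (shown to the LLM)."""
    return {"result": "..."}

def tool_schema(fn) -> dict:
    """Derive an OpenAI-style tool description from a plain function."""
    hints = get_type_hints(fn)
    json_types = {str: "string", int: "integer", float: "number", bool: "boolean"}
    props = {
        name: {"type": json_types.get(hints.get(name), "string")}
        for name in inspect.signature(fn).parameters
    }
    return {
        "name": fn.__name__,                # function name -> tool name
        "description": inspect.getdoc(fn),  # docstring -> tool description
        "parameters": {"type": "object", "properties": props},
    }
```

Running `tool_schema(my_new_tool)` yields a dict whose `name` is `"my_new_tool"`, whose `description` is the docstring, and whose `parameters` lists `param` as a string — which is why well-typed signatures and clear docstrings matter for tool quality.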