In this section, we’ll create our sentiment analysis MCP server using Gradio. This server will expose a sentiment analysis tool that can be used by both human users through a web interface and AI models through the MCP protocol.
Gradio provides a straightforward way to create MCP servers by automatically converting your Python functions into MCP tools. When you set mcp_server=True in launch(), Gradio converts your functions into MCP tools, maps the interface's input components to the tool's argument schema, determines the response format from the output components, and exposes everything to MCP clients over HTTP+SSE.
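For instance, any plain Python function with type hints and a docstring can be exposed this way. Here is a minimal sketch; the letter_counter function is only an illustration, not part of the project we build below:

```python
import gradio as gr

def letter_counter(word: str, letter: str) -> int:
    """Count how many times a letter appears in a word."""
    return word.lower().count(letter.lower())

# The same Interface serves a web UI and, with mcp_server=True, an MCP tool.
demo = gr.Interface(fn=letter_counter, inputs=["text", "text"], outputs="number")
demo.launch(mcp_server=True)
```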
First, let’s create a new directory for our project and set up the required dependencies:
mkdir mcp-sentiment
cd mcp-sentiment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
pip install "gradio[mcp]" textblob

Hugging Face Spaces needs an app.py file to build the Space, so the Python file has to be named app.py.
Create a new file called app.py with the following code:
import gradio as gr
from textblob import TextBlob
def sentiment_analysis(text: str) -> dict:
    """
    Analyze the sentiment of the given text.

    Args:
        text (str): The text to analyze

    Returns:
        dict: A dictionary containing polarity, subjectivity, and assessment
    """
    blob = TextBlob(text)
    sentiment = blob.sentiment

    return {
        "polarity": round(sentiment.polarity, 2),  # -1 (negative) to 1 (positive)
        "subjectivity": round(sentiment.subjectivity, 2),  # 0 (objective) to 1 (subjective)
        "assessment": "positive" if sentiment.polarity > 0 else "negative" if sentiment.polarity < 0 else "neutral"
    }

# Create the Gradio interface
demo = gr.Interface(
    fn=sentiment_analysis,
    inputs=gr.Textbox(placeholder="Enter text to analyze..."),
    outputs=gr.JSON(),
    title="Text Sentiment Analysis",
    description="Analyze the sentiment of text using TextBlob"
)

# Launch the interface and MCP server
if __name__ == "__main__":
    demo.launch(mcp_server=True)
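Before involving any MCP client, you can sanity-check the function on its own in a Python session. The example sentence is arbitrary, and the exact scores depend on TextBlob's lexicon:

```python
# Quick local check of the function itself
print(sentiment_analysis("I really enjoyed this course!"))
# Expect positive polarity and an assessment of "positive"
```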
Let's break down the key components:

Function Definition:
The sentiment_analysis function takes a text input and returns a dictionary. The type hints (str and dict) and the docstring help define the input/output schema of the MCP tool.

Gradio Interface:
gr.Interface creates both the web UI and the MCP server; the function is exposed as an MCP tool automatically.

MCP Server:
Setting mcp_server=True enables the MCP server, which is served at http://localhost:7860/gradio_api/mcp/sse. You can also enable it with an environment variable: export GRADIO_MCP_SERVER=True.

Start the server by running:
python app.py
You should see output indicating that both the web interface and MCP server are running. The web interface will be available at http://localhost:7860, and the MCP server at http://localhost:7860/gradio_api/mcp/sse.
You can test the server in two ways:
Web Interface:
Open http://localhost:7860 in your browser, enter some text, and submit it to see the sentiment analysis results.

MCP Schema:
Visit http://localhost:7860/gradio_api/mcp/schema to see the tool schema that MCP clients will receive.
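If you prefer checking the schema from code instead of the browser, a short script like the following works; it assumes the requests package is installed and the server is running on the default port:

```python
import requests

# Fetch the MCP tool schema exposed by the Gradio server
response = requests.get("http://localhost:7860/gradio_api/mcp/schema")
response.raise_for_status()
print(response.json())  # the sentiment analysis tool and its parameters should appear here
```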
A few troubleshooting tips:

Type Hints and Docstrings:
Always provide type hints for your function's parameters and return value, and include a docstring with an Args: block; this is what Gradio uses to generate an accurate MCP tool schema.

String Inputs:
When in doubt, accept arguments as str and convert them to the desired type inside the function; this provides better compatibility with MCP clients.

SSE Support:
Some MCP clients don't support SSE-based MCP servers. In those cases, you can use mcp-remote:
{
"mcpServers": {
"gradio": {
"command": "npx",
"args": [
"mcp-remote",
"http://localhost:7860/gradio_api/mcp/sse"
]
}
}
}

Connection Issues:
If you run into connection problems, try restarting both the client and the server, check that the server is still running, and verify that the MCP schema is reachable at http://localhost:7860/gradio_api/mcp/schema.
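For an end-to-end check over the MCP protocol itself, you can list the server's tools with the official MCP Python SDK. This is only a sketch and assumes the mcp package is installed in your client environment (the server itself does not need it):

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Connect to the Gradio MCP server over SSE and list the exposed tools
    async with sse_client("http://localhost:7860/gradio_api/mcp/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])  # the sentiment analysis tool should be listed

asyncio.run(main())
```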
To make your server available to others, you can deploy it to Hugging Face Spaces:
Create a new Space on Hugging Face: go to huggingface.co/spaces, click "Create new Space", choose Gradio as the SDK, and give the Space a name (for example, mcp-sentiment).
Create a requirements.txt file:
gradio[mcp]
textblob

Then push your code to the Space:
git init
git add app.py requirements.txt
git commit -m "Initial commit"
git remote add origin https://huggingface.co/spaces/YOUR_USERNAME/mcp-sentiment
git push -u origin main

Your MCP server will now be available at:

https://YOUR_USERNAME-mcp-sentiment.hf.space/gradio_api/mcp/sse

Now that we have our MCP server running, we'll build clients to interact with it in the next sections.
Let’s move on to building our first client!