---
title: LLM Conversation Transfer Tool
emoji: 🔄
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 5.33.1
python_version: "3.10"
app_file: app.py
pinned: false
tags:
  - mcp-server-track
  - llm
  - conversation-transfer
  - gradio
  - mcp
---
# LLM Conversation Transfer Tool

Tags: mcp-server-track
A powerful tool that seamlessly transfers conversations between different LLM providers (ChatGPT, Claude, Mistral, etc.) and functions as both a Gradio web app and an MCP (Model Context Protocol) server.
Link to the Demo video
## Features

- **Universal Conversation Parser**: Supports JSON, plain text, and various chat export formats
- **Multiple LLM Providers**: Transfer to Anthropic Claude, Mistral AI, and Hyperbolic Labs
- **Dual Interface**: Web app for interactive use + MCP server for programmatic access
- **Smart Context Preservation**: Maintains conversation flow and context during transfers
- **Real-time Status**: Live API key validation and connection status
## How to Use

### Web Interface

1. **Paste Your Conversation**: Copy from ChatGPT, Claude, or any chat interface
2. **Select Providers**: Choose the source and target LLM providers
3. **Transfer**: Click the button and get a response from your target LLM
### As an MCP Server

This app can be used as an MCP server with any MCP-compatible client.

**Available tool: `transfer_conversation`**

- `history_text`: Conversation in JSON or plain text format
- `source_provider`: Source LLM name (ChatGPT, Claude, etc.)
- `target_provider`: Target LLM (anthropic, mistral, hyperbolic)
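As a concrete illustration, an MCP client would invoke the tool with arguments shaped roughly like this (the conversation content below is sample data, not a fixed schema):

```python
import json

# Illustrative arguments for a transfer_conversation tool call:
# history_text carries the conversation, the two provider fields
# name the source and target LLMs.
arguments = {
    "history_text": json.dumps([
        {"role": "user", "content": "Hello there!"},
        {"role": "assistant", "content": "Hi! How can I help you?"},
    ]),
    "source_provider": "ChatGPT",
    "target_provider": "anthropic",
}
```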
## MCP Server Demo Video

Watch the MCP Server in action.
The video demonstrates:
- Setting up the MCP server with Claude Desktop
- Transferring a conversation from ChatGPT to Claude
- Using the tool within an MCP client environment
- Real-time conversation continuation across different LLMs
## Supported Formats

### Input Formats

Plain text:

```
User: Hello there!
Assistant: Hi! How can I help you?
```

JSON:

```json
[
  {"role": "user", "content": "Hello there!"},
  {"role": "assistant", "content": "Hi! How can I help you?"}
]
```
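A minimal parser for the plain-text format above could look like the following sketch (the function name is illustrative, not the app's actual implementation):

```python
import re

def parse_plain_text(transcript):
    """Convert 'User:' / 'Assistant:' transcript lines into message dicts."""
    messages = []
    for line in transcript.strip().splitlines():
        match = re.match(r"^(User|Assistant):\s*(.*)$", line.strip())
        if match:
            # A labeled line starts a new message
            messages.append({"role": match.group(1).lower(),
                             "content": match.group(2)})
        elif messages:
            # An unlabeled line continues the previous message
            messages[-1]["content"] += "\n" + line.strip()
    return messages

sample = """User: Hello there!
Assistant: Hi! How can I help you?"""
result = parse_plain_text(sample)
```

The output of this parser is already in the JSON message format shown above, so both input styles converge on the same internal representation.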
### Supported Providers

- ✅ Anthropic (Claude 3 Haiku)
- ✅ Mistral AI (Mistral Small)
- ✅ Hyperbolic Labs (Llama 2)
## Technical Details

Built with:

- **Gradio**: Interactive web interface
- **MCP (Model Context Protocol)**: Server functionality
- **HTTPX**: Async HTTP requests
- **Modal**: Cloud deployment platform
### API Integration

- Anthropic Messages API
- Mistral Chat Completions API
- Hyperbolic Labs API
## Setup & Configuration

### Environment Variables

```
ANTHROPIC_API_KEY=your_anthropic_key
MISTRAL_API_KEY=your_mistral_key
HYPERBOLIC_API_KEY=your_hyperbolic_key
```
### Local Development

```bash
# Install dependencies
pip install -r requirements.txt

# Run as web app
python main.py

# Run as MCP server
python mcp_server.py
```
### Modal Deployment

```bash
modal deploy main.py
```
## Use Cases

- **LLM Comparison**: Test how different models respond to the same conversation
- **Context Migration**: Move conversations between different AI assistants
- **Model Evaluation**: Compare responses across multiple LLM providers
- **Workflow Integration**: Embed in larger AI workflows via the MCP protocol
## Status Dashboard
The app includes real-time status monitoring:
- API key validation
- Connection health checks
- Transfer success rates
- Error reporting
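The key-validation part of the dashboard can be approximated by a small environment check like this (the function and mapping names are illustrative):

```python
import os

# Map each supported provider to the environment variable holding its key
PROVIDER_KEYS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "mistral": "MISTRAL_API_KEY",
    "hyperbolic": "HYPERBOLIC_API_KEY",
}

def check_api_keys(env=os.environ):
    """Return provider -> True/False depending on whether its key is set."""
    return {name: bool(env.get(var)) for name, var in PROVIDER_KEYS.items()}
```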
## Privacy & Security
- No conversation data is stored
- API keys are handled securely through environment variables
- All transfers happen in real-time without logging
- HTTPS connections for all API calls
## Contributing
This project is part of the MCP Server Track. Contributions welcome!
- Fork the repository
- Create a feature branch
- Submit a pull request
## License
MIT License - feel free to use and modify!
*Made for the MCP Server Track*

*Seamlessly bridging conversations across the AI ecosystem*