---
title: LLM Conversation Transfer Tool
emoji: 🔄
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 5.33.1
python_version: "3.10"
app_file: app.py
pinned: false
tags:
  - mcp-server-track
  - llm
  - conversation-transfer
  - gradio
  - mcp
---
# LLM Conversation Transfer Tool

**Tags:** mcp-server-track

A tool that transfers conversations between different LLM providers (ChatGPT, Claude, Mistral, and others) and runs as both a Gradio web app and an MCP (Model Context Protocol) server.

## [Link to the Demo video](https://drive.google.com/file/d/1jHsv4c0yqhA2o0FBhkZOegBrf3A6IVyF/view?usp=sharing)
## Features

- **Universal Conversation Parser**: Supports JSON, plain text, and common chat export formats
- **Multiple LLM Providers**: Transfer to Anthropic Claude, Mistral AI, and Hyperbolic Labs
- **Dual Interface**: Web app for interactive use plus an MCP server for programmatic access
- **Smart Context Preservation**: Maintains conversation flow and context during transfers
- **Real-time Status**: Live API key validation and connection status
## How to Use

### Web Interface

1. **Paste Your Conversation**: Copy from ChatGPT, Claude, or any chat interface
2. **Select Providers**: Choose the source and target LLM providers
3. **Transfer**: Click the button and get a response from your target LLM

### As an MCP Server

This app can be used as an MCP server with any MCP-compatible client.

**Available Tool:** `transfer_conversation`

- `history_text`: Conversation in JSON or plain-text format
- `source_provider`: Source LLM name (ChatGPT, Claude, etc.)
- `target_provider`: Target LLM (anthropic, mistral, hyperbolic)
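As a rough illustration, an MCP client invokes the tool with a standard JSON-RPC `tools/call` request carrying those three arguments. The sketch below builds such a request (the helper name `build_transfer_request` is hypothetical; an actual MCP client library would assemble and send this for you):

```python
import json

def build_transfer_request(history_text, source_provider, target_provider, request_id=1):
    """Build an MCP 'tools/call' JSON-RPC request for transfer_conversation."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "transfer_conversation",
            "arguments": {
                "history_text": history_text,
                "source_provider": source_provider,
                "target_provider": target_provider,
            },
        },
    }

# Example: ask the server to continue a ChatGPT conversation on Claude.
request = build_transfer_request(
    "User: Hello there!\nAssistant: Hi! How can I help you?",
    "ChatGPT",
    "anthropic",
)
print(json.dumps(request, indent=2))
```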
## MCP Server Demo Video

[**Watch the MCP Server in Action**](https://your-demo-video-link-here.com)

*The video demonstrates:*

- Setting up the MCP server with Claude Desktop
- Transferring a conversation from ChatGPT to Claude
- Using the tool within an MCP client environment
- Real-time conversation continuation across different LLMs
## Supported Formats

### Input Formats

Plain text:

```
User: Hello there!
Assistant: Hi! How can I help you?
```

JSON:

```json
[
  {"role": "user", "content": "Hello there!"},
  {"role": "assistant", "content": "Hi! How can I help you?"}
]
```
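Both input formats normalize to the same list of role/content messages. A minimal sketch of such a parser (illustrative only; the app's actual parser may handle more export formats):

```python
import json

def parse_conversation(history_text):
    """Parse a conversation from JSON or 'Role: text' plain-text lines.

    Returns a list of {"role": ..., "content": ...} dicts.
    """
    text = history_text.strip()
    # Try JSON first: a list of {"role", "content"} objects.
    try:
        data = json.loads(text)
        if isinstance(data, list):
            return [{"role": m["role"], "content": m["content"]} for m in data]
    except (json.JSONDecodeError, KeyError, TypeError):
        pass
    # Fall back to plain text: lines like "User: ..." / "Assistant: ...".
    messages = []
    for line in text.splitlines():
        role, sep, content = line.partition(":")
        if sep and role.strip().lower() in ("user", "assistant", "system"):
            messages.append({"role": role.strip().lower(), "content": content.strip()})
    return messages
```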
### Supported Providers

- ✅ **Anthropic** (Claude 3 Haiku)
- ✅ **Mistral AI** (Mistral Small)
- ✅ **Hyperbolic Labs** (Llama 2)
## Technical Details

Built with:

- **Gradio**: Interactive web interface
- **MCP (Model Context Protocol)**: Server functionality
- **HTTPX**: Async HTTP requests
- **Modal**: Cloud deployment platform

### API Integration

- Anthropic Messages API
- Mistral Chat Completions API
- Hyperbolic Labs API
## Setup & Configuration

### Environment Variables

```env
ANTHROPIC_API_KEY=your_anthropic_key
MISTRAL_API_KEY=your_mistral_key
HYPERBOLIC_API_KEY=your_hyperbolic_key
```
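The status dashboard's key validation can start from a simple presence check over these variables. A minimal sketch (the helper name `check_api_keys` is hypothetical):

```python
import os

REQUIRED_KEYS = ("ANTHROPIC_API_KEY", "MISTRAL_API_KEY", "HYPERBOLIC_API_KEY")

def check_api_keys(env=os.environ):
    """Return {key_name: True/False} for each provider key's presence."""
    return {name: bool(env.get(name)) for name in REQUIRED_KEYS}
```

A provider whose key is missing can then be disabled in the UI before any request is attempted.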
### Local Development

```bash
# Install dependencies
pip install -r requirements.txt

# Run as a web app
python main.py

# Run as an MCP server
python mcp_server.py
```

### Modal Deployment

```bash
modal deploy main.py
```
## Use Cases

- **LLM Comparison**: Test how different models respond to the same conversation
- **Context Migration**: Move conversations between different AI assistants
- **Model Evaluation**: Compare responses across multiple LLM providers
- **Workflow Integration**: Embed in larger AI workflows via the MCP protocol
## Status Dashboard

The app includes real-time status monitoring:

- API key validation
- Connection health checks
- Transfer success rates
- Error reporting
## Privacy & Security

- No conversation data is stored
- API keys are handled securely through environment variables
- All transfers happen in real time without logging
- HTTPS connections for all API calls
## Contributing

This project is part of the MCP Server Track. Contributions welcome!

1. Fork the repository
2. Create a feature branch
3. Submit a pull request
## License

MIT License - feel free to use and modify!

---

**Made for the MCP Server Track**

*Seamlessly bridging conversations across the AI ecosystem*