---
title: LLM Conversation Transfer Tool
emoji: 🔄
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 5.33.1
python_version: 3.10
app_file: app.py
pinned: false
tags:
- mcp-server-track
- llm
- conversation-transfer
- gradio
- mcp
---
# LLM Conversation Transfer Tool
**Tags:** mcp-server-track
A tool that transfers conversations between LLM providers (ChatGPT, Claude, Mistral, etc.) and runs as both a Gradio web app and an MCP (Model Context Protocol) server.
## [Link to the Demo video](https://drive.google.com/file/d/1jHsv4c0yqhA2o0FBhkZOegBrf3A6IVyF/view?usp=sharing)
## Features
- **Universal Conversation Parser**: Supports JSON format, plain text, and various chat export formats
- **Multiple LLM Providers**: Transfer to Anthropic Claude, Mistral AI, and Hyperbolic Labs
- **Dual Interface**: Web app for interactive use + MCP server for programmatic access
- **Smart Context Preservation**: Maintains conversation flow and context during transfers
- **Real-time Status**: Live API key validation and connection status
## How to Use
### Web Interface
1. **Paste Your Conversation**: Copy from ChatGPT, Claude, or any chat interface
2. **Select Providers**: Choose source and target LLM providers
3. **Transfer**: Click the button and get a response from your target LLM
### As MCP Server
This app can be used as an MCP server with any MCP-compatible client:
**Available Tool:** `transfer_conversation`
- `history_text`: Conversation in JSON or plain text format
- `source_provider`: Source LLM name (ChatGPT, Claude, etc.)
- `target_provider`: Target LLM (anthropic, mistral, hyperbolic)
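Outside an MCP client, the same tool can also be called programmatically. The snippet below is a minimal sketch using `gradio_client`; the Space ID and the `/transfer_conversation` endpoint name are assumptions and may differ from the actual deployment.

```python
from gradio_client import Client

# Placeholder Space ID -- replace with the real Space path.
client = Client("your-username/llm-conversation-transfer")

# Send a plain-text conversation and request a continuation from Anthropic.
result = client.predict(
    history_text="User: Hello there!\nAssistant: Hi! How can I help you?",
    source_provider="ChatGPT",
    target_provider="anthropic",
    api_name="/transfer_conversation",  # assumed endpoint name
)
print(result)
```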
## MCP Server Demo Video
[**Watch the MCP Server in Action**](https://your-demo-video-link-here.com)
*The video demonstrates:*
- Setting up the MCP server with Claude Desktop
- Transferring a conversation from ChatGPT to Claude
- Using the tool within an MCP client environment
- Real-time conversation continuation across different LLMs
## Supported Formats
### Input Formats
```
Plain Text:
User: Hello there!
Assistant: Hi! How can I help you?

JSON:
[
  {"role": "user", "content": "Hello there!"},
  {"role": "assistant", "content": "Hi! How can I help you?"}
]
```
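The parser in the app handles these formats; the sketch below only illustrates, under assumed `User:`/`Assistant:` prefixes, how both inputs could be normalized into a single message list.

```python
import json

def parse_conversation(history_text: str) -> list[dict]:
    """Normalize JSON or 'User:/Assistant:' plain text into a message list."""
    text = history_text.strip()

    # Try JSON first: a list of {"role": ..., "content": ...} objects.
    try:
        messages = json.loads(text)
        if isinstance(messages, list):
            return messages
    except json.JSONDecodeError:
        pass

    # Fall back to plain text with "User:" / "Assistant:" prefixes.
    messages = []
    for line in text.splitlines():
        line = line.strip()
        if line.lower().startswith("user:"):
            messages.append({"role": "user", "content": line[5:].strip()})
        elif line.lower().startswith("assistant:"):
            messages.append({"role": "assistant", "content": line[10:].strip()})
        elif line and messages:
            # Continuation line: append to the previous message.
            messages[-1]["content"] += "\n" + line
    return messages
```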
### Supported Providers
- **Anthropic** (Claude 3 Haiku)
- **Mistral AI** (Mistral Small)
- **Hyperbolic Labs** (Llama 2)
## Technical Details
Built with:
- **Gradio**: Interactive web interface
- **MCP (Model Context Protocol)**: Server functionality
- **HTTPX**: Async HTTP requests
- **Modal**: Cloud deployment platform
### API Integration
- Anthropic Messages API
- Mistral Chat Completions API
- Hyperbolic Labs API
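As an illustration of the HTTPX-based integration, here is a hedged sketch of an async call to the Anthropic Messages API; the exact model ID, timeout, and error handling in the app may differ.

```python
import os
import httpx

async def call_anthropic(messages: list[dict]) -> str:
    """Send a normalized message list to the Anthropic Messages API."""
    async with httpx.AsyncClient(timeout=30.0) as client:
        response = await client.post(
            "https://api.anthropic.com/v1/messages",
            headers={
                "x-api-key": os.environ["ANTHROPIC_API_KEY"],
                "anthropic-version": "2023-06-01",
                "content-type": "application/json",
            },
            json={
                "model": "claude-3-haiku-20240307",  # Claude 3 Haiku, as listed above
                "max_tokens": 1024,
                "messages": messages,
            },
        )
        response.raise_for_status()
        return response.json()["content"][0]["text"]
```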
## Setup & Configuration
### Environment Variables
```env
ANTHROPIC_API_KEY=your_anthropic_key
MISTRAL_API_KEY=your_mistral_key
HYPERBOLIC_API_KEY=your_hyperbolic_key
```
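The keys are read from the environment at runtime. A minimal sketch of a startup check (the helper name `check_api_keys` is hypothetical) that can feed the status display:

```python
import os

REQUIRED_KEYS = ("ANTHROPIC_API_KEY", "MISTRAL_API_KEY", "HYPERBOLIC_API_KEY")

def check_api_keys() -> dict[str, bool]:
    """Report which provider keys are set, without exposing their values."""
    return {name: bool(os.getenv(name)) for name in REQUIRED_KEYS}
```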
### Local Development
```bash
# Install dependencies
pip install -r requirements.txt
# Run as web app
python main.py
# Run as MCP server
python mcp_server.py
```
### Modal Deployment
```bash
modal deploy main.py
```
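For context, here is a rough sketch of what the Modal entry point in `main.py` might look like; the Modal app name, the secret name, and the `demo` import from `app.py` are all assumptions rather than the project's actual code.

```python
import modal

image = modal.Image.debian_slim().pip_install("gradio", "fastapi", "httpx")
modal_app = modal.App("llm-conversation-transfer")  # hypothetical Modal app name

@modal_app.function(image=image, secrets=[modal.Secret.from_name("llm-api-keys")])
@modal.asgi_app()
def serve():
    import gradio as gr
    from fastapi import FastAPI
    from app import demo  # assumes app.py defines a Gradio Blocks object named `demo`

    # Mount the Gradio UI on a FastAPI app so Modal can serve it as ASGI.
    return gr.mount_gradio_app(FastAPI(), demo, path="/")
```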
## Use Cases
- **LLM Comparison**: Test how different models respond to the same conversation
- **Context Migration**: Move conversations between different AI assistants
- **Model Evaluation**: Compare responses across multiple LLM providers
- **Workflow Integration**: Embed in larger AI workflows via MCP protocol
## Status Dashboard
The app includes real-time status monitoring:
- API key validation
- Connection health checks
- Transfer success rates
- Error reporting
## Privacy & Security
- No conversation data is stored
- API keys are handled securely through environment variables
- All transfers happen in real-time without logging
- HTTPS connections for all API calls
## Contributing
This project is part of the MCP Server Track. Contributions welcome!
1. Fork the repository
2. Create a feature branch
3. Submit a pull request
## License
MIT License - feel free to use and modify!
---
**Made for the MCP Server Track**
*Seamlessly bridging conversations across the AI ecosystem*