---
title: πŸ”„ LLM Conversation Transfer Tool
emoji: πŸ”„
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 5.33.1
python_version: "3.10"
app_file: app.py
pinned: false
tags:
  - mcp-server-track
  - llm
  - conversation-transfer
  - gradio
  - mcp
---

# πŸ”„ LLM Conversation Transfer Tool

Tags: mcp-server-track

A tool that transfers conversations between LLM providers (ChatGPT, Claude, Mistral, and others) and works as both a Gradio web app and an MCP (Model Context Protocol) server.

Link to the Demo video

## 🌟 Features

- **Universal Conversation Parser**: supports JSON, plain text, and common chat export formats
- **Multiple LLM Providers**: transfer to Anthropic Claude, Mistral AI, or Hyperbolic Labs
- **Dual Interface**: web app for interactive use plus an MCP server for programmatic access
- **Smart Context Preservation**: maintains conversation flow and context during transfers
- **Real-Time Status**: live API key validation and connection status

## πŸš€ How to Use

### Web Interface

1. **Paste your conversation**: copy it from ChatGPT, Claude, or any chat interface
2. **Select providers**: choose the source and target LLM providers
3. **Transfer**: click the button and get a response from your target LLM

### As an MCP Server

This app can also be used as an MCP server with any MCP-compatible client.

**Available tool:** `transfer_conversation`

- `history_text`: the conversation, in JSON or plain-text format
- `source_provider`: name of the source LLM (ChatGPT, Claude, etc.)
- `target_provider`: target provider (`anthropic`, `mistral`, or `hyperbolic`)
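As a sketch of how a client might invoke the tool, the snippet below builds a `tools/call` request in the JSON-RPC style that MCP uses. The parameter names come from the list above; the exact wire framing (request `id`, transport) is an assumption and will depend on your MCP client.

```python
import json

def build_tool_call(history_text: str, source_provider: str, target_provider: str) -> str:
    """Assemble a hypothetical MCP tools/call request for transfer_conversation."""
    request = {
        "jsonrpc": "2.0",
        "id": 1,  # assumption: request id is managed by the client
        "method": "tools/call",
        "params": {
            "name": "transfer_conversation",
            "arguments": {
                "history_text": history_text,
                "source_provider": source_provider,
                "target_provider": target_provider,
            },
        },
    }
    return json.dumps(request)

payload = build_tool_call(
    "User: Hello there!\nAssistant: Hi! How can I help you?",
    "ChatGPT",
    "anthropic",
)
```

In practice your MCP client handles this framing for you; the point is simply which three arguments the tool expects.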

## πŸ“– MCP Server Demo Video

πŸŽ₯ Watch the MCP Server in Action

The video demonstrates:

- Setting up the MCP server with Claude Desktop
- Transferring a conversation from ChatGPT to Claude
- Using the tool within an MCP client environment
- Continuing a conversation across different LLMs in real time

## πŸ”§ Supported Formats

### Input Formats

Plain text:

```text
User: Hello there!
Assistant: Hi! How can I help you?
```

JSON:

```json
[
  {"role": "user", "content": "Hello there!"},
  {"role": "assistant", "content": "Hi! How can I help you?"}
]
```
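A minimal normalizer for the two formats above might look like the following. This is a sketch only; the app's real parser (in `app.py`) may handle more export formats, and the continuation-line handling here is an assumption.

```python
import json

def parse_conversation(history_text: str) -> list[dict]:
    """Normalize a pasted conversation (JSON list or 'User:/Assistant:' plain
    text) into a list of {"role": ..., "content": ...} messages."""
    text = history_text.strip()
    if text.startswith("["):  # JSON export: already in message-list shape
        return json.loads(text)
    messages = []
    known_roles = {"user": "user", "assistant": "assistant"}
    for line in text.splitlines():
        prefix, _, content = line.partition(":")
        role = known_roles.get(prefix.strip().lower())
        if role:
            messages.append({"role": role, "content": content.strip()})
        elif messages:
            # Line without a role prefix: treat as a continuation of the
            # previous message (assumption about multi-line messages).
            messages[-1]["content"] += "\n" + line
    return messages
```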

### Supported Providers

- βœ… Anthropic (Claude 3 Haiku)
- βœ… Mistral AI (Mistral Small)
- βœ… Hyperbolic Labs (Llama 2)

πŸ› οΈ Technical Details

Built with:

  • Gradio: Interactive web interface
  • MCP (Model Context Protocol): Server functionality
  • HTTPX: Async HTTP requests
  • Modal: Cloud deployment platform

API Integration

  • Anthropic Messages API
  • Mistral Chat Completions API
  • Hyperbolic Labs API
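For a sense of what one of these integrations involves, here is a sketch of the headers and body for an Anthropic Messages API call (the app sends these with HTTPX). The model id and `max_tokens` value are illustrative assumptions, not necessarily what the app uses.

```python
def anthropic_payload(messages: list[dict], api_key: str) -> tuple[dict, dict]:
    """Build (but do not send) headers and JSON body for POST
    https://api.anthropic.com/v1/messages."""
    headers = {
        "x-api-key": api_key,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    }
    body = {
        "model": "claude-3-haiku-20240307",  # assumption: app may pin a different id
        "max_tokens": 1024,                  # assumption: illustrative limit
        "messages": messages,                # [{"role": ..., "content": ...}, ...]
    }
    return headers, body
```

The Mistral and Hyperbolic endpoints follow the OpenAI-style chat-completions shape instead, with a `Bearer` authorization header.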

## 🚦 Setup & Configuration

### Environment Variables

```bash
ANTHROPIC_API_KEY=your_anthropic_key
MISTRAL_API_KEY=your_mistral_key
HYPERBOLIC_API_KEY=your_hyperbolic_key
```
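The app's "real-time API key validation" can be as simple as checking which of these variables are set. A minimal sketch (the actual check in `app.py` may also ping each provider):

```python
import os

# Provider name -> environment variable holding its API key
PROVIDER_KEYS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "mistral": "MISTRAL_API_KEY",
    "hyperbolic": "HYPERBOLIC_API_KEY",
}

def key_status() -> dict[str, bool]:
    """Report which provider keys are present (non-empty) in the environment."""
    return {name: bool(os.environ.get(var)) for name, var in PROVIDER_KEYS.items()}
```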

### Local Development

```bash
# Install dependencies
pip install -r requirements.txt

# Run as web app
python main.py

# Run as MCP server
python mcp_server.py
```

### Modal Deployment

```bash
modal deploy main.py
```

## 🎯 Use Cases

- **LLM comparison**: test how different models respond to the same conversation
- **Context migration**: move conversations between different AI assistants
- **Model evaluation**: compare responses across multiple LLM providers
- **Workflow integration**: embed in larger AI workflows via the MCP protocol

## πŸ“Š Status Dashboard

The app includes real-time status monitoring:

- API key validation
- Connection health checks
- Transfer success rates
- Error reporting

## πŸ”’ Privacy & Security

- No conversation data is stored
- API keys are handled securely through environment variables
- All transfers happen in real time without logging
- All API calls use HTTPS

## 🀝 Contributing

This project is part of the MCP Server Track. Contributions are welcome!

1. Fork the repository
2. Create a feature branch
3. Submit a pull request

## πŸ“œ License

MIT License - feel free to use and modify!


Made for the MCP Server Track πŸ†

Seamlessly bridging conversations across the AI ecosystem