Podcast Discussion Server
This is a FastAPI-based server that provides podcast discussion and analysis capabilities.
Environment Variables
The following environment variables need to be set in the Hugging Face Space:
- OPENAI_API_KEY: Your OpenAI API key
- ALLOWED_ORIGINS: Comma-separated list of allowed origins (optional, defaults to Vercel domains and localhost)
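A minimal sketch of how the server might read these variables at startup; the variable names come from this README, while the fallback origins and the parsing are assumptions:

```python
# Sketch: reading the Space's environment variables at startup.
# The names come from this README; the default origins are an
# assumption matching the documented fallback (localhost dev servers).
import os

OPENAI_API_KEY = os.environ["OPENAI_API_KEY"]  # required, no default

ALLOWED_ORIGINS = os.getenv(
    "ALLOWED_ORIGINS",
    "http://localhost:3000,http://localhost:5173",  # optional, falls back to defaults
).split(",")
```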
API Endpoints
- /chat: Main chat endpoint for podcast discussions
- /podcast-chat/{podcast_id}: Chat endpoint for specific podcast discussions
- /audio-list: List available audio files
- /audio/{filename}: Get specific audio file
- /podcast/{podcast_id}/context: Get podcast context
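For reference, calling a couple of these endpoints from Python could look like the sketch below. Only the paths come from this README; the HTTP methods, the JSON body shape, and the placeholder base URL and podcast ID are assumptions.

```python
# Sketch: exercising the endpoints above with the requests library.
# HTTP methods and request body shape are assumptions, not the
# server's documented schema.
import requests

BASE_URL = "https://your-username-your-space-name.hf.space"

# List the audio files the server exposes.
print(requests.get(f"{BASE_URL}/audio-list").json())

# Chat about a specific podcast ("example-id" is a placeholder).
resp = requests.post(
    f"{BASE_URL}/podcast-chat/example-id",
    json={"message": "What are the main topics of this episode?"},
)
print(resp.json())
```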
Stack
- FastAPI
- OpenAI
- LangChain
- Qdrant
- gTTS
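Purely as an illustration of how these pieces can fit together (retrieval over transcript chunks in Qdrant via LangChain, answer generation with OpenAI, audio with gTTS), here is a hedged sketch; the collection name, model, and prompt are assumptions, not the actual implementation:

```python
# Illustrative only: one way OpenAI, LangChain, Qdrant, and gTTS could be
# combined. Collection names, models, and prompts are assumptions.
from gtts import gTTS
from langchain_community.vectorstores import Qdrant
from langchain_openai import OpenAIEmbeddings
from openai import OpenAI

# Index podcast transcript chunks in an in-memory Qdrant collection.
transcript_chunks = [
    "Host and guest discuss vector databases.",
    "They compare embedding models for search quality.",
]
store = Qdrant.from_texts(
    transcript_chunks,
    OpenAIEmbeddings(),
    location=":memory:",
    collection_name="podcast_chunks",
)

# Retrieve context for a question and ask OpenAI to answer with it.
question = "What do they say about vector databases?"
context = "\n".join(doc.page_content for doc in store.similarity_search(question, k=2))
client = OpenAI()
answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": f"Answer using this podcast context:\n{context}"},
        {"role": "user", "content": question},
    ],
).choices[0].message.content

# Turn the answer into audio with gTTS.
gTTS(text=answer, lang="en").save("answer.mp3")
```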
Deployment
Backend (Hugging Face Spaces)
This server is deployed on Hugging Face Spaces using their Docker deployment feature.
Frontend (Vercel)
When deploying the frontend to Vercel:
Set the API base URL in your frontend environment:
VITE_API_BASE_URL=https://your-username-your-space-name.hf.space
The server is already configured to accept requests from:
- All Vercel domains (*.vercel.app)
- Local development servers (localhost:3000, localhost:5173)
If you're using a custom domain, add it to the ALLOWED_ORIGINS environment variable in your Hugging Face Space:
ALLOWED_ORIGINS=https://your-custom-domain.com,https://www.your-custom-domain.com
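A sketch of how this allowlist could be wired up with FastAPI's CORSMiddleware; the exact middleware options and the Vercel origin regex are assumptions based on the behavior described above:

```python
# Sketch: CORS allowlist built from ALLOWED_ORIGINS plus a regex that
# accepts *.vercel.app domains. Options shown here are assumptions.
import os
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

default_origins = "http://localhost:3000,http://localhost:5173"
origins = os.getenv("ALLOWED_ORIGINS", default_origins).split(",")

app.add_middleware(
    CORSMiddleware,
    allow_origins=[o.strip() for o in origins if o.strip()],
    allow_origin_regex=r"https://.*\.vercel\.app",  # all Vercel domains
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
```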
Security Features
- CORS protection with specific origin allowlist
- Security headers (HSTS, XSS Protection, etc.)
- Rate limiting
- SSL/TLS encryption (provided by Hugging Face Spaces)
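As an illustration, the security headers listed above could be added with a small FastAPI middleware like the sketch below; the header values are assumptions, and rate limiting would typically come from a separate library rather than this middleware:

```python
# Sketch: middleware that attaches common security headers to every
# response. Header values are illustrative, not the server's actual
# configuration; TLS itself is terminated by Hugging Face Spaces.
from fastapi import FastAPI, Request

app = FastAPI()

@app.middleware("http")
async def security_headers(request: Request, call_next):
    response = await call_next(request)
    response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
    response.headers["X-Content-Type-Options"] = "nosniff"
    response.headers["X-Frame-Options"] = "DENY"
    response.headers["X-XSS-Protection"] = "1; mode=block"
    return response
```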