|
--- |
|
title: langchain-streamlit-demo |
|
emoji: 🦜 |
|
colorFrom: green |
|
colorTo: red |
|
sdk: docker |
|
app_port: 7860 |
|
pinned: true |
|
tags: [langchain, streamlit, docker] |
|
--- |
|
|
|
# langchain-streamlit-demo |
|
|
|
[License: MIT](https://opensource.org/licenses/MIT)

[Python 3.11](https://www.python.org)

[Security: Bandit](https://github.com/PyCQA/bandit)

[Linting: Ruff](https://github.com/charliermarsh/ruff)

[Code style: Black](https://github.com/psf/black)

[pre-commit](https://github.com/pre-commit/pre-commit)

[Type checking: mypy](http://mypy-lang.org/)

[Docker Hub](https://hub.docker.com/r/joshuasundance/langchain-streamlit-demo)

[Hugging Face Space](https://huggingface.co/spaces/joshuasundance/langchain-streamlit-demo)
|
|
|
|
|
This project shows how to build a simple chatbot UI with [Streamlit](https://streamlit.io) and [LangChain](https://langchain.com). |
|
|
|
This `README` was written by [Claude 2](https://www.anthropic.com/index/claude-2), an LLM from [Anthropic](https://www.anthropic.com/). |
|
|
|
# Features |
|
- Chat interface for talking to an AI assistant
|
- Supports models from |
|
- [OpenAI](https://openai.com/) |
|
- `gpt-3.5-turbo` |
|
- `gpt-4` |
|
- [Anthropic](https://www.anthropic.com/) |
|
- `claude-instant-v1` |
|
- `claude-2` |
|
- [Anyscale Endpoints](https://endpoints.anyscale.com/) |
|
- `meta-llama/Llama-2-7b-chat-hf` |
|
- `meta-llama/Llama-2-13b-chat-hf` |
|
- `meta-llama/Llama-2-70b-chat-hf` |
|
- Streaming output of assistant responses |
|
- Leverages LangChain for dialogue and memory management |
|
- Integrates with [LangSmith](https://smith.langchain.com) for tracing conversations |
|
- Allows giving feedback on the assistant's responses

- Reads API keys and default values from environment variables when available

- Customizable parameters in the sidebar
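
The environment-variable fallback behavior can be sketched as follows. This is an illustrative example, not the app's actual code; the variable names (`OPENAI_API_KEY`, `DEFAULT_MODEL`) and the fallback values are assumptions.

```python
import os

def read_settings() -> dict:
    """Read API keys and default values from the environment,
    falling back to safe defaults when a variable is unset.

    The names used here are illustrative assumptions, not
    necessarily those used by langchain-streamlit-demo.
    """
    return {
        "openai_api_key": os.environ.get("OPENAI_API_KEY", ""),
        "model": os.environ.get("DEFAULT_MODEL", "gpt-3.5-turbo"),
    }

settings = read_settings()
```

Any value set in the environment (for example via a `.env` file passed to Docker) pre-populates the sidebar, while unset values fall back to defaults.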
|
|
|
# Code Overview |
|
- `langchain-streamlit-demo/app.py`: main Streamlit app definition

- `langchain-streamlit-demo/llm_stuff.py`: LangChain helper functions

- `Dockerfile`, `docker-compose.yml`: Docker deployment

- `kubernetes/`: Kubernetes deployment files

- `.github/workflows/`: CI/CD workflows
|
|
|
# Deployment |
|
`langchain-streamlit-demo` is deployed as a [Docker image](https://hub.docker.com/r/joshuasundance/langchain-streamlit-demo) based on the [`python:3.11-slim-bookworm`](https://github.com/docker-library/python/blob/81b6e5f0643965618d633cd6b811bf0879dee360/3.11/slim-bookworm/Dockerfile) image. |
|
CI/CD workflows in `.github/workflows` handle building and publishing the image as well as pushing it to Hugging Face. |
|
|
|
## Run on HuggingFace Spaces |
|
[Open in Hugging Face Spaces](https://huggingface.co/spaces/joshuasundance/langchain-streamlit-demo)
|
|
|
## With Docker (pull from Docker Hub) |
|
|
|
1. _Optional_: Create a `.env` file based on `.env-example` |
|
2. Run in terminal: |
|
|
|
`docker run -p 7860:7860 joshuasundance/langchain-streamlit-demo:latest` |
|
|
|
or |
|
|
|
`docker run -p 7860:7860 --env-file .env joshuasundance/langchain-streamlit-demo:latest` |
|
|
|
3. Open http://localhost:7860 in your browser |
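
A `.env` file for the optional first step might look like the following. The variable names are assumptions based on the supported providers; check `.env-example` in the repo for the actual keys.

```shell
# Hypothetical .env contents; see .env-example for the real variable names.
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
ANYSCALE_API_KEY=your-anyscale-key
LANGCHAIN_API_KEY=your-langsmith-key
LANGCHAIN_PROJECT=langchain-streamlit-demo
```

Only the keys for the providers you intend to use need to be set; the app also accepts keys entered in the sidebar at runtime.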
|
|
|
## Docker Compose (build locally) |
|
1. Clone the repo and navigate to the cloned directory
|
2. _Optional_: Create a `.env` file based on `.env-example` |
|
3. Run in terminal: |
|
|
|
`docker compose up` |
|
|
|
or |
|
|
|
`docker compose --env-file .env up`
|
|
|
4. Open http://localhost:7860 in your browser |
|
|
|
## Kubernetes |
|
1. Clone the repo and navigate to the cloned directory
|
2. Create a `.env` file based on `.env-example` |
|
3. Run the deploy script: `/bin/bash ./kubernetes/deploy.sh`
|
4. Get the IP address for your new service: `kubectl get service langchain-streamlit-demo` |
|
|
|
# Links |
|
- [Streamlit](https://streamlit.io) |
|
- [LangChain](https://langchain.com) |
|
- [LangSmith](https://smith.langchain.com) |
|
- [OpenAI](https://openai.com/) |
|
- [Anthropic](https://www.anthropic.com/) |
|
- [Anyscale Endpoints](https://endpoints.anyscale.com/) |
|
|