Update the description of Ollama steps with Git LFS instructions
README.md CHANGED
@@ -88,13 +88,20 @@ Offline LLMs are made available via Ollama. Therefore, a pre-requisite here is t
In addition, the `RUN_IN_OFFLINE_MODE` environment variable needs to be set to `True` to enable the offline mode. This, for example, can be done using a `.env` file or from the terminal. The typical steps to use SlideDeck AI in offline mode (in a `bash` shell) are as follows:

```bash
+# Install Git Large File Storage (LFS)
+sudo apt install git-lfs
+git lfs install
+
ollama list  # View locally available LLMs
export RUN_IN_OFFLINE_MODE=True  # Enable the offline mode to use Ollama
git clone https://github.com/barun-saha/slide-deck-ai.git
cd slide-deck-ai
+git lfs pull  # Pull the PPTX template files
+
python -m venv venv  # Create a virtual environment
source venv/bin/activate  # On a Linux system
pip install -r requirements.txt
+
streamlit run ./app.py  # Run the application
```
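The steps above assume that Ollama is already installed and that at least one model has been downloaded before `ollama list` is run (the truncated hunk context notes this prerequisite). A minimal sketch of that setup on Linux, where `mistral` is only an illustrative model name and not one prescribed by SlideDeck AI:

```bash
# Install Ollama using its official Linux install script
# (assumption: any supported install method works; this is a prerequisite, not part of the diff above)
curl -fsSL https://ollama.com/install.sh | sh

# Download an example model so that `ollama list` has something to report;
# `mistral` is purely illustrative -- use whichever offline LLM you intend to run
ollama pull mistral

ollama list  # The pulled model should now be listed
```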
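The README text also mentions that `RUN_IN_OFFLINE_MODE` can be set via a `.env` file instead of `export`. A minimal sketch of that alternative, assuming the application loads a `.env` file from the repository root (the exact loading mechanism is not shown in this diff):

```bash
cd slide-deck-ai

# Persist the offline-mode flag instead of exporting it in every new shell session
# (assumes SlideDeck AI reads .env from the project root, e.g. via python-dotenv)
printf 'RUN_IN_OFFLINE_MODE=True\n' > .env

streamlit run ./app.py  # Run the application as before
```

Keeping the flag in `.env` means the offline mode survives new terminal sessions without re-exporting the variable.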