# AI SOAP Note Generator for Google Colab

> Transform unstructured medical notes into professional SOAP documentation using Google's Gemma 3N model - **Optimized for Google Colab**

[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/your-notebook-link)

## Overview

The AI SOAP Note Generator is an intelligent medical documentation tool that converts informal doctor's notes, patient encounters, and clinical observations into structured SOAP (Subjective, Objective, Assessment, Plan) format. It leverages Google's Gemma 3N language model and runs seamlessly in Google Colab with GPU acceleration.
## Features

- **Google Colab Ready**: No local setup required - runs entirely in the cloud
- **GPU Acceleration**: Leverages Colab's free GPU/TPU for fast processing
- **Gemma 3N Integration**: Uses Google's Gemma 3N language model
- **Multiple Interfaces**:
  - Interactive Jupyter widgets
  - Modern Gradio web interface
  - Direct function calls
- **File Support**: Upload .txt files directly in Colab
- **Pre-loaded Examples**: Built-in medical scenarios for immediate testing
- **Shareable Links**: Generate public links to share your interface
## Quick Start in Google Colab

### 1. Open the Notebook
Click the "Open in Colab" badge above or create a new notebook in [Google Colab](https://colab.research.google.com/).

### 2. Set Runtime to GPU (Recommended)
```
Runtime → Change runtime type → Hardware accelerator → GPU
```

### 3. Install Dependencies
Run this cell first:
```python
# Install required packages
!pip install -q gradio torch transformers accelerate bitsandbytes
!pip install -q ipywidgets

# Import libraries
import gradio as gr
import torch
from transformers import pipeline
import ipywidgets as widgets
from IPython.display import display, HTML
```
### 4. Run All Cells
Execute the notebook cells in order to:
- Load the Gemma 3N model
- Set up the interface
- Start generating SOAP notes
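Condensed, those cells do something like the sketch below. This is illustrative only: the model ID is the placeholder used later in this README, and the prompt wording and `generate_soap_note` signature are assumptions, not the notebook's exact code.

```python
# Sketch: load a text-generation pipeline and wrap it in a SOAP helper.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="google/gemma-3n-7b",   # placeholder from this README; adjust based on availability
    torch_dtype=torch.float16,
    device_map="auto",
)

def generate_soap_note(raw_note: str) -> str:
    """Ask the model to restructure a free-text note into SOAP sections."""
    prompt = (
        "Rewrite the following clinical note as a SOAP note with "
        "Subjective, Objective, Assessment, and Plan sections.\n\n"
        f"Note:\n{raw_note}\n\nSOAP note:"
    )
    result = generator(prompt, max_new_tokens=512, do_sample=True, temperature=0.7)
    return result[0]["generated_text"][len(prompt):].strip()
```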
### 5. Use the Interface
- **Gradio Interface**: Click the public URL to access the web interface
- **Colab Widgets**: Use the interactive widgets directly in the notebook

## Interface Options
### Option 1: Gradio Web Interface (Recommended)
```python
# Launches a web interface with public sharing
gradio_interface.launch(share=True)
```
**Benefits:**
- Modern, responsive design
- Public shareable links
- Mobile-friendly
- Copy-to-clipboard functionality
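The `gradio_interface` object is built in an earlier cell; if you are recreating it yourself, a minimal wrapper around `generate_soap_note` might look like this (labels and layout are illustrative, not the notebook's exact code):

```python
# Sketch: a minimal Gradio wrapper around generate_soap_note.
import gradio as gr

gradio_interface = gr.Interface(
    fn=generate_soap_note,  # available once the notebook's generation cell has run
    inputs=gr.Textbox(lines=12, label="Raw clinical note"),
    outputs=gr.Textbox(lines=20, label="Generated SOAP note", show_copy_button=True),
    title="AI SOAP Note Generator",
)
```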
### Option 2: Jupyter Widgets
```python
# Interactive widgets within the notebook
display(main_interface)
```
**Benefits:**
- Runs directly in Colab
- No external links needed
- Integrated with notebook workflow
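Likewise, `main_interface` is assembled from ipywidgets in the notebook; a bare-bones equivalent, shown purely as an illustration, could be:

```python
# Sketch: a minimal ipywidgets front end for generate_soap_note.
import ipywidgets as widgets
from IPython.display import display

note_input = widgets.Textarea(placeholder="Paste the raw clinical note here",
                              layout=widgets.Layout(width="100%", height="150px"))
run_button = widgets.Button(description="Generate SOAP note", button_style="primary")
output_area = widgets.Output()

def on_generate(_):
    with output_area:
        output_area.clear_output()
        print(generate_soap_note(note_input.value))

run_button.on_click(on_generate)
main_interface = widgets.VBox([note_input, run_button, output_area])
display(main_interface)
```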
## Colab-Specific Setup

### GPU Configuration
```python
# Check GPU availability
import torch
print(f"CUDA available: {torch.cuda.is_available()}")
print(f"GPU device: {torch.cuda.get_device_name(0) if torch.cuda.is_available() else 'None'}")

# Configure for Colab GPU
device = "cuda" if torch.cuda.is_available() else "cpu"
model_config = {
    "device_map": "auto",
    "torch_dtype": torch.float16 if device == "cuda" else torch.float32
}
```
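`model_config` is then handed to the model loader. For example, if you load the model directly with `transformers` (a sketch; the checkpoint name is this README's placeholder):

```python
# Sketch: apply the Colab-tuned settings when loading the model directly.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "google/gemma-3n-7b"  # placeholder; adjust based on availability
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, **model_config)
```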
### File Upload in Colab
```python
# Method 1: Direct file upload
from google.colab import files
uploaded = files.upload()

# Method 2: Google Drive integration
from google.colab import drive
drive.mount('/content/drive')
```
### Save Results to Drive
```python
# Save generated SOAP notes to Google Drive
def save_to_drive(soap_note, filename):
    with open(f'/content/drive/MyDrive/{filename}', 'w') as f:
        f.write(soap_note)
    print(f"Saved to Google Drive: {filename}")
```
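A typical call, assuming Drive is mounted and a note has already been generated (the filename is just an example):

```python
# Example: persist a generated note to Drive
soap_text = generate_soap_note(test_note)  # test_note is defined in Example 1 below
save_to_drive(soap_text, "soap_note_chest_pain.txt")
```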
## Usage Examples

### Example 1: Quick Test
```python
# Test with example data
test_note = """
Patient: 45yo male with chest pain x2 hours
Sharp substernal pain 7/10, radiates to left arm
SOB, diaphoresis, no nausea
Vitals: BP 150/90, HR 110, O2 96%
Anxious, diaphoretic appearance
"""

soap_result = generate_soap_note(test_note)
print(soap_result)
```
### Example 2: File Processing
```python
# Upload and process medical files
from google.colab import files
uploaded_files = files.upload()

for filename, content in uploaded_files.items():
    medical_text = content.decode('utf-8')
    soap_note = generate_soap_note(medical_text)

    # Save result
    output_filename = f"SOAP_{filename}"
    with open(output_filename, 'w') as f:
        f.write(soap_note)

    # Download result
    files.download(output_filename)
```
## Pre-loaded Medical Examples

The notebook includes three clinical scenarios:

1. **Chest Pain Case**: Acute coronary syndrome workup
2. **Diabetes Case**: New onset diabetes mellitus
3. **Pediatric Case**: Streptococcal pharyngitis

Click any example button to load and test immediately.
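Under the hood the buttons simply feed stored example strings to the same generator; a sketch of how they might be kept (the actual scenario texts live in the notebook):

```python
# Sketch: example notes keyed by scenario name.
EXAMPLE_NOTES = {
    "Chest Pain Case": test_note,  # e.g. the note from Example 1 above
    "Diabetes Case": "...",        # new-onset diabetes scenario text from the notebook
    "Pediatric Case": "...",       # strep pharyngitis scenario text from the notebook
}

print(generate_soap_note(EXAMPLE_NOTES["Chest Pain Case"]))
```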
## Model Information

### Gemma 3N Configuration
```python
model_name = "google/gemma-3n-7b"  # Adjust based on availability
generation_config = {  # sampling settings passed at generation time (not tokenizer settings)
    "max_length": 2048,
    "temperature": 0.7,
    "do_sample": True
}
```
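These settings are forwarded at generation time; with the pipeline-based `generator` sketched in the Quick Start section, that might look like:

```python
# Sketch: forward the sampling settings to a generation call.
prompt = "Rewrite the following clinical note as a SOAP note:\n" + test_note
output = generator(prompt, **generation_config)
print(output[0]["generated_text"])
```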
### Memory Optimization for Colab
```python
# For Colab's memory constraints
torch.cuda.empty_cache()
model = model.half()  # Use 16-bit precision
```
## Colab-Specific Considerations

### Runtime Limitations
- **12-hour session limit**: Save work frequently
- **GPU quota**: Free tier has daily limits
- **Memory constraints**: ~12-15 GB RAM available

### Best Practices
1. **Save frequently**: Download important results
2. **Use GPU wisely**: Enable only when needed
3. **Monitor resources**: Check RAM/GPU usage
4. **Backup notebooks**: Save to Drive regularly
## Troubleshooting in Colab

### Common Issues

**"Runtime disconnected"**
```python
# Prevent disconnection by keeping the session busy.
# Note: this loop blocks the cell it runs in - interrupt it when you are done.
import time
while True:
    time.sleep(60)  # Keep session alive
```

**"Out of GPU memory"**
```python
# Clear GPU memory
torch.cuda.empty_cache()
# Restart runtime if needed: Runtime → Restart runtime
```

**"Package not found"**
```python
# Reinstall packages
!pip install --upgrade gradio transformers torch
```

**Gradio interface not loading**
```python
# Try without share link
gradio_interface.launch(share=False, debug=True)
```
## Performance Tips

### Optimize for Colab
```python
# Batch processing for multiple notes
def batch_process_notes(note_list):
    results = []
    for i, note in enumerate(note_list):
        print(f"Processing {i+1}/{len(note_list)}")
        soap_note = generate_soap_note(note)
        results.append(soap_note)
    return results
```

### Monitor Resources
```python
# Check memory usage
!nvidia-smi
!cat /proc/meminfo | grep MemAvailable
```
## Sharing Your Work

### Share Notebook
1. **File → Save a copy in Drive**
2. **Share → Get shareable link**
3. Set permissions to "Anyone with the link"

### Share Interface
```python
# Gradio creates public URLs automatically
gradio_interface.launch(share=True)
# Copy the public URL to share with others
```
## Colab Notebook Structure

```
SOAP_Note_Generator.ipynb
├── Setup & Installation
├── Model Loading
├── SOAP Generation Function
├── Interface Creation
│   ├── Gradio Web Interface
│   └── Jupyter Widgets
├── Example Cases
├── Launch Interface
└── Save/Export Functions
```
## Medical Disclaimer

> **IMPORTANT**: This tool is for **educational and research purposes only**
> - Not intended for actual clinical use
> - Always consult qualified healthcare professionals
> - Remove patient identifiers before processing
> - Comply with HIPAA and privacy regulations
## Getting Help

### In Colab:
1. Use `!pip list` to check installed packages
2. Check GPU with `!nvidia-smi`
3. Restart runtime if needed: `Runtime → Restart runtime`

### Common Commands:
```python
# Debug mode
import logging
logging.basicConfig(level=logging.DEBUG)

# Check versions
import torch, transformers, gradio as gr
print(f"Torch: {torch.__version__}")
print(f"Transformers: {transformers.__version__}")
print(f"Gradio: {gr.__version__}")
```
## Advanced Features

### Google Drive Integration
```python
# Mount Google Drive
from google.colab import drive
drive.mount('/content/drive')

# Save notebooks and results automatically
import shutil
shutil.copy('generated_soap_notes.txt', '/content/drive/MyDrive/')
```
### Scheduled Processing
```python
# Process notes at scheduled intervals
!pip install -q schedule
import schedule
import time

def scheduled_processing():
    # Your processing logic here
    pass

schedule.every(30).minutes.do(scheduled_processing)

# The schedule only fires while this loop is running
while True:
    schedule.run_pending()
    time.sleep(1)
```
---

**Ready to start generating professional SOAP notes in Google Colab!**

Click "Open in Colab" above and run all cells to get started immediately.