jatinmehra committed on
Commit
03fce4f
·
1 Parent(s): edab8ad

update README.md with detailed usage examples, API endpoints, and model information for NegaBot API

Files changed (1)
  1. README.md +260 -9
README.md CHANGED
@@ -1,11 +1,262 @@
  ---
- title: NegaBot API
- emoji: 🔥
- colorFrom: purple
- colorTo: indigo
- sdk: docker
- pinned: false
- license: apache-2.0
- ---

- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
 
 
+ # NegaBot API
+
+ **Tweet Sentiment Classification using the SmolLM 360M V2 Model**
+
+ NegaBot is a sentiment analysis API that detects positive and negative sentiment in tweets, with a particular focus on product criticism. Built with FastAPI and the `jatinmehra/NegaBot-Product-Criticism-Catcher` model.
+
+ ## Features
+
+ - **Advanced AI Model**: Uses a fine-tuned SmolLM 360M V2 for accurate sentiment classification; trained on real tweet data, it can detect negative comments even when they are sarcastic.
+ - **Fast API**: RESTful API built with FastAPI for high-performance predictions
+ - **Data Logging**: SQLite database for storing and analyzing predictions
+ - **Batch Processing**: Support for single and batch predictions
+ - **Built-in Dashboard**: HTML analytics dashboard with charts
+ - **Data Export**: Download predictions as CSV or JSON
+
+ ## Quick Start
+
+ 1. **Install Dependencies**
+ ```bash
+ pip install -r requirements.txt
+ ```
+
+ 2. **Start the API**
+ ```bash
+ uvicorn api:app --host 0.0.0.0 --port 8000
+ ```
+
+ 3. **Access the Services**
+ - API Documentation: http://localhost:8000/docs
+ - Analytics Dashboard: http://localhost:8000/dashboard
+
+ ## Usage Examples
+
+ ### API Usage
+
+ #### Single Prediction
+ ```bash
+ curl -X POST "http://localhost:8000/predict" \
+     -H "Content-Type: application/json" \
+     -d '{"text": "This product is amazing! Best purchase ever!"}'
+ ```
+
+ #### Batch Prediction
+ ```bash
+ curl -X POST "http://localhost:8000/batch_predict" \
+     -H "Content-Type: application/json" \
+     -d '{
+         "tweets": [
+             "Amazing product, highly recommend!",
+             "Terrible quality, waste of money",
+             "Its okay, nothing special"
+         ]
+     }'
+ ```
+
+ #### Python Client Example
+ ```python
+ import requests
+
+ # Single prediction
+ response = requests.post(
+     "http://localhost:8000/predict",
+     json={"text": "This product broke after one week!"}
+ )
+ result = response.json()
+ print(f"Sentiment: {result['sentiment']} (Confidence: {result['confidence']:.2%})")
+
+ # Batch prediction
+ response = requests.post(
+     "http://localhost:8000/batch_predict",
+     json={
+         "tweets": [
+             "Love this product!",
+             "Terrible experience",
+             "Pretty decent quality"
+         ]
+     }
+ )
+ results = response.json()
+ for result in results['results']:
+     print(f"'{result['text']}' -> {result['sentiment']}")
+ ```
+
84
+ ### Model Usage (Direct)
85
+
86
+ ```python
87
+ from model import NegaBotModel
88
+
89
+ # Initialize model
90
+ model = NegaBotModel()
91
+
92
+ # Single prediction
93
+ result = model.predict("This product is awful and broke within a week!")
94
+ print(f"Sentiment: {result['sentiment']}")
95
+ print(f"Confidence: {result['confidence']:.2%}")
96
+ print(f"Probabilities: {result['probabilities']}")
97
+
98
+ # Batch prediction
99
+ texts = [
100
+ "Amazing quality, highly recommend!",
101
+ "Terrible customer service",
102
+ "Pretty good value for money"
103
+ ]
104
+ results = model.batch_predict(texts)
105
+ for result in results:
106
+ print(f"{result['text']} -> {result['sentiment']}")
107
+ ```
108
+
109
+ ## API Endpoints
110
+
111
+ | Endpoint | Method | Description |
112
+ |----------|--------|-------------|
113
+ | `/` | GET | API information and available endpoints |
114
+ | `/health` | GET | Health check and model status |
115
+ | `/predict` | POST | Single tweet sentiment prediction |
116
+ | `/batch_predict` | POST | Batch tweet sentiment prediction |
117
+ | `/stats` | GET | Prediction statistics and analytics |
118
+ | `/dashboard` | GET | HTML analytics dashboard |
119
+ | `/dashboard/data` | GET | Dashboard data as JSON |
120
+ | `/download/predictions.csv` | GET | Download predictions as CSV |
121
+ | `/download/predictions.json` | GET | Download predictions as JSON |
122
+
123
+ ### Request/Response Schemas
124
+
125
+ #### Predict Request
126
+ ```json
127
+ {
128
+ "text": "string (1-1000 chars)",
129
+ "metadata": {
130
+ "optional": "metadata object"
131
+ }
132
+ }
133
+ ```
134
+
135
+ #### Predict Response
136
+ ```json
137
+ {
138
+ "text": "input text",
139
+ "sentiment": "Positive|Negative",
140
+ "confidence": 0.95,
141
+ "predicted_class": 0,
142
+ "probabilities": {
143
+ "positive": 0.95,
144
+ "negative": 0.05
145
+ },
146
+ "timestamp": "2024-01-01T12:00:00"
147
+ }
148
+ ```
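A client can act on these fields directly. The sketch below is illustrative only: the response values and the 0.8 threshold are assumptions for the example, not part of the API.

```python
# Example /predict response, following the schema documented above
# (values are illustrative, not real API output).
response = {
    "text": "This product broke after one week!",
    "sentiment": "Negative",
    "confidence": 0.95,
    "predicted_class": 1,
    "probabilities": {"positive": 0.05, "negative": 0.95},
    "timestamp": "2024-01-01T12:00:00",
}

# Treat a tweet as actionable criticism only when the model is
# confidently negative; 0.8 is an arbitrary example threshold.
THRESHOLD = 0.8
is_criticism = (
    response["sentiment"] == "Negative"
    and response["confidence"] >= THRESHOLD
)
print(is_criticism)  # True
```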
+
+ ## Dashboard Features
+
+ The built-in analytics dashboard provides:
+
+ - **Real-time Metrics**: Total predictions, sentiment distribution, average confidence
+ - **Interactive Charts**: Pie charts showing sentiment distribution
+ - **Recent Predictions**: View latest prediction results
+ - **Data Export**: Download prediction data as CSV or JSON
+ - **Auto-refresh**: View updated statistics as new predictions are made
+
+ ## Testing
+
+ Test the API using the interactive documentation at http://localhost:8000/docs, or use curl commands as shown in the usage examples above.
+
+ ## Project Structure
+
+ ```
+ NegaBot-API/
+ ├── api.py                  # FastAPI application
+ ├── model.py                # NegaBot model wrapper
+ ├── database.py             # SQLite database and logging
+ ├── requirements.txt        # Python dependencies
+ ├── Dockerfile              # Docker configuration
+ ├── README.md               # This file
+ └── negabot_predictions.db  # Database (created at runtime)
+ ```
+
+ ## Configuration
+
+ The API runs on port 8000 by default. You can modify the host and port by updating the uvicorn command:
+
+ ```bash
+ uvicorn api:app --host 127.0.0.1 --port 8080
+ ```
+
+ ## Model Information
+
+ - **Model**: `jatinmehra/NegaBot-Product-Criticism-Catcher`
+ - **Base Architecture**: SmolLM 360M V2
+ - **Task**: Binary sentiment classification
+ - **Classes**:
+     - 0: Positive sentiment
+     - 1: Negative sentiment (criticism/complaints)
+ - **Input**: Text (max 512 tokens)
+ - **Output**: Sentiment label + confidence scores
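The class convention above can be captured as a small mapping; a minimal sketch (the `ID2LABEL` name and `label_for` helper are ours for illustration, not part of the codebase):

```python
# Class-id to label mapping implied by the model information above.
ID2LABEL = {0: "Positive", 1: "Negative"}

def label_for(predicted_class: int) -> str:
    """Translate the model's raw class id into a sentiment label."""
    return ID2LABEL[predicted_class]

print(label_for(1))  # Negative
```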
+
+ ### Performance Considerations
+
+ - **Memory Requirements**: Model requires ~2GB RAM minimum
+ - **API Scaling**: Use multiple worker processes with Gunicorn for production
+ - **Database**: Current SQLite setup is suitable for development and small-scale production
+
+ ## Logging and Monitoring
+
+ ### Database Schema
+
+ ```sql
+ CREATE TABLE predictions (
+     id INTEGER PRIMARY KEY AUTOINCREMENT,
+     text TEXT NOT NULL,
+     sentiment TEXT NOT NULL,
+     confidence REAL NOT NULL,
+     predicted_class INTEGER NOT NULL,
+     timestamp TEXT NOT NULL,
+     metadata TEXT,
+     created_at DATETIME DEFAULT CURRENT_TIMESTAMP
+ );
+ ```
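The schema can be exercised locally with Python's built-in `sqlite3` module. A minimal sketch, using an in-memory database and illustrative row values (the real app writes to `negabot_predictions.db`):

```python
import sqlite3

# Create the predictions table (schema as above) in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE predictions (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        text TEXT NOT NULL,
        sentiment TEXT NOT NULL,
        confidence REAL NOT NULL,
        predicted_class INTEGER NOT NULL,
        timestamp TEXT NOT NULL,
        metadata TEXT,
        created_at DATETIME DEFAULT CURRENT_TIMESTAMP
    )
""")

# Log one illustrative prediction.
conn.execute(
    "INSERT INTO predictions (text, sentiment, confidence, predicted_class, timestamp)"
    " VALUES (?, ?, ?, ?, ?)",
    ("Terrible quality, waste of money", "Negative", 0.97, 1, "2024-01-01T12:00:00"),
)

# Aggregate counts per sentiment -- the kind of query /stats might run.
rows = conn.execute(
    "SELECT sentiment, COUNT(*) FROM predictions GROUP BY sentiment"
).fetchall()
print(rows)  # [('Negative', 1)]
```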
+
+ ### Log Files
+
+ - Application logs: Console output
+ - Prediction logs: SQLite database
+ - Access logs: Uvicorn/Gunicorn logs
+
+ ## Contributing
+
+ 1. Fork the repository
+ 2. Create a feature branch
+ 3. Add tests for new features
+ 4. Ensure all tests pass
+ 5. Submit a pull request
+
+ ## License
+
+ This project is licensed under the Apache-2.0 License - see the [LICENSE](LICENSE) file for details.
+
+ ## Troubleshooting
+
+ ### Common Issues
+
+ 1. **Model Loading Errors**
+     - Ensure an internet connection for downloading the model
+     - Check disk space (the model is ~1.5GB)
+     - Verify the transformers library version
+
+ 2. **Port Conflicts**
+     - Change ports using command line arguments
+     - Check if port 8000 is already in use
+
+ 3. **Database Permissions**
+     - Ensure write permissions in the project directory
+     - Check SQLite installation
+
+ 4. **Memory Issues**
+     - The model requires ~2GB RAM minimum
+     - Consider using CPU-only inference on smaller systems
+
  ---

+ **Built with FastAPI and the powerful NegaBot model.**
+
+ Model used in this app: https://github.com/Jatin-Mehra119/NegaBot-Product-Criticism-Catcher