---
license: apache-2.0
title: NegaBot
sdk: docker
emoji: π
colorFrom: blue
colorTo: pink
short_description: 'Product Negative Reviews Detection & Analysis API'
---
# NegaBot API

Tweet sentiment classification using the SmolLM 360M V2 model.

NegaBot is a sentiment analysis API that detects positive and negative sentiment in tweets, with a particular focus on product criticism. Built with FastAPI and the `jatinmehra/NegaBot-Product-Criticism-Catcher` model.
## Features

- Advanced AI Model: Uses a fine-tuned SmolLM 360M V2 for accurate sentiment classification. Trained on real tweet data, it can detect negative comments even when they are phrased with sarcasm.
- Fast API: RESTful API built with FastAPI for high-performance predictions
- Data Logging: SQLite database for storing and analyzing predictions
- Batch Processing: Support for single and batch predictions
- Built-in Dashboard: HTML analytics dashboard with charts
- Data Export: Download predictions as CSV or JSON
## Quick Start

1. **Install dependencies**

   ```bash
   pip install -r requirements.txt
   ```

2. **Start the API**

   ```bash
   uvicorn api:app --host 0.0.0.0 --port 8000
   ```

3. **Access the services**

   - API documentation: http://localhost:8000/docs
   - Analytics dashboard: http://localhost:8000/dashboard
## Usage Examples

### API Usage

#### Single Prediction

```bash
curl -X POST "http://localhost:8000/predict" \
  -H "Content-Type: application/json" \
  -d '{"text": "This product is amazing! Best purchase ever!"}'
```
#### Batch Prediction

```bash
curl -X POST "http://localhost:8000/batch_predict" \
  -H "Content-Type: application/json" \
  -d '{
        "tweets": [
          "Amazing product, highly recommend!",
          "Terrible quality, waste of money",
          "Its okay, nothing special"
        ]
      }'
```
### Python Client Example

```python
import requests

# Single prediction
response = requests.post(
    "http://localhost:8000/predict",
    json={"text": "This product broke after one week!"},
)
result = response.json()
print(f"Sentiment: {result['sentiment']} (Confidence: {result['confidence']:.2%})")

# Batch prediction
response = requests.post(
    "http://localhost:8000/batch_predict",
    json={
        "tweets": [
            "Love this product!",
            "Terrible experience",
            "Pretty decent quality",
        ]
    },
)
results = response.json()
for result in results["results"]:
    print(f"'{result['text']}' -> {result['sentiment']}")
```
### Model Usage (Direct)

```python
from model import NegaBotModel

# Initialize the model
model = NegaBotModel()

# Single prediction
result = model.predict("This product is awful and broke within a week!")
print(f"Sentiment: {result['sentiment']}")
print(f"Confidence: {result['confidence']:.2%}")
print(f"Probabilities: {result['probabilities']}")

# Batch prediction
texts = [
    "Amazing quality, highly recommend!",
    "Terrible customer service",
    "Pretty good value for money",
]
results = model.batch_predict(texts)
for result in results:
    print(f"{result['text']} -> {result['sentiment']}")
```
## API Endpoints

| Endpoint | Method | Description |
|---|---|---|
| `/` | GET | API information and available endpoints |
| `/health` | GET | Health check and model status |
| `/predict` | POST | Single tweet sentiment prediction |
| `/batch_predict` | POST | Batch tweet sentiment prediction |
| `/stats` | GET | Prediction statistics and analytics |
| `/dashboard` | GET | HTML analytics dashboard |
| `/dashboard/data` | GET | Dashboard data as JSON |
| `/download/predictions.csv` | GET | Download predictions as CSV |
| `/download/predictions.json` | GET | Download predictions as JSON |
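As an illustration of how these endpoints fit together, a thin Python wrapper might look like the sketch below. The `NegaBotClient` class and its method names are hypothetical conveniences, not part of the project; only the endpoint paths come from the table above.

```python
import requests


class NegaBotClient:
    """Hypothetical convenience wrapper around the NegaBot API endpoints."""

    def __init__(self, base_url="http://localhost:8000"):
        self.base_url = base_url.rstrip("/")

    def url(self, path):
        # Build a full URL for one of the documented endpoint paths.
        return f"{self.base_url}/{path.lstrip('/')}"

    def predict(self, text):
        # POST /predict with a single tweet.
        return requests.post(self.url("/predict"), json={"text": text}).json()

    def batch_predict(self, tweets):
        # POST /batch_predict with a list of tweets.
        return requests.post(self.url("/batch_predict"), json={"tweets": tweets}).json()

    def stats(self):
        # GET /stats for aggregate prediction statistics.
        return requests.get(self.url("/stats")).json()
```

With the API running locally, `NegaBotClient().predict("Great product!")` would return the same JSON shown in the curl examples above.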
## Request/Response Schemas

### Predict Request

```json
{
  "text": "string (1-1000 chars)",
  "metadata": {
    "optional": "metadata object"
  }
}
```

### Predict Response

```json
{
  "text": "input text",
  "sentiment": "Positive|Negative",
  "confidence": 0.95,
  "predicted_class": 0,
  "probabilities": {
    "positive": 0.95,
    "negative": 0.05
  },
  "timestamp": "2024-01-01T12:00:00"
}
```
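A client can sanity-check a response against this schema before using it. The `check_response` helper below is a hypothetical client-side sketch, not something the API ships:

```python
def check_response(payload):
    """Sanity-check a /predict response against the schema above.

    Hypothetical helper for illustration only.
    """
    required = {"text", "sentiment", "confidence", "predicted_class",
                "probabilities", "timestamp"}
    if not required.issubset(payload):
        return False
    if payload["sentiment"] not in ("Positive", "Negative"):
        return False
    probs = payload["probabilities"]
    # The two class probabilities should sum to ~1.
    return abs(probs["positive"] + probs["negative"] - 1.0) < 1e-6


sample = {
    "text": "input text",
    "sentiment": "Positive",
    "confidence": 0.95,
    "predicted_class": 0,
    "probabilities": {"positive": 0.95, "negative": 0.05},
    "timestamp": "2024-01-01T12:00:00",
}
print(check_response(sample))  # True
```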
## Dashboard Features
The built-in analytics dashboard provides:
- Real-time Metrics: Total predictions, sentiment distribution, average confidence
- Interactive Charts: Pie charts showing sentiment distribution
- Recent Predictions: View latest prediction results
- Data Export: Download prediction data as CSV or JSON
- Auto-refresh: View updated statistics as new predictions are made
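Metrics like these can be reproduced from the logged predictions. The sketch below assumes records with `sentiment` and `confidence` keys, mirroring the predictions table; the actual `/dashboard/data` payload may be shaped differently.

```python
from collections import Counter


def dashboard_metrics(records):
    """Aggregate logged predictions into dashboard-style metrics.

    Illustrative only; the real dashboard endpoint may differ.
    """
    if not records:
        return {"total": 0, "distribution": {}, "avg_confidence": None}
    distribution = Counter(r["sentiment"] for r in records)
    avg_conf = sum(r["confidence"] for r in records) / len(records)
    return {
        "total": len(records),
        "distribution": dict(distribution),
        "avg_confidence": round(avg_conf, 4),
    }


rows = [
    {"sentiment": "Positive", "confidence": 0.95},
    {"sentiment": "Negative", "confidence": 0.88},
    {"sentiment": "Negative", "confidence": 0.91},
]
print(dashboard_metrics(rows))
```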
## Testing

Test the API using the interactive documentation at http://localhost:8000/docs, or use curl commands as shown in the usage examples above.
## Project Structure

```
NegaBot-API/
├── api.py                  # FastAPI application
├── model.py                # NegaBot model wrapper
├── database.py             # SQLite database and logging
├── requirements.txt        # Python dependencies
├── Dockerfile              # Docker configuration
├── README.md               # This file
└── negabot_predictions.db  # Database (created at runtime)
```
## Configuration

The API runs on port 8000 by default. You can modify the host and port by updating the uvicorn command:

```bash
uvicorn api:app --host 127.0.0.1 --port 8080
```
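An alternative pattern is to read the host and port from environment variables so the launch command never needs editing. This is a sketch only; `NEGABOT_HOST` and `NEGABOT_PORT` are illustrative names, not variables the project actually reads.

```python
import os

# Hypothetical env-driven configuration (not part of the project):
# fall back to the documented defaults when the variables are unset.
HOST = os.environ.get("NEGABOT_HOST", "0.0.0.0")
PORT = int(os.environ.get("NEGABOT_PORT", "8000"))

print(f"Serving on {HOST}:{PORT}")
```

Such values could then be passed to `uvicorn.run("api:app", host=HOST, port=PORT)` in place of the command-line flags.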
## Model Information

- Model: `jatinmehra/NegaBot-Product-Criticism-Catcher`
- Base architecture: SmolLM 360M V2
- Task: Binary sentiment classification
- Classes:
  - 0: Positive sentiment
  - 1: Negative sentiment (criticism/complaints)
- Input: Text (max 512 tokens)
- Output: Sentiment label + confidence scores
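As a sketch of how a two-class model's raw logits become the sentiment label and confidence score in the response, a softmax over the two classes looks like this. The class mapping comes from the list above; the function itself is illustrative, not the project's actual wrapper code.

```python
import math

# Class mapping documented above.
LABELS = {0: "Positive", 1: "Negative"}


def logits_to_prediction(logits):
    """Turn raw two-class logits into a label, confidence, and probabilities.

    Illustrative softmax; the actual model wrapper may differ in detail.
    """
    # Numerically stable softmax over the two logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    predicted_class = probs.index(max(probs))
    return {
        "sentiment": LABELS[predicted_class],
        "predicted_class": predicted_class,
        "probabilities": {"positive": probs[0], "negative": probs[1]},
        "confidence": max(probs),
    }


print(logits_to_prediction([2.0, -1.0])["sentiment"])  # Positive
```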
## Performance Considerations
- Memory Requirements: Model requires ~2GB RAM minimum
- API Scaling: Use multiple worker processes with Gunicorn for production
- Database: Current SQLite setup is suitable for development and small-scale production
## Logging and Monitoring

### Database Schema
```sql
CREATE TABLE predictions (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    text TEXT NOT NULL,
    sentiment TEXT NOT NULL,
    confidence REAL NOT NULL,
    predicted_class INTEGER NOT NULL,
    timestamp TEXT NOT NULL,
    metadata TEXT,
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP
);
```
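The schema can be exercised directly with Python's built-in `sqlite3` module. The snippet below uses an in-memory database so nothing touches `negabot_predictions.db`; the sample row and the final query are illustrative, not the project's actual logging code.

```python
import sqlite3

# Create the predictions table in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE predictions (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        text TEXT NOT NULL,
        sentiment TEXT NOT NULL,
        confidence REAL NOT NULL,
        predicted_class INTEGER NOT NULL,
        timestamp TEXT NOT NULL,
        metadata TEXT,
        created_at DATETIME DEFAULT CURRENT_TIMESTAMP
    )
""")

# Log one prediction, as the API does after each /predict call.
conn.execute(
    "INSERT INTO predictions (text, sentiment, confidence, predicted_class, timestamp) "
    "VALUES (?, ?, ?, ?, ?)",
    ("Terrible quality", "Negative", 0.91, 1, "2024-01-01T12:00:00"),
)
conn.commit()

# Query the sentiment distribution, as a /stats-style endpoint might.
rows = conn.execute(
    "SELECT sentiment, COUNT(*) FROM predictions GROUP BY sentiment"
).fetchall()
print(rows)  # [('Negative', 1)]
```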
### Log Files
- Application logs: Console output
- Prediction logs: SQLite database
- Access logs: Uvicorn/Gunicorn logs
## Contributing

1. Fork the repository
2. Create a feature branch
3. Add tests for new features
4. Ensure all tests pass
5. Submit a pull request
## License

This project is licensed under the Apache-2.0 License - see the LICENSE file for details.
## Troubleshooting

### Common Issues

**Model Loading Errors**
- Ensure an internet connection is available for downloading the model
- Check disk space (the model is ~1.5GB)
- Verify the transformers library version

**Port Conflicts**
- Check whether port 8000 is already in use
- Change the port via the uvicorn command-line arguments

**Database Permissions**
- Ensure write permissions in the project directory
- Check the SQLite installation

**Memory Issues**
- The model requires ~2GB RAM minimum
- Consider CPU-only inference on smaller systems
Built with FastAPI and the powerful NegaBot model.
Model used in this app: https://github.com/Jatin-Mehra119/NegaBot-Product-Criticism-Catcher