---
title: README
emoji: π’
colorFrom: blue
colorTo: purple
sdk: static
pinned: false
---
# Adaptive Classifier
A flexible, adaptive classification system that allows dynamic addition of new classes and continuous learning from examples. Built on top of Hugging Face Transformers, this library provides an easy-to-use interface for creating and updating text classifiers.
## Features

- Works with any transformer classifier model
- Continuous learning capabilities
- Dynamic class addition
- Safe and efficient state persistence
- Prototype-based learning
- Neural adaptation layer
## Installation

```bash
pip install adaptive-classifier
```
## Quick Start
```python
from adaptive_classifier import AdaptiveClassifier

# Initialize with any HuggingFace model
classifier = AdaptiveClassifier("bert-base-uncased")

# Add some examples
texts = [
    "The product works great!",
    "Terrible experience",
    "Neutral about this purchase"
]
labels = ["positive", "negative", "neutral"]

classifier.add_examples(texts, labels)

# Make predictions
predictions = classifier.predict("This is amazing!")
print(predictions)  # [('positive', 0.85), ('neutral', 0.12), ('negative', 0.03)]

# Save the classifier
classifier.save("./my_classifier")

# Load it later
loaded_classifier = AdaptiveClassifier.load("./my_classifier")
```
## Advanced Usage

### Adding New Classes Dynamically
```python
# Add a completely new class
new_texts = [
    "Error code 404 appeared",
    "System crashed after update"
]
new_labels = ["technical"] * 2

classifier.add_examples(new_texts, new_labels)
```
### Continuous Learning
```python
# Add more examples to existing classes
more_examples = [
    "Best purchase ever!",
    "Highly recommend this"
]
more_labels = ["positive"] * 2

classifier.add_examples(more_examples, more_labels)
```
## How It Works

The system combines three key components (a sketch of the prototype step follows this list):

1. **Transformer Embeddings**: Uses state-of-the-art language models for text representation
2. **Prototype Memory**: Maintains class prototypes for quick adaptation to new examples
3. **Adaptive Neural Layer**: Learns refined decision boundaries through continuous training
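The prototype step can be pictured as nearest-prototype classification over embeddings. The sketch below is illustrative only and assumes a generic sentence-level encoder; the `build_prototypes` and `predict_by_prototype` helpers are hypothetical and not part of the library's API.

```python
# Minimal sketch of prototype-based classification (illustrative, not the
# library's internal implementation). Embeddings are assumed to come from
# any sentence-level encoder.
import numpy as np

def build_prototypes(embeddings, labels):
    """Average each class's example embeddings to form its prototype."""
    prototypes = {}
    for label in set(labels):
        class_vecs = [e for e, l in zip(embeddings, labels) if l == label]
        prototypes[label] = np.mean(class_vecs, axis=0)
    return prototypes

def predict_by_prototype(query_embedding, prototypes):
    """Rank classes by cosine similarity between the query and each prototype."""
    labels = list(prototypes.keys())
    sims = np.array([
        np.dot(query_embedding, prototypes[lbl])
        / (np.linalg.norm(query_embedding) * np.linalg.norm(prototypes[lbl]) + 1e-8)
        for lbl in labels
    ])
    probs = np.exp(sims) / np.exp(sims).sum()  # softmax for a probability-like ranking
    return sorted(zip(labels, probs), key=lambda x: -x[1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    embs = [rng.standard_normal(8) for _ in range(6)]
    labs = ["positive", "positive", "negative", "negative", "neutral", "neutral"]
    protos = build_prototypes(embs, labs)
    print(predict_by_prototype(embs[0], protos))
```

In the library itself, these prototype scores are combined with the adaptive neural layer's output rather than used alone.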
## Requirements

- Python ≥ 3.8
- PyTorch ≥ 2.0
- transformers ≥ 4.30.0
- safetensors ≥ 0.3.1
- faiss-cpu ≥ 1.7.4 (or faiss-gpu for GPU support; see the lookup sketch below)
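faiss appears in the requirements because prototype lookup reduces to fast nearest-neighbor search over embedding vectors. The sketch below shows that kind of lookup with a flat inner-product index; the dimensions and random vectors are placeholders, and this is not the library's internal code.

```python
# Illustrative nearest-prototype lookup with FAISS (placeholder data,
# not the library's internals).
import faiss
import numpy as np

dim = 768                                  # e.g. bert-base hidden size
index = faiss.IndexFlatIP(dim)             # inner product == cosine on normalized vectors

prototypes = np.random.rand(3, dim).astype("float32")  # one row per class prototype
faiss.normalize_L2(prototypes)
index.add(prototypes)

query = np.random.rand(1, dim).astype("float32")
faiss.normalize_L2(query)
scores, class_ids = index.search(query, 3)             # top-3 closest class prototypes
print(class_ids, scores)
```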
## Benefits of Adaptive Classification in LLM Routing
We evaluate the effectiveness of adaptive classification in optimizing LLM routing decisions. Using the arena-hard-auto-v0.1 dataset with 500 queries, we compared routing performance with and without adaptation while maintaining consistent overall success rates.
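In this setup the classifier predicts a route label ("high" or "low") for each incoming query. The sketch below shows how such a router might be wired up with the library's public API; the training queries and model names are placeholders, not the evaluation's actual configuration.

```python
from adaptive_classifier import AdaptiveClassifier

# Train a router on queries labeled by the model tier they needed
# (queries and labels here are placeholders).
router = AdaptiveClassifier("bert-base-uncased")
router.add_examples(
    [
        "Prove that the sum of two convex functions is convex",
        "Design a migration plan from a monolith to microservices",
        "What is the capital of France?",
        "Translate 'good morning' to Spanish",
    ],
    ["high", "high", "low", "low"],
)

def route(query: str) -> str:
    """Pick a model tier for a query; fall back to the cheaper model."""
    top_label, _score = router.predict(query)[0]
    return "high-cost-model" if top_label == "high" else "low-cost-model"

print(route("Summarize this paragraph in one sentence."))
```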
### Key Results
| Metric | Without Adaptation | With Adaptation | Impact |
|---|---|---|---|
| High Model Routes | 113 (22.6%) | 98 (19.6%) | 0.87x |
| Low Model Routes | 387 (77.4%) | 402 (80.4%) | 1.04x |
| High Model Success Rate | 40.71% | 29.59% | 0.73x |
| Low Model Success Rate | 16.54% | 20.15% | 1.22x |
| Overall Success Rate | 22.00% | 22.00% | 1.00x |
| Cost Savings* | 25.60% | 32.40% | 1.27x |
*The cost savings calculation assumes the high-cost model is 2x the cost of the low-cost model.
### Analysis
The results highlight several key benefits of adaptive classification:
1. **Improved Cost Efficiency**: While maintaining the same overall success rate (22%), the adaptive classifier achieved 32.40% cost savings compared to 25.60% without adaptation - a 1.27x relative improvement in cost efficiency.
2. **Better Resource Utilization**: The adaptive system routed more queries to the low-cost model (402 vs. 387) while reducing high-cost model usage (98 vs. 113), demonstrating better resource allocation.
3. **Learning from Experience**: Through adaptation, the system improved the success rate of low-model routes from 16.54% to 20.15% (a 1.22x increase), showing effective learning from successful cases.
4. **ROI on Adaptation**: The system adapted to 110 new examples during evaluation, improving cost savings by 6.80 percentage points while maintaining quality - a significant return on the adaptation investment. A sketch of this feedback loop follows the list.
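The adaptation itself can be pictured as a feedback loop that turns observed routing outcomes into new training examples for a router like the one sketched earlier. The success signal and relabeling policy below are assumptions for illustration, not the evaluation's actual logic.

```python
def update_router(router, query, routed_label, was_successful):
    """Hypothetical feedback step: fold an observed outcome back into the router."""
    if was_successful:
        # The chosen tier handled the query: reinforce that routing decision.
        router.add_examples([query], [routed_label])
    elif routed_label == "low":
        # The cheap model failed: teach the router this query needs more capacity.
        router.add_examples([query], ["high"])
```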
This real-world evaluation demonstrates that adaptive classification can significantly improve cost efficiency in LLM routing without compromising overall performance.
## References
- RouteLLM: Learning to Route LLMs with Preference Data
- Transformer^2: Self-adaptive LLMs
- Lamini Classifier Agent Toolkit
- Protoformer: Embedding Prototypes for Transformers
- Overcoming catastrophic forgetting in neural networks
## Citation
If you use this library in your research, please cite:
```bibtex
@software{adaptive_classifier,
  title = {Adaptive Classifier: Dynamic Text Classification with Continuous Learning},
  author = {Asankhaya Sharma},
  year = {2025},
  publisher = {GitHub},
  url = {https://github.com/codelion/adaptive-classifier}
}
```