---
title: Submission Template
emoji: 🔥
colorFrom: yellow
colorTo: green
sdk: docker
pinned: false
---
## Model Description
This Space is dedicated to the text task of the Frugal AI Challenge. The final model is Qwen2.5-3B-Instruct with LoRA adapters, trained on a diverse mix of approximately 95,000 real and synthetic samples.

The dataset is open-sourced at MatthiasPicard/Frugal-AI-Train-Data-88k. The fine-tuned model, along with its training logs, is open-sourced at MatthiasPicard/ModernBERT_frugal_88k.
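For reference, a minimal sketch of loading the released training data with the `datasets` library is shown below; the split and column names are assumptions, not guarantees.

```python
from datasets import load_dataset

# Dataset repo id as named in this README.
ds = load_dataset("MatthiasPicard/Frugal-AI-Train-Data-88k")

print(ds)              # inspect the available splits and features
print(ds["train"][0])  # assumes a "train" split exists
```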
To optimize inference time, the model was quantized to 8 bits, reducing memory usage and speeding up inference.
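As an illustration only (not necessarily the exact code used in this Space), loading the base model in 8-bit with bitsandbytes and attaching the LoRA adapters could look like the sketch below; the base-model id and adapter repo id follow this README, and the loading details are assumptions.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

BASE_MODEL = "Qwen/Qwen2.5-3B-Instruct"
ADAPTERS = "MatthiasPicard/ModernBERT_frugal_88k"  # adapter repo named in this README

# 8-bit quantization via bitsandbytes to cut memory usage and speed up inference.
bnb_config = BitsAndBytesConfig(load_in_8bit=True)

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL,
    quantization_config=bnb_config,
    device_map="auto",
)
model = PeftModel.from_pretrained(model, ADAPTERS)  # attach the LoRA adapters
model.eval()
```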
### Note: The inference script includes both model and tokenizer loading. As a result, the first evaluation of our model in the submission space will consume more energy than subsequent evaluations.
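A hypothetical way to structure this is to cache the loading step so that only the first evaluation pays its cost; the helper below is a sketch, not the Space's actual code.

```python
from functools import lru_cache

from transformers import AutoModelForCausalLM, AutoTokenizer

@lru_cache(maxsize=1)
def get_model_and_tokenizer():
    # Expensive loading happens only on the first call (hence the extra energy
    # measured for the first evaluation); later calls reuse the cached objects.
    tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-3B-Instruct")
    model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-3B-Instruct")
    return model, tokenizer
```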
### Labels
0. No relevant claim detected
1. Global warming is not happening
2. Not caused by humans
3. Not bad or beneficial
4. Solutions harmful/unnecessary
5. Science is unreliable
6. Proponents are biased
7. Fossil fuels are needed
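A simple lookup table for these ids (purely illustrative, using the display names above rather than the dataset's internal label strings) could look like this:

```python
# Illustrative id-to-description mapping for the eight classes listed above.
ID2LABEL = {
    0: "No relevant claim detected",
    1: "Global warming is not happening",
    2: "Not caused by humans",
    3: "Not bad or beneficial",
    4: "Solutions harmful/unnecessary",
    5: "Science is unreliable",
    6: "Proponents are biased",
    7: "Fossil fuels are needed",
}
LABEL2ID = {name: idx for idx, name in ID2LABEL.items()}
```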
|
|