---
title: Submission Template
emoji: 🔥
colorFrom: yellow
colorTo: green
sdk: docker
pinned: false
---

## Model Description

This space is dedicated to the text task of the Frugal AI Challenge. The final model is Qwen2.5-3B-Instruct with LoRA adapters, fine-tuned on a diverse mix of approximately 95,000 real and synthetic samples.
The training dataset is open-sourced at MatthiasPicard/Frugal-AI-Train-Data-88k, and the fine-tuned model, along with its training logs, at MatthiasPicard/ModernBERT_frugal_88k.

To optimize inference time, the model is quantized to 8 bits, which reduces memory usage and improves throughput.
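
A minimal loading sketch is shown below. The base checkpoint id, the use of a sequence-classification head, and the pairing of the adapter repo with this base are assumptions and may differ from the actual submission script.

```python
# Sketch: load the base model in 8-bit and attach the LoRA adapters.
from transformers import AutoModelForSequenceClassification, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

BASE_MODEL = "Qwen/Qwen2.5-3B-Instruct"                # assumed Hub id of the base checkpoint
ADAPTER_REPO = "MatthiasPicard/ModernBERT_frugal_88k"  # fine-tuned adapters mentioned above

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForSequenceClassification.from_pretrained(
    BASE_MODEL,
    num_labels=8,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # 8-bit weights to cut memory use
    device_map="auto",
)
model = PeftModel.from_pretrained(model, ADAPTER_REPO)  # attach the LoRA adapters
model.eval()
```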

### Note

The inference script loads both the model and the tokenizer. As a result, the first evaluation of the model in the submission space consumes more energy than subsequent evaluations.

### Labels
0. No relevant claim detected
1. Global warming is not happening
2. Not caused by humans
3. Not bad or beneficial
4. Solutions harmful/unnecessary
5. Science is unreliable
6. Proponents are biased
7. Fossil fuels are needed
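
For illustration, a classification call might look like the sketch below; the `classify` helper is hypothetical and assumes the `model` and `tokenizer` objects from the loading sketch above, while the id-to-label mapping mirrors the list.

```python
# Hypothetical inference helper; assumes `model` and `tokenizer` are already loaded.
import torch

ID2LABEL = {
    0: "No relevant claim detected",
    1: "Global warming is not happening",
    2: "Not caused by humans",
    3: "Not bad or beneficial",
    4: "Solutions harmful/unnecessary",
    5: "Science is unreliable",
    6: "Proponents are biased",
    7: "Fossil fuels are needed",
}

def classify(text: str) -> str:
    inputs = tokenizer(text, return_tensors="pt", truncation=True).to(model.device)
    with torch.no_grad():
        logits = model(**inputs).logits
    return ID2LABEL[int(logits.argmax(dim=-1))]
```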