ModernBERT-large fine-tuned as a spam classifier on the October 2024 dataset. Trained 19 February 2025.

Evaluation results at step 700 (~1.9 epochs):

{
  "epoch": 1.9073569482288828,
  "eval_eval/0_f1-score": 0.8833151581243184,
  "eval_eval/0_precision": 0.8920704845814978,
  "eval_eval/0_recall": 0.8747300215982722,
  "eval_eval/0_support": 463.0,
  "eval_eval/1_f1-score": 0.903863432165319,
  "eval_eval/1_precision": 0.8966131907308378,
  "eval_eval/1_recall": 0.9112318840579711,
  "eval_eval/1_support": 552.0,
  "eval_eval/accuracy": 0.8945812807881773,
  "eval_eval/macro avg_f1-score": 0.8935892951448187,
  "eval_eval/macro avg_precision": 0.8943418376561678,
  "eval_eval/macro avg_recall": 0.8929809528281216,
  "eval_eval/macro avg_support": 1015.0,
  "eval_eval/mps_allocated_gb": 4.776672768,
  "eval_eval/mps_reserved_gb": 27.440365568,
  "eval_eval/weighted avg_f1-score": 0.8944901800658279,
  "eval_eval/weighted avg_precision": 0.8945410006351291,
  "eval_eval/weighted avg_recall": 0.8945812807881773,
  "eval_eval/weighted avg_support": 1015.0,
  "eval_loss": 0.3420897126197815,
  "eval_runtime": 70.6993,
  "eval_samples_per_second": 14.357,
  "eval_steps_per_second": 1.796,
  "step": 700
}
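The aggregate numbers in the report above follow directly from the per-class scores. As a quick sanity check, the snippet below (values copied from the JSON) recomputes the macro average (unweighted mean over classes), the weighted average (mean weighted by class support), and the accuracy recovered from per-class recall:

```python
import math

# Per-class eval metrics copied from the JSON report above.
f1      = {0: 0.8833151581243184, 1: 0.903863432165319}
recall  = {0: 0.8747300215982722, 1: 0.9112318840579711}
support = {0: 463, 1: 552}
total = sum(support.values())  # 1015 eval samples

# Macro average: simple mean across the two classes.
macro_f1 = sum(f1.values()) / len(f1)

# Weighted average: each class weighted by its support.
weighted_f1 = sum(f1[c] * support[c] for c in f1) / total

# Correct predictions per class = recall * support; their sum over
# the total support gives the overall accuracy.
correct = sum(round(recall[c] * support[c]) for c in recall)
accuracy = correct / total

print(macro_f1, weighted_f1, accuracy)
```

All three reproduce the reported values (macro F1 ≈ 0.8936, weighted F1 ≈ 0.8945, accuracy ≈ 0.8946), which also implies 908 of the 1015 eval samples were classified correctly.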
Format: Safetensors · Model size: 396M params · Tensor type: F32