finetune-instance-segmentation-ade20k-mini-mask2former

This model was trained from scratch on an unspecified dataset (the dataset name was not recorded when the card was generated). It achieves the following results on the evaluation set (the metric names and the -1.0 entries are explained after the list):

  • Loss: 7.5549
  • Map: 1.0
  • Map 50: 1.0
  • Map 75: 1.0
  • Map Small: -1.0
  • Map Medium: -1.0
  • Map Large: 1.0
  • Mar 1: 1.0
  • Mar 10: 1.0
  • Mar 100: 1.0
  • Mar Small: -1.0
  • Mar Medium: -1.0
  • Mar Large: 1.0
  • Map Node 0: 1.0
  • Mar 100 Node 0: 1.0
  • Map Node 1: -1.0
  • Mar 100 Node 1: -1.0
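
These names match the output keys of torchmetrics' MeanAveragePrecision, which is very likely the evaluation metric used here: "Map" is mAP averaged over IoU thresholds 0.50:0.95, "Map 50" and "Map 75" are mAP at a single IoU threshold, "Mar N" is mean average recall with at most N detections per image, and "Node 0"/"Node 1" are the per-class entries. A value of -1.0 means that bucket (e.g. small or medium objects, or class "Node 1") contained no ground-truth instances, so the metric was not computed. A minimal sketch of how these keys are produced, with a made-up one-instance example:

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# iou_type="segm" evaluates instance masks rather than boxes;
# class_metrics=True adds the per-class entries ("Node 0"/"Node 1" above).
metric = MeanAveragePrecision(iou_type="segm", class_metrics=True)

# Toy prediction/target pair, invented for illustration: one 32x32 image
# with one predicted and one ground-truth instance of class 0.
mask = torch.zeros(1, 32, 32, dtype=torch.bool)
mask[0, 8:24, 8:24] = True

preds = [{"masks": mask, "scores": torch.tensor([0.9]), "labels": torch.tensor([0])}]
targets = [{"masks": mask, "labels": torch.tensor([0])}]

metric.update(preds, targets)
results = metric.compute()
# results holds map, map_50, map_75, map_small, ..., mar_1, mar_10, mar_100,
# plus map_per_class / mar_100_per_class; empty buckets are reported as -1.
print(results["map"], results["map_50"], results["map_per_class"])
```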

Model description

More information needed

Intended uses & limitations

More information needed
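
No usage details were provided, so here is a minimal inference sketch using the standard transformers Mask2Former API. The checkpoint id is taken from this card's title and the image path is a placeholder; adjust both to your setup.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

# Placeholder repo id taken from the card title; prefix it with the owner's
# namespace (e.g. "user/...") if loading from the Hub.
checkpoint = "finetune-instance-segmentation-ade20k-mini-mask2former"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder image

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Turn the raw class/mask logits into per-instance masks and labels.
result = processor.post_process_instance_segmentation(
    outputs, target_sizes=[image.size[::-1]]  # (height, width)
)[0]

print(result["segmentation"].shape)  # H x W map of instance ids
for segment in result["segments_info"]:
    label = model.config.id2label[segment["label_id"]]
    print(segment["id"], label, round(segment["score"], 3))
```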

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 128
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: constant
  • num_epochs: 40.0
  • mixed_precision_training: Native AMP
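
For reference, a sketch of the TrainingArguments that reproduce the settings above; output_dir is a placeholder, and anything not listed in the card is left at its default:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="finetune-instance-segmentation-ade20k-mini-mask2former",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=2,  # 64 x 2 = total train batch size of 128
    lr_scheduler_type="constant",
    num_train_epochs=40,
    seed=42,
    fp16=True,  # "Native AMP" mixed-precision training
)
```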

Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Node 0 | Mar 100 Node 0 | Map Node 1 | Mar 100 Node 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 12.0634 | 1.0 | 1 | 34.0342 | 0.0388 | 0.1429 | 0.0128 | -1.0 | -1.0 | 0.083 | 0.0 | 0.2 | 1.0 | -1.0 | -1.0 | 1.0 | 0.0388 | 1.0 | -1.0 | -1.0 |
| 14.6891 | 2.0 | 2 | 27.5947 | 0.0378 | 0.125 | 0.0159 | -1.0 | -1.0 | 0.0707 | 0.0 | 0.2 | 1.0 | -1.0 | -1.0 | 1.0 | 0.0378 | 1.0 | -1.0 | -1.0 |
| 12.1372 | 3.0 | 3 | 24.3069 | 0.0538 | 0.2 | 0.0172 | -1.0 | -1.0 | 0.225 | 0.0 | 0.2 | 1.0 | -1.0 | -1.0 | 1.0 | 0.0538 | 1.0 | -1.0 | -1.0 |
| 10.1768 | 4.0 | 4 | 21.7997 | 0.0325 | 0.0909 | 0.0179 | -1.0 | -1.0 | 0.0952 | 0.0 | 0.0 | 1.0 | -1.0 | -1.0 | 1.0 | 0.0325 | 1.0 | -1.0 | -1.0 |
| 10.9674 | 5.0 | 5 | 20.0476 | 0.0507 | 0.1667 | 0.0185 | -1.0 | -1.0 | 0.1833 | 0.0 | 0.3 | 1.0 | -1.0 | -1.0 | 1.0 | 0.0507 | 1.0 | -1.0 | -1.0 |
| 8.2576 | 6.0 | 6 | 19.0059 | 0.0582 | 0.25 | 0.0185 | -1.0 | -1.0 | 0.2054 | 0.0 | 0.2 | 1.0 | -1.0 | -1.0 | 1.0 | 0.0582 | 1.0 | -1.0 | -1.0 |
| 8.2583 | 7.0 | 7 | 18.1974 | 0.0629 | 0.3333 | 0.0204 | -1.0 | -1.0 | 0.195 | 0.0 | 0.2 | 1.0 | -1.0 | -1.0 | 1.0 | 0.0629 | 1.0 | -1.0 | -1.0 |
| 7.3192 | 8.0 | 8 | 17.3621 | 0.1302 | 0.5 | 0.0435 | -1.0 | -1.0 | 0.2841 | 0.0 | 0.2 | 1.0 | -1.0 | -1.0 | 1.0 | 0.1302 | 1.0 | -1.0 | -1.0 |
| 7.0464 | 9.0 | 9 | 16.5316 | 0.2514 | 1.0 | 0.0625 | -1.0 | -1.0 | 0.3417 | 0.2 | 0.2 | 1.0 | -1.0 | -1.0 | 1.0 | 0.2514 | 1.0 | -1.0 | -1.0 |
| 6.8925 | 10.0 | 10 | 15.5454 | 0.2157 | 1.0 | 0.0196 | -1.0 | -1.0 | 0.2333 | 0.2 | 0.2 | 1.0 | -1.0 | -1.0 | 1.0 | 0.2157 | 1.0 | -1.0 | -1.0 |
| 6.9519 | 11.0 | 11 | 14.5089 | 0.225 | 1.0 | 0.0312 | -1.0 | -1.0 | 0.2667 | 0.2 | 0.2 | 1.0 | -1.0 | -1.0 | 1.0 | 0.225 | 1.0 | -1.0 | -1.0 |
| 6.2326 | 12.0 | 12 | 13.8831 | 0.3733 | 1.0 | 0.2 | -1.0 | -1.0 | 0.4114 | 0.3 | 0.6 | 1.0 | -1.0 | -1.0 | 1.0 | 0.3733 | 1.0 | -1.0 | -1.0 |
| 6.1844 | 13.0 | 13 | 13.3770 | 0.6667 | 1.0 | 0.3333 | -1.0 | -1.0 | 0.6667 | 0.5 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 0.6667 | 1.0 | -1.0 | -1.0 |
| 5.7356 | 14.0 | 14 | 12.8322 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 5.466 | 15.0 | 15 | 12.5074 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 5.4641 | 16.0 | 16 | 11.8883 | 0.65 | 1.0 | 0.5 | -1.0 | -1.0 | 0.65 | 0.3 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 0.65 | 1.0 | -1.0 | -1.0 |
| 5.3664 | 17.0 | 17 | 11.4002 | 0.65 | 1.0 | 0.5 | -1.0 | -1.0 | 0.65 | 0.3 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 0.65 | 1.0 | -1.0 | -1.0 |
| 4.9014 | 18.0 | 18 | 10.9808 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 4.7852 | 19.0 | 19 | 10.7451 | 0.65 | 1.0 | 0.5 | -1.0 | -1.0 | 0.65 | 0.3 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 0.65 | 1.0 | -1.0 | -1.0 |
| 4.7773 | 20.0 | 20 | 10.5880 | 0.6167 | 1.0 | 0.3333 | -1.0 | -1.0 | 0.6167 | 0.4 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 0.6167 | 1.0 | -1.0 | -1.0 |
| 4.6423 | 21.0 | 21 | 10.3569 | 0.75 | 1.0 | 0.5 | -1.0 | -1.0 | 0.75 | 0.5 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 0.75 | 1.0 | -1.0 | -1.0 |
| 4.6973 | 22.0 | 22 | 10.0560 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 4.5107 | 23.0 | 23 | 9.9010 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 4.3641 | 24.0 | 24 | 9.8444 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 4.3039 | 25.0 | 25 | 9.7284 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 4.2061 | 26.0 | 26 | 9.4944 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 4.1906 | 27.0 | 27 | 9.3099 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.9988 | 28.0 | 28 | 9.0558 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.9956 | 29.0 | 29 | 8.9970 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.9154 | 30.0 | 30 | 8.8224 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.8152 | 31.0 | 31 | 8.6420 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.7358 | 32.0 | 32 | 8.4847 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.7624 | 33.0 | 33 | 8.4232 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.6491 | 34.0 | 34 | 8.2848 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.5853 | 35.0 | 35 | 8.0934 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.5897 | 36.0 | 36 | 8.1184 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.4895 | 37.0 | 37 | 7.9605 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.415 | 38.0 | 38 | 7.8289 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.3717 | 39.0 | 39 | 7.7094 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |
| 3.3056 | 40.0 | 40 | 7.5549 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 | 1.0 | 1.0 | 1.0 | -1.0 | -1.0 |

Framework versions

  • Transformers 4.47.0.dev0
  • Pytorch 2.3.0
  • Datasets 3.1.0
  • Tokenizers 0.20.3