
Dummy Model for Lab4

This model is a fine-tuned version of bert-base-uncased on the SST-2 dataset.

Results on the evaluation set:

Accuracy: 0.64

This model was fine-tuned for personal research use, on 100 randomly selected training examples and 100 randomly selected evaluation examples from the SST-2 dataset.
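The card does not include the subsetting code, but the described step (a seeded random draw of 100 examples each for training and evaluation) can be sketched as follows. The helper name `sample_indices` and the use of Python's `random` module are assumptions; the seeds are the ones listed under the training hyperparameters below, and the split sizes are those of GLUE SST-2 (67,349 train / 872 validation rows).

```python
import random

TRAIN_SEED = 49282927487  # seed of training dataset (from the card)
EVAL_SEED = 492829487     # seed of evaluation dataset (from the card)

def sample_indices(n_total, n_sample, seed):
    """Deterministically pick n_sample distinct indices out of n_total."""
    rng = random.Random(seed)
    return rng.sample(range(n_total), n_sample)

# Hypothetical subset selection; SST-2 has 67,349 train and 872 validation rows
train_idx = sample_indices(67349, 100, TRAIN_SEED)
eval_idx = sample_indices(872, 100, EVAL_SEED)
print(len(train_idx), len(eval_idx))  # → 100 100
```

Fixing both seeds makes the 100-example subsets reproducible across runs.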

Evaluation

```python
import numpy as np
import evaluate

# Predict on the held-out evaluation subset
predictions = trainer.predict(Resrt_eval)
print(predictions.predictions.shape, predictions.label_ids.shape)

# Convert logits to class predictions
preds = np.argmax(predictions.predictions, axis=-1)

metric = evaluate.load("glue", "sst2")
metric.compute(predictions=preds, references=predictions.label_ids)
```
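The accuracy that the GLUE SST-2 metric reports is simple agreement between the argmax predictions and the labels. A self-contained sketch with made-up logits (not outputs of this model) that mirrors the computation above:

```python
import numpy as np

# Dummy logits for 5 examples, shape (5, 2) — hypothetical values
logits = np.array([[0.2, 0.8],
                   [0.9, 0.1],
                   [0.4, 0.6],
                   [0.7, 0.3],
                   [0.1, 0.9]])
labels = np.array([1, 0, 1, 1, 1])

preds = np.argmax(logits, axis=-1)          # class with the highest score per row
accuracy = float((preds == labels).mean())  # same value metric.compute reports
print(accuracy)  # → 0.8
```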

Training hyperparameters

The following hyperparameters were used during training:

learning_rate: unset
train_batch_size: unset
eval_batch_size: unset
seed of training dataset: 49282927487
seed of evaluation dataset: 492829487
lr_scheduler_type: linear
num_epochs: 3.0
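With `lr_scheduler_type: linear`, the learning rate decays linearly from its initial value to zero over the run. A minimal sketch of that schedule, assuming no warmup, transformers' default learning rate of 5e-5 (the card leaves it unset), and a hypothetical batch size of 8 (100 examples × 3 epochs → 39 optimizer steps):

```python
def linear_lr(step, total_steps, base_lr):
    """Linear decay from base_lr at step 0 down to 0 at total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

base_lr = 5e-5  # assumed default; the card does not record the actual value
total_steps = 39  # ceil(100 / 8) = 13 steps per epoch, times 3 epochs

print(linear_lr(0, total_steps, base_lr))            # → 5e-05
print(linear_lr(total_steps, total_steps, base_lr))  # → 0.0
```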

Training results

(training-results figure: image/png)
