---
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
widget:
- text: >-
Absolutely thrilled with my new wireless earbuds! The sound quality is
exceptional, and they stay securely in my ears during workouts. Plus, the
charging case is so convenient for on-the-go use.
- text: >-
This coffee maker has truly simplified my mornings. It brews quickly and
the programmable features allow me to wake up to the aroma of freshly
brewed coffee. Plus, the sleek design looks great on my countertop.
- text: >-
I'm impressed with the durability of this laptop backpack. It comfortably
fits my 15-inch laptop, charger, and other essentials without feeling
bulky. The USB charging port is a lifesaver for staying connected on the
move.
- text: >-
As someone who loves to cook, this chef's knife is a game-changer. The
sharp blade effortlessly cuts through vegetables, meats, and herbs, making
prep work a breeze. The ergonomic handle ensures comfort even during long
chopping sessions.
- text: >-
This smart thermostat has made managing my home's temperature a breeze.
The intuitive app allows me to adjust settings remotely, and the
energy-saving features have noticeably reduced my utility bills.
Installation was also a breeze thanks to clear instructions.
model-index:
- name: gpt2-amazon-sentiment-classifier
results: []
license: mit
datasets:
- McAuley-Lab/Amazon-Reviews-2023
language:
- en
library_name: transformers
---
# gpt2-amazon-sentiment-classifier
This model is a GPT-2-based sentiment classifier trained on Amazon product reviews (McAuley-Lab/Amazon-Reviews-2023). It achieves the following results on the evaluation set (see the sketch after this list for how such metrics are typically computed):
- Loss: 0.0320
- Accuracy: 0.9680
- F1: 0.9680
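The original evaluation code is not included in this card, so the following is only a minimal sketch of a `compute_metrics` callback that would produce accuracy and F1 of this kind; the use of the `evaluate` library and binary sentiment labels are assumptions.

```python
import numpy as np
import evaluate

# Load the metric implementations (assumption: the `evaluate` library was used).
accuracy_metric = evaluate.load("accuracy")
f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    # The Trainer passes (logits, labels); take the argmax over the label dimension.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_metric.compute(predictions=predictions, references=labels)["accuracy"],
        "f1": f1_metric.compute(predictions=predictions, references=labels)["f1"],
    }
```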
## Model description

A GPT-2-based text classifier that predicts the sentiment of Amazon product reviews (see the widget examples above). Further details about the architecture and training setup have not been documented.
## Intended uses & limitations
More information needed
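As a quick illustration of the intended use (classifying the sentiment of product reviews, as in the widget examples above), here is a minimal inference sketch. The Hub repository id and the returned label names are assumptions, not values confirmed by this card.

```python
from transformers import pipeline

# Hypothetical repository id; replace with the actual path of this checkpoint.
classifier = pipeline(
    "text-classification",
    model="your-username/gpt2-amazon-sentiment-classifier",
)

review = (
    "Absolutely thrilled with my new wireless earbuds! The sound quality is "
    "exceptional, and they stay securely in my ears during workouts."
)
print(classifier(review))
# Example output: [{'label': 'LABEL_1', 'score': 0.99}] -- the label names
# depend on the id2label mapping saved with the model.
```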
## Training and evaluation data

The model was trained and evaluated on product reviews from the McAuley-Lab/Amazon-Reviews-2023 dataset listed in the metadata above; the exact splits and preprocessing are not documented here.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
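For reference, the hyperparameters above map onto a `TrainingArguments`/`Trainer` setup roughly like the sketch below. The base checkpoint, the toy dataset, the number of labels, and the per-epoch evaluation are assumptions; this is not the exact script used to produce this model.

```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# GPT-2 has no padding token by default, so reuse the EOS token.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForSequenceClassification.from_pretrained("gpt2", num_labels=2)
model.config.pad_token_id = tokenizer.pad_token_id

# Toy stand-in for the real review data (1 = positive, 0 = negative).
raw = Dataset.from_dict({
    "text": [
        "Absolutely thrilled with my new wireless earbuds!",
        "Stopped working after a week, very disappointed.",
    ],
    "label": [1, 0],
})
dataset = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128),
    batched=True,
)

# Mirrors the listed hyperparameters; the default AdamW optimizer already uses
# betas=(0.9, 0.999) and epsilon=1e-08.
args = TrainingArguments(
    output_dir="gpt2-amazon-sentiment-classifier",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    evaluation_strategy="epoch",  # assumption; valid in Transformers 4.39
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset,
    eval_dataset=dataset,
    tokenizer=tokenizer,
)
trainer.train()
```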
### Training results

Per-epoch training logs are not included; the final evaluation results are reported at the top of this card.
### Framework versions
- Transformers 4.39.3
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2