This model, DILAB-HYU/koquality-polyglot-1.3b, is an instruction-tuned version of EleutherAI/polyglot-ko-1.3b.
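For a quick start, below is a minimal inference sketch using the standard Transformers text-generation APIs. The prompt, dtype, and generation settings are illustrative assumptions; the card does not specify a prompt format.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the instruction-tuned checkpoint from the Hugging Face Hub.
model_id = "DILAB-HYU/koquality-polyglot-1.3b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumed half precision to fit a small GPU
    device_map="auto",
)

# Illustrative prompt only ("What is the capital of Korea?").
prompt = "한국의 수도는 어디인가요?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```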
## Training hyperparameters
- learning_rate: 5e-5
- train_batch_size: 1
- seed: 42
- distributed_type: multi-GPU (NVIDIA A30, 24 GB) + CPU offloading (384 GB host RAM)
- num_devices: 2
- gradient_accumulation_steps: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2.0
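For reference, here is a minimal sketch of how these hyperparameters map onto `transformers.TrainingArguments`. The DeepSpeed ZeRO stage, offload details, and fp16 setting are assumptions (the card only states that CPU offloading was used), and the model/dataset wiring is omitted.

```python
from transformers import TrainingArguments

# Hedged sketch of a DeepSpeed config: the ZeRO stage (3) is an assumption;
# the card only mentions CPU offloading across two GPUs.
deepspeed_config = {
    "zero_optimization": {
        "stage": 3,                              # assumed stage
        "offload_optimizer": {"device": "cpu"},  # CPU offloading per the card
        "offload_param": {"device": "cpu"},
    },
    "train_micro_batch_size_per_gpu": 1,
    "gradient_accumulation_steps": 32,
}

training_args = TrainingArguments(
    output_dir="koquality-polyglot-1.3b",
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    gradient_accumulation_steps=32,
    num_train_epochs=2.0,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    fp16=True,                   # assumption; precision is not stated in the card
    deepspeed=deepspeed_config,  # launch with the `deepspeed` launcher on 2 GPUs
)
```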
## Framework versions
- Transformers 4.34.1
- Pytorch 2.0.1+cu117
- Datasets 2.11.0
- DeepSpeed 0.9.5
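To reproduce the training environment, a quick version check along these lines can confirm the pinned releases (exact CUDA builds may differ):

```python
# Sanity check against the pinned versions listed above.
import datasets
import deepspeed
import torch
import transformers

expected = {
    "transformers": "4.34.1",
    "torch": "2.0.1+cu117",
    "datasets": "2.11.0",
    "deepspeed": "0.9.5",
}
actual = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "deepspeed": deepspeed.__version__,
}
for name, want in expected.items():
    print(f"{name}: have {actual[name]}, expect {want}")
```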