Uploaded model

  • Developed by: Sakalti
  • License: apache-2.0
  • Finetuned from model: Sakalti/Saba1-1.8B

This Qwen model was trained 2x faster with Unsloth and Hugging Face's TRL library.

This model was fine-tuned on the "kunishou/databricks-dolly-15k-ja" dataset, which is licensed under CC BY-SA 3.0.
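As noted above, training used Unsloth together with TRL on this dataset. Below is a minimal sketch of what such a setup could look like; the column mapping, sequence length, and trainer arguments are illustrative assumptions (TRL argument names vary between versions), not the actual recipe from this model card.

```python
# Hedged sketch of an Unsloth + TRL supervised fine-tune of the base model on
# the Japanese Dolly dataset. Column names (instruction/input/output),
# max_seq_length, and trainer arguments are assumptions, not the actual recipe.
from datasets import load_dataset
from trl import SFTTrainer
from unsloth import FastLanguageModel

# Load the base model through Unsloth's accelerated loader.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Sakalti/Saba1-1.8B",
    max_seq_length=2048,
)

# Assumed dolly-style columns; flatten each record into a single training text.
def to_text(example):
    return {"text": f"{example['instruction']}\n{example['input']}\n{example['output']}"}

dataset = load_dataset("kunishou/databricks-dolly-15k-ja", split="train").map(to_text)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,        # newer TRL releases use processing_class= instead
    train_dataset=dataset,
    dataset_text_field="text",  # argument name varies across TRL versions
    max_seq_length=2048,
)
trainer.train()
```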

Last Update: 2023-05-28
Parameters (without embedding layer): 1.31B
Parameters (with embedding layer): 1.54B
Layers: 28
Tensor type: FP16

Overview

Saba1.5 (Sakalti/Saba1.5-1.5B) is a model fine-tuned from Saba1. Its performance has not been measured yet, but it is expected to have improved.
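For reference, a minimal inference sketch using the standard Transformers causal-LM API is shown below; the prompt and generation settings are illustrative assumptions, not taken from this model card.

```python
# Minimal inference sketch using the standard Transformers causal-LM API.
# The prompt and generation settings are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sakalti/Saba1.5-1.5B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # weights are published in FP16
    device_map="auto",
)

prompt = "日本のι¦–ιƒ½γ―γ©γ“γ§γ™γ‹οΌŸ"  # "What is the capital of Japan?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```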

