---
license: apache-2.0
library_name: transformers
inference: true
tags:
  - unsloth
  - trl
  - sft
---

# Uploaded model

  • Developed by: Sakalti
  • License: apache-2.0
  • Finetuned from model: Sakalti/Saba1-1.8B

This Qwen model was trained 2x faster with Unsloth and Hugging Face's TRL library.

Parameters (without embedding layer): 1.31B
Parameters (with embedding layer): 1.54B
Layers: 28

Overview

Saba1.5 is a model fine-tuned from Saba1. Its performance has not been measured yet, but it is expected to have improved.
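Since the card lists `library_name: transformers`, the model can presumably be loaded with the standard causal-LM API. A minimal sketch, assuming the repo id is `Sakalti/Saba1.5-1.5B` (taken from this card's title; not confirmed elsewhere in the card):

```python
# Minimal sketch: load the model with Hugging Face transformers and generate text.
# The repo id below is an assumption based on the card's title.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sakalti/Saba1.5-1.5B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the model was trained with Unsloth, Unsloth's own inference path may also work, but the plain transformers route above is the most portable.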