Join our discord

Server Link

License

cc-by-sa-4.0

Model Details

Base Model
maywell/Synatra-10.7B-v0.4

Trained On
A100 80GB * 8

This model was trained with GPU resources provided by Sionic AI.

Instruction format

It follows the Alpaca instruction format.
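For reference, the standard Alpaca template wraps a user instruction as shown below. This is a sketch of the common instruction-only variant; the exact template baked into this model's chat_template may differ in small details.

```python
# Standard Alpaca prompt layout (instruction-only variant, no "Input" field).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

prompt = ALPACA_TEMPLATE.format(instruction="엔비디아는 뭐 하는 기업이야?")
print(prompt)
```

The model's completion is expected to follow the `### Response:` header.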

Model Benchmark

TBD

Implementation Code

Since the chat_template already contains the instruction format above, you can use the code below.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # the device to load the model onto

model = AutoModelForCausalLM.from_pretrained("maywell/Synatra-kiqu-10.7B")
tokenizer = AutoTokenizer.from_pretrained("maywell/Synatra-kiqu-10.7B")

messages = [
    {"role": "user", "content": "엔비디아는 뭐 하는 기업이야?"},
]

# The chat template applies the Alpaca instruction format automatically.
encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")

model_inputs = encodeds.to(device)
model.to(device)

generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
decoded = tokenizer.batch_decode(generated_ids)
print(decoded[0])
```
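Note that `generate` returns the prompt tokens followed by the newly generated tokens, so `decoded[0]` includes the formatted prompt as well as the answer. To print only the completion, you can slice off the prompt tokens before decoding. A minimal sketch with toy token ids (the `strip_prompt` helper is hypothetical, not part of transformers):

```python
import torch

def strip_prompt(generated_ids: torch.Tensor, prompt_len: int) -> torch.Tensor:
    # generate() outputs each sequence as [prompt tokens..., new tokens...];
    # keep only the newly generated portion.
    return generated_ids[:, prompt_len:]

# Toy example: a "prompt" of 3 token ids followed by 2 generated ids.
ids = torch.tensor([[101, 102, 103, 7, 8]])
new_ids = strip_prompt(ids, prompt_len=3)
print(new_ids.tolist())  # [[7, 8]]
```

In the snippet above this would be `strip_prompt(generated_ids, model_inputs.shape[1])`, decoded with `skip_special_tokens=True`.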