Built with Axolotl

The Bucharest series is mostly an experiment. Use Pallas series instead.

An instruct-based fine-tune of migtissera/Tess-10.7B-v1.5b.

This model was trained on a private dataset plus Mihaiii/OpenHermes-2.5-1k-longest-curated, a subset of HuggingFaceH4/OpenHermes-2.5-1k-longest, which is itself a subset of teknium/OpenHermes-2.5.

Prompt format:

```
SYSTEM: <ANY SYSTEM CONTEXT>
USER: 
ASSISTANT:
```
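The template above can be assembled with a small helper before passing the string to your inference backend. This is a minimal sketch; the `build_prompt` function name is illustrative and not part of the model's tooling:

```python
def build_prompt(system: str, user: str) -> str:
    """Format a single-turn prompt in the SYSTEM/USER/ASSISTANT style
    this model card specifies. Generation should continue after 'ASSISTANT:'."""
    return f"SYSTEM: {system}\nUSER: {user}\nASSISTANT:"

prompt = build_prompt(
    "You are a helpful assistant.",
    "Summarize the plot of Hamlet in one sentence.",
)
print(prompt)
```

The resulting string can be fed directly to a text-generation pipeline or a llama.cpp-style completion endpoint.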

GGUF:

tsunemoto/Bucharest-0.2-GGUF

Model size: 10.7B params (Safetensors, BF16)


