---
license: apache-2.0
language:
- ja
pipeline_tag: text-generation
library_name: transformers
base_model:
- Qwen/Qwen2.5-1.5B-Instruct
---

# SmolSwallow-1.5B

🤗 [Models](https://huggingface.co/SakanaAI) | 📚 [Paper](https://arxiv.org/abs/TODO) | 📝 [Blog](https://sakana.ai/taid/) | 🐦 [Twitter](https://twitter.com/SakanaAILabs)

**SmolSwallow-1.5B** is a compact Japanese language model created with TAID (Temporally Adaptive Interpolated Distillation), our new knowledge distillation method.
We used [Qwen2.5-32B-Instruct](https://huggingface.co/Qwen/Qwen2.5-32B-Instruct) as the teacher model and [Qwen2.5-1.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-1.5B-Instruct) as the student model.
The distilled model was then further pre-trained on Japanese text data to strengthen its Japanese language capabilities.
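
At a high level, interpolated distillation trains the student against an intermediate target distribution that is gradually shifted from the student's own (detached) distribution toward the teacher's distribution as training progresses. The PyTorch-style sketch below is only a simplified illustration of that idea; the exact TAID objective and its adaptive interpolation schedule are described in the paper.

```python
import torch
import torch.nn.functional as F

def interpolated_distillation_loss(student_logits, teacher_logits, t):
    """Illustrative interpolated-distillation loss (not the exact TAID recipe).

    `t` in [0, 1] is the interpolation weight: near 0 the target stays close to
    the (detached) student distribution, near 1 it approaches the teacher.
    """
    teacher_probs = F.softmax(teacher_logits, dim=-1)
    student_probs = F.softmax(student_logits, dim=-1).detach()
    # Intermediate target distribution between student and teacher.
    target_probs = (1.0 - t) * student_probs + t * teacher_probs
    # KL(target || student), averaged over the positions in the batch.
    log_student = F.log_softmax(student_logits, dim=-1)
    return F.kl_div(log_student, target_probs, reduction="batchmean")

# Toy example: 8 token positions over a vocabulary of 32 entries.
student_logits = torch.randn(8, 32, requires_grad=True)
teacher_logits = torch.randn(8, 32)
loss = interpolated_distillation_loss(student_logits, teacher_logits, t=0.3)
loss.backward()
```

In the actual method, the interpolation weight is adapted over the course of training rather than following a fixed schedule; see the paper and blog post linked above for details.
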
If you are looking for an instruction-following model, check out [SmolSwallow-1.5B-Instruct](https://huggingface.co/SakanaAI/SmolSwallow-1.5B-Instruct).

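For quick experimentation, the model should work with the standard Hugging Face Transformers text-generation API. The snippet below is a minimal sketch: the repository id `SakanaAI/SmolSwallow-1.5B` is assumed from this card's title, and the dtype, device placement, and sampling settings are only examples.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "SakanaAI/SmolSwallow-1.5B"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires the `accelerate` package
)

# This is a base (non-instruct) model, so prompt it with plain text to continue.
prompt = "日本の四季は"  # "Japan's four seasons are ..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
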

## Model Details
- **Developed by:** [Sakana AI](https://sakana.ai/) and [Swallow Team](https://swallow-llm.github.io/index.en.html)
- **Model type:** Autoregressive Language Model
- **Language(s):** Japanese
- **License:** [Apache License, Version 2.0](./LICENSE)
- **Paper:** https://arxiv.org/abs/TODO
- **Blog:** https://sakana.ai/taid

## Uses
This model is provided for research and development purposes only and should be considered an experimental prototype.
It is not intended for commercial use or deployment in mission-critical environments.
Use of this model is at the user's own risk, and its performance and outcomes are not guaranteed.
Sakana AI shall not be liable for any direct, indirect, special, incidental, or consequential damages, or any loss arising from the use of this model, regardless of the results obtained.
Users must fully understand the risks associated with using this model and use it at their own discretion.

## Acknowledgement

We would like to thank the developers of the source models for their contributions and for making their work available.

## Citation

```bibtex
@misc{sakana2025taid,
      title         = {TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models},
      author        = {Makoto Shing and Ko Misaki and Han Bao and Sho Yokoi and Takuya Akiba},
      year          = {2025},
      eprint        = {TODO},
      archivePrefix = {arXiv},
      primaryClass  = {TODO},
      url           = {https://arxiv.org/abs/TODO}
}
```