Meta Chameleon 30B

Repository for Meta Chameleon, a mixed-modal early-fusion foundation model from FAIR. See the Chameleon paper for more information.

The Chameleon collection on Hugging Face contains the 7-billion-parameter and 30-billion-parameter model checkpoints.

[more details and usage examples coming soon]
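
In the meantime, the checkpoints can be loaded through the Hugging Face transformers library. The sketch below is illustrative, assuming a recent transformers release that includes Chameleon support (ChameleonProcessor and ChameleonForConditionalGeneration); it is not an official example, and the example image URL is just a stand-in for any RGB image.

```python
import requests
import torch
from PIL import Image
from transformers import ChameleonForConditionalGeneration, ChameleonProcessor

# Load the processor and the 30B checkpoint in bfloat16,
# sharding across available GPUs.
processor = ChameleonProcessor.from_pretrained("facebook/chameleon-30b")
model = ChameleonForConditionalGeneration.from_pretrained(
    "facebook/chameleon-30b",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Chameleon is mixed-modal: interleave text with an <image> placeholder
# that the processor replaces with image tokens.
url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/pipeline-cat-chonk.jpeg"
image = Image.open(requests.get(url, stream=True).raw)
prompt = "What do you see in this image?<image>"

inputs = processor(text=prompt, images=image, return_tensors="pt").to(
    model.device, dtype=torch.bfloat16
)
output = model.generate(**inputs, max_new_tokens=50)
print(processor.decode(output[0], skip_special_tokens=True))
```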

Citation

To cite the paper, model, or software, please use the BibTeX entry below:

@article{Chameleon_Team_Chameleon_Mixed-Modal_Early-Fusion_2024,
  author = {Chameleon Team},
  doi = {10.48550/arXiv.2405.09818},
  journal = {arXiv preprint arXiv:2405.09818},
  title = {Chameleon: Mixed-Modal Early-Fusion Foundation Models},
  url = {https://github.com/facebookresearch/chameleon},
  year = {2024}
}

License

Use of this repository and related resources is governed by the Chameleon Research License and this repository's LICENSE file.
