Pretrained Graph Encoder for Enhanced Molecular Understanding in LLMs

This pretrained graph encoder gives large language models (LLMs) a richer understanding of molecular structure, improving their performance on molecular design tasks.

📄 Paper: Multimodal Large Language Models for Inverse Molecular Design with Retrosynthetic Planning

📁 Repository: https://github.com/liugangcode/Llamole
