Update README.md
README.md
CHANGED
@@ -10,7 +10,7 @@ tags:
 
 The Mistral-Chem-v1-15M Large Language Model (LLM) is a pretrained generative chemical molecule model with 1.9M parameters x 8 experts = 15.2M parameters.
 It is derived from the Mixtral-8x7B-v0.1 model, which was simplified for molecules: the number of layers and the hidden size were reduced.
-The model was pretrained using 10M molecule SMILES strings from the
+The model was pretrained using 10M molecule SMILES strings from the PubChem database.
 
 ## Model Architecture
 