Yin Fang committed
Commit 51cb62c · Parent(s): cfb4f6c
Update README.md
README.md CHANGED
@@ -10,7 +10,7 @@ MolGen was introduced in the paper ["Molecular Language Model as Multi-task Gene
 MolGen is the first pre-trained model that only produces chemically valid molecules.
 With a training corpus of over 100 million molecules in SELFIES representation, MolGen learns the intrinsic structural patterns of molecules by mapping corrupted SELFIES to their original forms.
 Specifically, MolGen employs a bidirectional Transformer as its encoder and an autoregressive Transformer as its decoder.
-Through its carefully designed multi-task prefix tuning (MPT), MolGen can generate molecules with desired properties, making it a valuable tool for molecular optimization.
+Through its carefully designed multi-task molecular prefix tuning (MPT), MolGen can generate molecules with desired properties, making it a valuable tool for molecular optimization.
 
 
 ### BibTeX entry and citation info
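For context on the model described in this README diff, the following is a minimal usage sketch, not taken from the commit itself. It assumes the pre-trained MolGen checkpoint is published on the Hugging Face Hub under an id such as `zjunlp/MolGen-large` and loads as a standard BART-style encoder-decoder through `transformers`; the repo id and the example SELFIES prompt are illustrative assumptions.

```python
# Hedged sketch: load a MolGen checkpoint and generate SELFIES candidates.
# The Hub id and the input SELFIES string are assumptions for illustration.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "zjunlp/MolGen-large"  # assumed Hub id for the pre-trained MolGen checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# A molecule (or corrupted fragment) in SELFIES representation; benzene here.
selfies_input = "[C][=C][C][=C][C][=C][Ring1][=Branch1]"

inputs = tokenizer(selfies_input, return_tensors="pt")
outputs = model.generate(
    **inputs,
    num_beams=5,
    max_length=64,
    num_return_sequences=3,  # return several candidate sequences
)
for candidate in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(candidate)  # SELFIES output; convert to SMILES with the `selfies` package if needed
```

Because the model operates on SELFIES rather than SMILES, every decoded output corresponds to a chemically valid molecule by construction, which is the property the README highlights.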