Finetuning on Combined Phosphosite Data

ESM-1b is finetuned with a masked language modeling (MLM) objective. The training data combines the phosphosite datasets used to train isikz/esm1b_msa_mlm_pt_phosphosite and isikz/esm1b_mlm_pt_phosphosite, for a total of 1,055,221 examples.
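To illustrate the MLM objective, here is a minimal sketch of the standard BERT-style masking step applied to a protein sequence. The 15% masking rate and the 80/10/10 mask/random/keep split follow the original MLM recipe; the model card does not state the exact masking configuration used here, so treat these numbers as assumptions.

```python
import random

def mask_tokens(tokens, mask_token="<mask>", vocab=None, mask_prob=0.15, seed=0):
    """BERT-style MLM corruption: sample ~15% of positions as prediction
    targets; of those, replace 80% with <mask>, 10% with a random token,
    and leave 10% unchanged. Returns (corrupted_tokens, target_positions)."""
    rng = random.Random(seed)
    vocab = vocab or sorted(set(tokens))  # fallback vocab from the sequence itself
    corrupted = list(tokens)
    targets = []
    for i in range(len(tokens)):
        if rng.random() < mask_prob:
            targets.append(i)
            r = rng.random()
            if r < 0.8:
                corrupted[i] = mask_token          # 80%: mask
            elif r < 0.9:
                corrupted[i] = rng.choice(vocab)   # 10%: random token
            # else 10%: keep the original token
    return corrupted, targets

# Hypothetical amino-acid sequence, tokenized per residue
seq = list("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
corrupted, targets = mask_tokens(seq)
```

During finetuning, the loss is computed only at the returned target positions: the model must reconstruct the original residue from the corrupted context.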

Model size: 652M parameters · Tensor type: F32 (Safetensors)

Model tree for isikz/esm1b_ft_phosphosite_data_combined
