paperswithcode_word2vec/0_WordEmbeddings/wordembedding_config.json
Commit bab6ed6 (lambdaofgod): Add new SentenceTransformer model.
{
"tokenizer_class": "sentence_transformers.models.tokenizer.WhitespaceTokenizer.WhitespaceTokenizer",
"update_embeddings": false,
"max_seq_length": 1000000
}
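
For context, here is a minimal sketch of how sentence-transformers maps this config onto its WordEmbeddings module: the text is split by the WhitespaceTokenizer named in tokenizer_class, the embedding matrix stays frozen because update_embeddings is false, and max_seq_length of 1000000 effectively disables truncation. The vocabulary and weight matrix below are hypothetical placeholders; only the three keyword arguments come from the file above.

```python
import torch
from sentence_transformers.models import WordEmbeddings
from sentence_transformers.models.tokenizer import WhitespaceTokenizer

vocab = ["graph", "neural", "networks"]   # hypothetical vocabulary
weights = torch.randn(len(vocab), 300)    # hypothetical embedding matrix

word_embeddings = WordEmbeddings(
    tokenizer=WhitespaceTokenizer(vocab),
    embedding_weights=weights,
    update_embeddings=False,   # from the config: embeddings stay frozen during training
    max_seq_length=1000000,    # from the config: effectively no sequence truncation
)
```

In practice this module is not built by hand: loading the full model, e.g. SentenceTransformer("lambdaofgod/paperswithcode_word2vec") (repo ID inferred from the path above), reconstructs it from this config file automatically.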