Trent committed on
Commit 82e684b · 1 Parent(s): f60ff22

Bold letters

Files changed (1): app.py (+1 -1)
app.py CHANGED
@@ -17,7 +17,7 @@ st.markdown('''
 
     Hi! This is the demo for the [flax sentence embeddings](https://huggingface.co/flax-sentence-embeddings) created for the **Flax/JAX community week 🤗**.
     We trained three general-purpose flax-sentence-embeddings models: a distilroberta base, a mpnet base and a minilm-l6. They were
-    trained using Siamese network configuration with custom Contrastive Loss inspired by OpenAI CLIP. The models were trained on a dataset comprising of
+    trained using Siamese network configuration with custom **Contrastive Loss** inspired by OpenAI CLIP. The models were trained on a dataset comprising of
     [1 Billion+ training corpus](https://huggingface.co/flax-sentence-embeddings/all_datasets_v4_MiniLM-L6#training-data) with the v3 setup.
 
     In addition, we trained [20 models](https://huggingface.co/flax-sentence-embeddings) focused on general-purpose, QuestionAnswering and Code search and **achieved SOTA on multiple benchmarks.**
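
For context on the line being bolded: a Siamese-network setup with a CLIP-inspired contrastive loss usually reduces to a symmetric in-batch cross-entropy over scaled cosine similarities, where matching sentence pairs are positives and every other pair in the batch is a negative. Below is a minimal JAX sketch of that loss, as an illustration of the general technique rather than the repository's actual training code; the function name and the temperature value are assumptions.

```python
import jax
import jax.numpy as jnp


def clip_style_contrastive_loss(emb_a, emb_b, temperature=0.05):
    """Symmetric in-batch contrastive loss in the style popularized by OpenAI CLIP.

    emb_a, emb_b: [batch, dim] embeddings of paired sentences coming out of the
    two towers of a Siamese encoder. Row i of emb_a should match row i of emb_b;
    all other rows in the batch serve as negatives. (Illustrative sketch only.)
    """
    # L2-normalize so the dot product below is cosine similarity.
    emb_a = emb_a / jnp.linalg.norm(emb_a, axis=-1, keepdims=True)
    emb_b = emb_b / jnp.linalg.norm(emb_b, axis=-1, keepdims=True)

    # [batch, batch] similarity matrix, scaled by the temperature.
    logits = emb_a @ emb_b.T / temperature

    # The correct "class" for row i is column i.
    labels = jnp.arange(logits.shape[0])[:, None]

    # Cross-entropy in both directions (a -> b and b -> a), then average.
    loss_a = -jnp.take_along_axis(
        jax.nn.log_softmax(logits, axis=-1), labels, axis=-1
    ).mean()
    loss_b = -jnp.take_along_axis(
        jax.nn.log_softmax(logits.T, axis=-1), labels, axis=-1
    ).mean()
    return (loss_a + loss_b) / 2.0
```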