Trent committed
Commit dd8d16b · 1 Parent(s): 71ea169

Update readme

Files changed (1): app.py (+5 −5)
app.py CHANGED
```diff
@@ -6,7 +6,7 @@ from backend import inference
 from backend.config import MODELS_ID, QA_MODELS_ID, SEARCH_MODELS_ID
 from backend.utils import load_gender_data
 
-st.title('Flax-Sentence-Tranformers Demo')
+st.title('Flax-Sentence-Tranformers')
 
 st.sidebar.image("./hf-sbert.jpg", width=300)
 st.sidebar.title('Navigation')
@@ -16,11 +16,11 @@ menu = st.sidebar.radio("", options=["Contributions & Evaluation", "Sentence Sim
 st.markdown('''
 
 Hi! This is the demo for the [flax sentence embeddings](https://huggingface.co/flax-sentence-embeddings) created for the **Flax/JAX community week 🤗**.
-We trained three general-purpose flax-sentence-embeddings models: a distilroberta base, a mpnet base and a minilm-l6. They were
-trained using Siamese network configuration with custom **Contrastive Loss** inspired by OpenAI CLIP. The models were trained on a dataset comprising of
-[1 Billion+ training corpus](https://huggingface.co/flax-sentence-embeddings/all_datasets_v4_MiniLM-L6#training-data) with the v3 setup.
+We trained multiple general-purpose flax-sentence-embeddings models starting from different BERT models including
+distilroberta, mpnet and MiniLM-l6. They were trained using Siamese network configuration with custom **Contrastive Loss**
+inspired by OpenAI CLIP. The models were trained on a dataset comprising of [1 Billion+ training corpus](https://huggingface.co/flax-sentence-embeddings/all_datasets_v4_MiniLM-L6#training-data) with the v3 setup.
 
-In addition, we trained [20 models](https://huggingface.co/flax-sentence-embeddings) focused on general-purpose, QuestionAnswering and Code search and **achieved SOTA on multiple benchmarks.**
+We have trained [20 models](https://huggingface.co/flax-sentence-embeddings) focused on general-purpose, QuestionAnswering and Code search and **achieved SOTA on multiple benchmarks.**
 We also uploaded [8 datasets](https://huggingface.co/flax-sentence-embeddings) specialized for Question Answering, Sentence-Similiarity and Gender Evaluation.
 You can view our models and datasets [here](https://huggingface.co/flax-sentence-embeddings).
```
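The sentence-similarity page this demo describes ultimately reduces to comparing embedding vectors. As a minimal sketch of that comparison step (assuming the model has already produced plain NumPy vectors; the function name here is hypothetical, not from the repo):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Identical directions score 1.0; orthogonal directions score 0.0.
print(cosine_similarity(np.array([1.0, 0.0]), np.array([1.0, 0.0])))  # → 1.0
print(cosine_similarity(np.array([1.0, 0.0]), np.array([0.0, 1.0])))  # → 0.0
```

In practice the demo would obtain `a` and `b` from one of the trained flax-sentence-embeddings models and rank candidate sentences by this score.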