Commit 9ac8515
Parent(s): c2b8c92
Update README.md
README.md CHANGED
@@ -21,10 +21,19 @@ This model is based on [sentence-transformers/all-mpnet-base-v2](https://hugging
 This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search. This model has been fine-tuned for the biomedical domain. While it preserves a good ability to produce embeddings for general-purpose text, it will be more useful to you if you are trying to process medical documents such as EHR records or clinical notes. Both sentences and phrases can be embedded in the same latent space.
 
 ## Citation
-This
+This model accompanies the [BioLORD: Learning Ontological Representations from Definitions](https://arxiv.org/abs/2210.11892) paper, accepted in the EMNLP 2022 Findings. When you use this model, please cite the original paper as follows:
 
 ```latex
-
+@misc{https://doi.org/10.48550/arxiv.2210.11892,
+  title = {BioLORD: Learning Ontological Representations from Definitions (for Biomedical Concepts and their Textual Descriptions)},
+  author = {Remy, François and Demuynck, Kris and Demeester, Thomas},
+  url = {https://arxiv.org/abs/2210.11892},
+  doi = {10.48550/ARXIV.2210.11892},
+  keywords = {Computation and Language (cs.CL), Information Retrieval (cs.IR), FOS: Computer and information sciences, FOS: Computer and information sciences},
+  publisher = {arXiv},
+  year = {2022},
+  copyright = {Creative Commons Attribution 4.0 International}
+}
 ```
 
 ## Usage (Sentence-Transformers)
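The `## Usage (Sentence-Transformers)` section referenced at the end of the hunk is not included in this diff. As a rough sketch of what standard sentence-transformers usage looks like for a model described this way (the repo ID below is a placeholder, not taken from this commit):

```python
# Minimal sentence-transformers sketch; the repo ID is a placeholder/assumption.
from sentence_transformers import SentenceTransformer, util

sentences = [
    "Patient presents with acute myocardial infarction.",
    "The clinical note mentions a heart attack.",
]

# Load the model from the Hugging Face Hub (replace with the actual repo ID).
model = SentenceTransformer("your-namespace/your-biolord-model")

# Encode both sentences into 768-dimensional dense vectors.
embeddings = model.encode(sentences)
print(embeddings.shape)  # (2, 768)

# Cosine similarity, e.g. for semantic search over clinical notes.
print(util.cos_sim(embeddings[0], embeddings[1]))
```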