---
title: README
emoji: πŸ‘
colorFrom: yellow
colorTo: yellow
sdk: static
pinned: false
license: apache-2.0
---

**Hierarchy Transformer (HiT)** is a framework for learning universal hierarchy embeddings in hyperbolic space with transformer encoder-based language models.

This Hugging Face page serves as a repository for both the **models** and the **datasets**; the latter are also accessible on [Zenodo](https://zenodo.org/doi/10.5281/zenodo.10511042).

For more details, visit [HierarchyTransformers](https://github.com/KRR-Oxford/HierarchyTransformers) on GitHub.

## Get Started

Install `hierarchy_transformers` (see our [repository](https://github.com/KRR-Oxford/HierarchyTransformers)) via `pip` or from GitHub.
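For example, a typical `pip` install might look like the following (the package name is assumed here; consult the repository for the authoritative instructions):

```shell
pip install hierarchy_transformers
```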

Use the following code to get started with HiTs:

```python
from hierarchy_transformers import HierarchyTransformer

# load the model
model = HierarchyTransformer.from_pretrained('Hierarchy-Transformers/HiT-MiniLM-L12-WordNetNoun')

# entity names to be encoded.
entity_names = ["computer", "personal computer", "fruit", "berry"]

# get the entity embeddings
entity_embeddings = model.encode(entity_names)
```
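Because HiT embeds entities in hyperbolic rather than Euclidean space, entities are typically compared by hyperbolic distance instead of cosine similarity. As an illustration only (this helper is not part of the HiT API), here is a minimal NumPy sketch of the geodesic distance in the Poincaré ball, one standard model of hyperbolic space:

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the unit Poincare ball."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    sq_diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq_diff / max(denom, eps))

# Points nearer the boundary of the ball are farther (in hyperbolic terms)
# from the origin, which is what lets hyperbolic embeddings represent
# deep hierarchies compactly.
origin = [0.0, 0.0]
shallow = [0.3, 0.0]   # hypothetical embedding near the origin
deep = [0.9, 0.0]      # hypothetical embedding near the boundary
print(poincare_distance(origin, shallow))
print(poincare_distance(origin, deep))
```

Note that the hyperbolic distance from the origin grows much faster than the Euclidean norm as a point approaches the boundary; this is the geometric property HiT exploits for hierarchy embedding.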


## Citation

Our paper has been accepted at NeurIPS 2024 (to appear).

Preprint on arXiv: https://arxiv.org/abs/2401.11374.

*Yuan He, Zhangdie Yuan, Jiaoyan Chen, Ian Horrocks.* **Language Models as Hierarchy Encoders.** arXiv preprint arXiv:2401.11374 (2024).

```bibtex
@article{he2024language,
  title={Language Models as Hierarchy Encoders},
  author={He, Yuan and Yuan, Zhangdie and Chen, Jiaoyan and Horrocks, Ian},
  journal={arXiv preprint arXiv:2401.11374},
  year={2024}
}
```