---
title: README
emoji: 👍
colorFrom: yellow
colorTo: yellow
sdk: static
pinned: false
license: apache-2.0
---

# Hierarchy Transformer

Hierarchy Transformer (HiT) is a framework that enables transformer encoder-based language models (LMs) to learn hierarchical structures in hyperbolic space.

## Get Started

Install `hierarchy_transformers` (see our repository) through pip or GitHub.
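For instance, a pip-based install is typically (assuming the PyPI package name matches the repository name):

```shell
pip install hierarchy_transformers
```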

Use the following code to get started with HiTs:

```python
from hierarchy_transformers import HierarchyTransformer

# load the model
model = HierarchyTransformer.from_pretrained("Hierarchy-Transformers/HiT-MiniLM-L12-WordNetNoun")

# entity names to be encoded
entity_names = ["computer", "personal computer", "fruit", "berry"]

# get the entity embeddings
entity_embeddings = model.encode(entity_names)
```
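Because HiT embeddings live in hyperbolic space, relatedness between entities is naturally measured with the geodesic (Poincaré) distance rather than plain cosine similarity. The library exposes its own manifold utilities for this; the sketch below is a minimal stand-alone NumPy version of the Poincaré-ball distance, assuming unit negative curvature (the point values are purely illustrative):

```python
import numpy as np


def poincare_distance(u: np.ndarray, v: np.ndarray, eps: float = 1e-9) -> float:
    """Geodesic distance between two points inside the unit Poincare ball
    (curvature -1): d(u, v) = arccosh(1 + 2|u-v|^2 / ((1-|u|^2)(1-|v|^2)))."""
    sq_dist = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u**2)) * (1.0 - np.sum(v**2))
    return float(np.arccosh(1.0 + 2.0 * sq_dist / max(denom, eps)))


# toy points: entities nearer the ball's boundary sit deeper in the hierarchy
root = np.array([0.0, 0.0])
child = np.array([0.5, 0.0])
d = poincare_distance(root, child)  # equals 2 * artanh(0.5)
```

In practice you would feed `model.encode(...)` outputs (not toy points) into such a distance to compare entities.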

## Citation

Yuan He, Zhangdie Yuan, Jiaoyan Chen, Ian Horrocks. *Language Models as Hierarchy Encoders*. Advances in Neural Information Processing Systems 37 (NeurIPS 2024).

```bibtex
@article{he2024language,
  title={Language models as hierarchy encoders},
  author={He, Yuan and Yuan, Moy and Chen, Jiaoyan and Horrocks, Ian},
  journal={Advances in Neural Information Processing Systems},
  volume={37},
  pages={14690--14711},
  year={2024}
}
```