princeton-nlp committed · Commit 68d4926 · verified · 1 parent: 6f15a91

Update README.md

Files changed (1): README.md (+1 −2)
README.md CHANGED
@@ -81,7 +81,7 @@ You can convert the `logits` of the model with a softmax to obtain a probability
 The full definitions of the categories can be found in the [taxonomy config](https://github.com/CodeCreator/WebOrganizer/blob/main/define_domains/taxonomies/topics.yaml).
 
 ##### Efficient Inference
-We recommend that you use the efficient gte-base-en-v1.5 implementation by enabling unpadding and memory efficient attention. This __requires installing `xformers`__ and loading the model like
+We recommend that you use the efficient gte-base-en-v1.5 implementation by enabling unpadding and memory efficient attention. This __requires installing `xformers`__ (see more [here](https://huggingface.co/Alibaba-NLP/new-impl#recommendation-enable-unpadding-and-acceleration-with-xformers)) and loading the model like:
 ```python
 AutoModelForSequenceClassification.from_pretrained(
     "WebOrganizer/TopicClassifier",
@@ -91,7 +91,6 @@ AutoModelForSequenceClassification.from_pretrained(
     torch_dtype=torch.bfloat16
 )
 ```
-See details [here](https://huggingface.co/Alibaba-NLP/new-impl#recommendation-enable-unpadding-and-acceleration-with-xformers).
 
 ## Citation
 ```bibtex
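
For readers applying this change, the full loading call the updated line describes would look roughly like the sketch below. Note that the diff hunks elide README lines 88–90, so the middle keyword arguments are not shown in this commit; `trust_remote_code`, `unpad_inputs`, and `use_memory_efficient_attention` are assumptions based on the Alibaba-NLP gte-base-en-v1.5 custom-code options linked above, not part of the diff itself.

```python
# Hedged sketch of the loading pattern the updated README line points at.
# `unpad_inputs` and `use_memory_efficient_attention` are assumed from the
# linked gte-base-en-v1.5 docs; the diff elides the lines that set them.
import torch
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "WebOrganizer/TopicClassifier",
    trust_remote_code=True,               # custom_code model: loads the gte implementation
    unpad_inputs=True,                    # assumption: enables unpadding
    use_memory_efficient_attention=True,  # assumption: requires `xformers`
    torch_dtype=torch.bfloat16,
)
```

Unpadding and memory-efficient attention mainly pay off for batched inputs of varying lengths, where skipping padding tokens avoids wasted compute.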