Text Classification · Transformers · Safetensors · new · custom_code
princeton-nlp committed · Commit 521747f · verified · 1 Parent(s): 9193bf9

Update README.md

Files changed (1): README.md (+1, -2)
README.md CHANGED

````diff
@@ -77,7 +77,7 @@ You can convert the `logits` of the model with a softmax to obtain a probability
 The full definitions of the categories can be found in the [taxonomy config](https://github.com/CodeCreator/WebOrganizer/blob/main/define_domains/taxonomies/formats.yaml).
 
 ##### Efficient Inference
-We recommend that you use the efficient gte-base-en-v1.5 implementation by enabling unpadding and memory efficient attention. This __requires installing `xformers`__ and loading the model like
+We recommend that you use the efficient gte-base-en-v1.5 implementation by enabling unpadding and memory efficient attention. This __requires installing `xformers`__ (see more [here](https://huggingface.co/Alibaba-NLP/new-impl#recommendation-enable-unpadding-and-acceleration-with-xformers)) and loading the model like:
 ```python
 AutoModelForSequenceClassification.from_pretrained(
     "WebOrganizer/FormatClassifier-NoURL",
@@ -87,7 +87,6 @@ AutoModelForSequenceClassification.from_pretrained(
     torch_dtype=torch.bfloat16
 )
 ```
-See details [here](https://huggingface.co/Alibaba-NLP/new-impl#recommendation-enable-unpadding-and-acceleration-with-xformers).
 
 ## Citation
 ```bibtex
````
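For context, here is a minimal end-to-end sketch of what the updated README line recommends: load the classifier with the gte-base-en-v1.5 efficiency flags and softmax the logits into category probabilities. The `trust_remote_code`, `unpad_inputs`, and `use_memory_efficient_attention` arguments fall in lines the hunks above do not show; they are assumptions taken from the linked Alibaba-NLP/new-impl recommendation, and the input text is illustrative.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed kwargs, following the linked gte-base-en-v1.5 notes: the model
# ships custom code, and the unpadding / memory-efficient-attention flags
# require `xformers` to be installed.
model = AutoModelForSequenceClassification.from_pretrained(
    "WebOrganizer/FormatClassifier-NoURL",
    trust_remote_code=True,
    unpad_inputs=True,
    use_memory_efficient_attention=True,
    torch_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained("WebOrganizer/FormatClassifier-NoURL")

inputs = tokenizer(
    "Example web page text to classify...",  # illustrative input
    return_tensors="pt",
    truncation=True,
)
with torch.inference_mode():
    logits = model(**inputs).logits

# As the context line above the first hunk says: convert the logits with
# a softmax to obtain a probability distribution over the format categories.
probs = logits.softmax(dim=-1)
print(probs)
```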