rubentito committed
Commit 0599fdf
1 Parent(s): 3a97366

Update README.md

Files changed (1): README.md (+34 -1)

This is BigBird-base trained on TriviaQA from [Google hub](https://huggingface.c

This model was used as a baseline in [Hierarchical multimodal transformers for Multi-Page DocVQA](https://arxiv.org/pdf/2212.05935.pdf).
- Results on the MP-DocVQA dataset are reported in Table 2.
- Training hyperparameters can be found in Table 8 of Appendix D.

## How to use

Here is how to use this model to answer a question given some context in PyTorch:

```python
from transformers import AutoTokenizer, BigBirdForQuestionAnswering

# load the tokenizer that matches this checkpoint
tokenizer = AutoTokenizer.from_pretrained("rubentito/BigBird-ITC-MPDocVQA")

# by default the model is in `block_sparse` mode with num_random_blocks=3, block_size=64
model = BigBirdForQuestionAnswering.from_pretrained("rubentito/BigBird-ITC-MPDocVQA")

# you can change `attention_type` to full attention like this:
model = BigBirdForQuestionAnswering.from_pretrained("rubentito/BigBird-ITC-MPDocVQA", attention_type="original_full")

# you can change `block_size` & `num_random_blocks` like this:
model = BigBirdForQuestionAnswering.from_pretrained("rubentito/BigBird-ITC-MPDocVQA", block_size=16, num_random_blocks=2)

question = "Replace me by any question you'd like."
context = "Put some context for answering the question here."
encoded_input = tokenizer(question, context, return_tensors="pt")
output = model(**encoded_input)
```
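
The forward pass returns start and end logits over the input tokens. As a minimal decoding sketch (this step is not part of the original card, and greedy argmax over the logits is only the simplest option), the predicted answer span can be recovered like this:

```python
import torch

# pick the most likely start/end token positions
# (greedy decoding; an assumption, not the strategy used in the paper)
start_index = int(torch.argmax(output.start_logits, dim=-1))
end_index = int(torch.argmax(output.end_logits, dim=-1))

# decode the corresponding span of the encoded (question, context) pair
answer_ids = encoded_input["input_ids"][0, start_index : end_index + 1]
answer = tokenizer.decode(answer_ids, skip_special_tokens=True)
print(answer)
```

Note that BigBird's sparse attention is intended for long contexts; for short inputs like this placeholder example, the implementation may warn and fall back to full attention internally.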

## BibTeX entry

```tex
@article{tito2022hierarchical,
  title={Hierarchical multimodal transformers for Multi-Page DocVQA},
  author={Tito, Rub{\`e}n and Karatzas, Dimosthenis and Valveny, Ernest},
  journal={arXiv preprint arXiv:2212.05935},
  year={2022}
}
```