## Use Cases

### Frequently Asked Questions

You can use Question Answering (QA) models to automate the response to frequently asked questions by using a knowledge base (documents) as context. Answers to customer questions can be drawn from those documents.
⚡⚡ If you’d like to save inference time, you can first use [passage ranking models](/tasks/sentence-similarity) to find the document most likely to contain the answer and run the QA model over only that document, as sketched below.
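Here is a minimal sketch of that two-step approach. It assumes the `sentence-transformers` library for the ranking step and uses the publicly available `sentence-transformers/all-MiniLM-L6-v2` checkpoint together with the default QA pipeline; the documents and question are toy placeholders.

```python
from sentence_transformers import SentenceTransformer, util
from transformers import pipeline

# Toy knowledge base: in practice these would be your FAQ documents.
documents = [
    "Our store is open Monday to Friday, from 9am to 6pm.",
    "Refunds are processed within 14 days of receiving the returned item.",
    "We ship to most European countries within 3-5 business days.",
]
question = "How long does a refund take?"

# Step 1: rank the documents by semantic similarity to the question.
ranker = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
doc_embeddings = ranker.encode(documents, convert_to_tensor=True)
question_embedding = ranker.encode(question, convert_to_tensor=True)
scores = util.cos_sim(question_embedding, doc_embeddings)[0]
best_document = documents[scores.argmax().item()]

# Step 2: run the (more expensive) QA model only on the best-matching document.
qa_model = pipeline("question-answering")
print(qa_model(question=question, context=best_document))
```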
## Task Variants

There are different QA variants based on the inputs and outputs:

- **Extractive QA:** The model **extracts** the answer from a context. The context here could be a provided text, a table or even HTML! This is usually solved with BERT-like models.
- **Open Generative QA:** The model **generates** free text directly based on the context. You can learn more about the Text Generation task on [its page](/tasks/text-generation).
- **Closed Generative QA:** In this case, no context is provided. The answer is completely generated by the model.
The schema above illustrates extractive, open book QA. The model takes a context and the question and extracts the answer from the given context.

You can also differentiate QA models depending on whether they are open-domain or closed-domain. Open-domain models are not restricted to a specific domain, while closed-domain models are restricted to a specific domain (e.g. legal, medical documents).
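As a rough illustration of the two generative variants, the sketch below prompts an instruction-tuned text-to-text model once with a context (open) and once without (closed). The `google/flan-t5-base` checkpoint is only one possible choice, not a recommendation specific to this task.

```python
from transformers import pipeline

# An instruction-tuned text-to-text model can answer with or without a context.
generator = pipeline("text2text-generation", model="google/flan-t5-base")

# Open generative QA: the answer is generated from a provided context.
context = "The Eiffel Tower was completed in 1889 and is 330 metres tall."
open_prompt = (
    "Answer the question based on the context.\n"
    f"Context: {context}\n"
    "Question: How tall is the Eiffel Tower?"
)
print(generator(open_prompt)[0]["generated_text"])

# Closed generative QA: no context is given; the model answers from what it has memorized.
closed_prompt = "Question: What is the capital of France?"
print(generator(closed_prompt)[0]["generated_text"])
```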
## Inference

You can run inference with QA models using the `question-answering` pipeline from the 🤗 Transformers library. If no model checkpoint is given, the pipeline is initialized with `distilbert-base-cased-distilled-squad`. The pipeline takes a question and a context and returns the answer extracted from that context.
```python
from transformers import pipeline

qa_model = pipeline("question-answering")
question = "Where do I live?"
context = "My name is Merve and I live in İstanbul."
qa_model(question=question, context=context)
# {'answer': 'İstanbul', 'end': 39, 'score': 0.953, 'start': 31}
```
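To use a specific checkpoint instead of the default, pass it to the pipeline explicitly. The example below assumes the publicly available `deepset/roberta-base-squad2` model, but any extractive QA checkpoint on the Hub can be plugged in the same way.

```python
from transformers import pipeline

# Load an explicit checkpoint instead of relying on the pipeline default.
qa_model = pipeline("question-answering", model="deepset/roberta-base-squad2")

qa_model(
    question="Where do I live?",
    context="My name is Merve and I live in İstanbul.",
)
# Returns a dict with the same shape: answer text, character span, and score.
```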
## Useful Resources

Would you like to learn more about QA? Awesome! Here are some curated resources that you may find helpful!

- [Course Chapter on Question Answering](https://huggingface.co/course/chapter7/7?fw=pt)
- [Question Answering Workshop](https://www.youtube.com/watch?v=Ihgk8kGLpIE&ab_channel=HuggingFace)
- [How to Build an Open-Domain Question Answering System?](https://lilianweng.github.io/lil-log/2020/10/29/open-domain-question-answering.html)
- [Blog Post: ELI5 A Model for Open Domain Long Form Question Answering](https://yjernite.github.io/lfqa.html)

### Notebooks

- [PyTorch](https://github.com/huggingface/notebooks/blob/master/examples/question_answering.ipynb)
- [TensorFlow](https://github.com/huggingface/notebooks/blob/main/examples/question_answering-tf.ipynb)

### Scripts for training

- [PyTorch](https://github.com/huggingface/transformers/tree/main/examples/pytorch/question-answering)
- [TensorFlow](https://github.com/huggingface/transformers/tree/main/examples/tensorflow/question-answering)
- [Flax](https://github.com/huggingface/transformers/tree/main/examples/flax/question-answering)

### Documentation

- [Question answering task guide](https://huggingface.co/docs/transformers/tasks/question_answering)