---
language: en
tags:
- bigbird
- question-answering
- squad-v2.2
license: apache-2.0
datasets:
- squad_v2
metrics:
- f1
- exact_match
library_name: adapter-transformers
pipeline_tag: question-answering
---
# FredNajjar/bigbird-QA-squad_v2.2
Fine-tuned [`google/bigbird-roberta-base`](https://huggingface.co/google/bigbird-roberta-base) model on the SQuAD 2.0 dataset for English extractive question answering.
## Model Details
- **Language Model**: [google/bigbird-roberta-base](https://huggingface.co/google/bigbird-roberta-base)
- **Language**: English
- **Task**: Extractive Question Answering
- **Data**: [SQuAD 2.0](https://rajpurkar.github.io/SQuAD-explorer/)
- **Infrastructure**: 1x NVIDIA A100-SXM4-40GB
## Training Hyperparameters
- Learning Rate: 3e-05
- Train Batch Size: 16
- Eval Batch Size: 8
- Seed: 42
- Gradient Accumulation Steps: 8
- Total Train Batch Size: 128
- Optimizer: Adam (betas=(0.9,0.999), epsilon=1e-08)
- LR Scheduler: Linear with 121 warmup steps
- Number of Epochs: 3
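The total train batch size listed above follows from the per-device batch size and gradient accumulation, as a quick sanity check shows:

```python
# Effective (total) train batch size on a single GPU:
# per-device batch size × gradient accumulation steps.
per_device_batch = 16
grad_accum_steps = 8
effective_batch = per_device_batch * grad_accum_steps
print(effective_batch)  # 128
```

Gradient accumulation lets a single A100-40GB emulate a batch of 128 by summing gradients over 8 forward/backward passes before each optimizer step.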
## Results on SQuAD 2.0
- **F1 Score**: 81.39%
- **Exact Match**: 77.82%
## Usage
```python
from transformers import pipeline

model_name = "FredNajjar/bigbird-QA-squad_v2.2"

# Build a question-answering pipeline; the tokenizer is loaded
# from the same checkpoint as the model.
nlp = pipeline("question-answering", model=model_name, tokenizer=model_name)

QA_input = {
    "question": "Your question here",
    "context": "Your context here",
}

# For SQuAD 2.0-style data, pass handle_impossible_answer=True
# to let the pipeline return an empty answer for unanswerable questions.
res = nlp(QA_input)
print(res)
```
## Framework Versions
- Transformers: 4.34.0
- PyTorch: 2.0.1+cu118
- Datasets: 2.14.5
- Tokenizers: 0.14.1
## Limitations and Bias
This model inherits limitations and potential biases from the base BigBird model and the SQuAD 2.0 training data.
## Contact
For inquiries, please reach out via [LinkedIn](https://www.linkedin.com/in/frednajjar/).