---
pipeline_tag: zero-shot-classification
language:
- da
- 'no'
- nb
- sv
license: mit
datasets:
- strombergnlp/danfever
- KBLab/overlim
- MoritzLaurer/multilingual-NLI-26lang-2mil7
model-index:
- name: electra-small-nordic-nli-scandi
  results: []
widget:
- example_title: Danish
  text: Mexicansk bokser advarer Messi - 'Du skal bede til gud, om at jeg ikke finder dig'
  candidate_labels: sundhed, politik, sport, religion
- example_title: Norwegian
  text: Regjeringen i Russland hevder Norge fører en politikk som vil føre til opptrapping i Arktis og «den endelige ødeleggelsen av russisk-norske relasjoner».
  candidate_labels: helse, politikk, sport, religion
- example_title: Swedish
  text: Så luras kroppens immunförsvar att bota cancer
  candidate_labels: hälsa, politik, sport, religion
inference:
  parameters:
    hypothesis_template: "Dette eksempel handler om {}"
---

# ScandiNLI - Natural Language Inference model for Scandinavian Languages

This model is a fine-tuned version of [jonfd/electra-small-nordic](https://huggingface.co/jonfd/electra-small-nordic) for Natural Language Inference in Danish, Norwegian Bokmål and Swedish.

It has been fine-tuned on a dataset composed of [DanFEVER](https://aclanthology.org/2021.nodalida-main.pdf#page=439), versions of [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) and [CommitmentBank](https://doi.org/10.18148/sub/2019.v23i2.601) machine translated into all three languages, and versions of [FEVER](https://aclanthology.org/N18-1074/) and [Adversarial NLI](https://aclanthology.org/2020.acl-main.441/) machine translated into Swedish.

The three languages are sampled equally during training, and the model is validated on the validation split of [DanFEVER](https://aclanthology.org/2021.nodalida-main.pdf#page=439) together with the machine-translated validation data of [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) for Swedish and Norwegian Bokmål, again sampled equally.
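
The exact data preparation script is not part of this card, but equal sampling of the three languages can be reproduced with the `datasets` library. The sketch below is illustrative only: the split names for the machine-translated NLI data are assumptions, not necessarily the ones used for this model.

```python
# A minimal sketch of equal-probability language sampling with the
# `datasets` library. The split names for the machine-translated NLI
# data are assumptions, and in practice all three datasets must first
# be mapped to a shared premise/hypothesis/label schema.
from datasets import interleave_datasets, load_dataset

da = load_dataset("strombergnlp/danfever", split="train")
nb = load_dataset("MoritzLaurer/multilingual-NLI-26lang-2mil7", split="nb_mnli")  # assumed split
sv = load_dataset("MoritzLaurer/multilingual-NLI-26lang-2mil7", split="sv_mnli")  # assumed split

# Draw each training example from one of the three languages with probability 1/3.
train = interleave_datasets([da, nb, sv], probabilities=[1/3, 1/3, 1/3], seed=4242)
```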

## Quick start

You can use this model in your scripts as follows:

```python
>>> from transformers import pipeline
>>> classifier = pipeline(
...     "zero-shot-classification",
...     model="alexandrainst/electra-small-nordic-nli-scandi",
... )
>>> classifier(
...     "Mexicansk bokser advarer Messi - 'Du skal bede til gud, om at jeg ikke finder dig'",
...     candidate_labels=['sundhed', 'politik', 'sport', 'religion'],
...     hypothesis_template="Dette eksempel handler om {}",
... )
{'sequence': "Mexicansk bokser advarer Messi - 'Du skal bede til gud, om at jeg ikke finder dig'",
 'labels': ['religion', 'sport', 'politik', 'sundhed'],
 'scores': [0.4504755437374115,
  0.20737220346927643,
  0.1976872682571411,
  0.14446501433849335]}
```
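
Under the hood, the zero-shot pipeline builds one hypothesis per candidate label from `hypothesis_template` and ranks the labels by their entailment scores. If you want to score a single premise/hypothesis pair directly, here is a minimal sketch using the standard `transformers` sequence-classification API; the example sentences are arbitrary:

```python
# A minimal sketch of direct NLI inference, outside the zero-shot pipeline.
# Label names are read from the model config instead of being assumed.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "alexandrainst/electra-small-nordic-nli-scandi"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "Så luras kroppens immunförsvar att bota cancer"
hypothesis = "Dette eksempel handler om sundhed"

inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1).squeeze()

for idx, label in model.config.id2label.items():
    print(f"{label}: {probs[idx]:.3f}")
```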

## Performance

Danish is, as far as we are aware, the only Scandinavian language with a gold-standard NLI dataset, namely [DanFEVER](https://aclanthology.org/2021.nodalida-main.pdf#page=439), so we report evaluation scores on the test split of that dataset.

We report the Matthews correlation coefficient (MCC), the macro-average F1 score and accuracy.
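
As a reference for reproducing the evaluation, here is a minimal sketch of the three metrics computed with scikit-learn on toy labels; the card does not state which tooling was actually used, so this is an assumption:

```python
# A minimal sketch of the three reported metrics, computed with
# scikit-learn on toy data rather than the actual DanFEVER test split.
from sklearn.metrics import accuracy_score, f1_score, matthews_corrcoef

y_true = ["entailment", "neutral", "contradiction", "entailment"]  # toy gold labels
y_pred = ["entailment", "neutral", "neutral", "entailment"]        # toy predictions

print(f"MCC: {matthews_corrcoef(y_true, y_pred):.2%}")
print(f"Macro-F1: {f1_score(y_true, y_pred, average='macro'):.2%}")
print(f"Accuracy: {accuracy_score(y_true, y_pred):.2%}")
```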

| **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
| :-------- | :------ | :----------- | :----------- | :----------------------- |
| [`alexandrainst/nb-bert-large-nli-scandi`](https://huggingface.co/alexandrainst/nb-bert-large-nli-scandi) | **73.80%** | **58.41%** | **86.98%** | 354M |
| [`alexandrainst/nb-bert-base-nli-scandi`](https://huggingface.co/alexandrainst/nb-bert-base-nli-scandi) | 62.44% | 55.00% | 80.42% | 178M |
| `alexandrainst/electra-small-nordic-nli-scandi` (this model) | 47.28% | 48.88% | 73.46% | **22M** |

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 4242
- gradient_accumulation_steps: 1
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- max_steps: 50,000
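
For reference, a minimal sketch of how these hyperparameters map onto `transformers.TrainingArguments`; this is an illustration, not the authors' actual training script, and the output directory is hypothetical:

```python
# A minimal sketch mapping the listed hyperparameters onto
# transformers.TrainingArguments. Illustrative only.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="electra-small-nordic-nli-scandi",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=32,  # train_batch_size
    per_device_eval_batch_size=32,   # eval_batch_size
    seed=4242,
    gradient_accumulation_steps=1,   # total train batch size = 32 * 1 = 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=50_000,
)
```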