---
language:
- it
license: apache-2.0
tags:
- italian
- sequence-to-sequence
- style-transfer
- formality-style-transfer
datasets:
- yahoo/xformal_it
widget:
- text: "maronn qualcuno mi spieg' CHECCOSA SUCCEDE?!?!"
- text: "wellaaaaaaa, ma fraté sei proprio troppo simpatiko, grazieeee!!"
- text: "nn capisco xke tt i ragazzi lo fanno"
- text: "IT5 è SUPERMEGA BRAVISSIMO a capire tt il vernacolo italiano!!!"
metrics:
- rouge
- bertscore
model-index:
- name: mt5-base-informal-to-formal
  results:
  - task:
      type: formality-style-transfer
      name: "Informal-to-formal Style Transfer"
    dataset:
      type: xformal_it
      name: "XFORMAL (Italian Subset)"
    metrics:
    - type: rouge1
      value: 0.661
      name: "Avg. Test Rouge1"
    - type: rouge2
      value: 0.471
      name: "Avg. Test Rouge2"
    - type: rougeL
      value: 0.642
      name: "Avg. Test RougeL"
    - type: bertscore
      value: 0.712
      name: "Avg. Test BERTScore"
      args:
      - model_type: "dbmdz/bert-base-italian-xxl-uncased"
      - lang: "it"
      - num_layers: 10
      - rescale_with_baseline: True
      - baseline_path: "bertscore_baseline_ita.tsv"
co2_eq_emissions:
  emissions: "40g"
  source: "Google Cloud Platform Carbon Footprint"
  training_type: "fine-tuning"
  geographical_location: "Eemshaven, Netherlands, Europe"
  hardware_used: "1 TPU v3-8 VM"
---

# mT5 Base for Informal-to-formal Style Transfer 🧐

This repository contains the checkpoint for the [mT5 Base](https://huggingface.co/google/mt5-base) model fine-tuned on informal-to-formal style transfer on the Italian subset of the XFORMAL dataset, as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org) by Gabriele Sarti and Malvina Nissim.

A comprehensive overview of other released materials is provided in the [gsarti/it5](https://github.com/gsarti/it5) repository. Refer to the paper for additional details concerning the reported scores and the evaluation approach.
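
As a pointer to how the BERTScore values above are configured, the following minimal sketch mirrors the `args` listed in the metadata using the `bert-score` package; the prediction/reference pair is a placeholder, not data from the paper's evaluation:

```python
from bert_score import score

# Placeholder prediction/reference pair for illustration only.
predictions = ["non comprendo perché tutti i ragazzi agiscano così"]
references = ["non capisco perché tutti i ragazzi si comportino così"]

# Configuration mirrored from the model-index metadata above.
P, R, F1 = score(
    predictions,
    references,
    model_type="dbmdz/bert-base-italian-xxl-uncased",
    lang="it",
    num_layers=10,
    rescale_with_baseline=True,
    baseline_path="bertscore_baseline_ita.tsv",
)
print(F1.mean().item())
```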

## Using the model

Model checkpoints are available for usage in TensorFlow, PyTorch and JAX. They can be used directly with pipelines:

```python
from transformers import pipeline

i2f = pipeline("text2text-generation", model="it5/mt5-base-informal-to-formal")
i2f("nn capisco xke tt i ragazzi lo fanno")
>>> [{"generated_text": "non comprendo perché tutti i ragazzi agiscono così"}]
```

or loaded using autoclasses:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("it5/mt5-base-informal-to-formal")
model = AutoModelForSeq2SeqLM.from_pretrained("it5/mt5-base-informal-to-formal")
```
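
When loading with autoclasses, generation has to be invoked explicitly. A minimal inference sketch follows; the decoding parameters are illustrative assumptions, not the settings used in the paper:

```python
# Tokenize an informal input sentence and generate its formal rewrite.
# max_length and num_beams are illustrative choices, not paper settings.
inputs = tokenizer("nn capisco xke tt i ragazzi lo fanno", return_tensors="pt")
outputs = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```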

If you use this model in your research, please cite our work as:

```bibtex
@article{sarti-nissim-2022-it5,
    title={IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
    author={Sarti, Gabriele and Nissim, Malvina},
    journal={ArXiv preprint TBD},
    url={TBD},
    year={2022}
}
```
|