When fine-tuned on the downstream tasks, this model achieved the following performances:

| ViquiQuAD | F1 | 0.8870 |
| CatalanQA | F1 | 0.8962 |
### Evaluation results

| Task | NER (F1) | POS (F1) | STS-ca (Comb) | TeCla (Acc.) | TEca (Acc.) | VilaQuAD (F1/EM)| ViquiQuAD (F1/EM) | CatalanQA (F1/EM) | XQuAD-ca <sup>1</sup> (F1/EM) |
| ------------|:-------------:| -----:|:------|:------|:-------|:------|:----|:----|:----|
| RoBERTa-large-ca-v2 | **89.82** | **99.02** | **83.41** | **75.46** | **83.61** | **89.34/75.50** | **89.20**/75.77 | **90.72/79.06** |
| RoBERTa-base-ca-v2 | 89.29 | 98.96 | 79.07 | 74.26 | 83.14 | 87.74/72.58 | 88.72/**75.91** | 89.50/76.63 |
| Longformer-base-4096-ca | 88.49 | 98.98 | 78.37 | - | 83.89 | 87.59/72.33 | 88.70/**76.05** | 89.33/77.03 |
| BERTa | 89.76 | 98.96 | 80.19 | 73.65 | 79.26 | 85.93/70.58 | 87.12/73.11 | 89.17/77.14 |
| mBERT | 86.87 | 98.83 | 74.26 | 69.90 | 74.63 | 82.78/67.33 | 86.89/73.53 | 86.90/74.19 |
| XLM-RoBERTa | 86.31 | 98.89 | 61.61 | 70.14 | 33.30 | 86.29/71.83 | 86.88/73.11 | 88.17/75.93 |

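The question-answering columns (VilaQuAD, ViquiQuAD, CatalanQA, XQuAD-ca) report token-overlap F1 and exact match (EM), the two standard SQuAD-style span-extraction metrics. A minimal sketch of how these scores are computed for a single prediction/reference pair (the helper names are illustrative, not part of this repository):

```python
from collections import Counter

def exact_match(prediction: str, truth: str) -> int:
    # 1 if the normalized answer strings are identical, else 0.
    return int(prediction.strip().lower() == truth.strip().lower())

def token_f1(prediction: str, truth: str) -> float:
    # Token-overlap F1: harmonic mean of precision and recall
    # over whitespace tokens, as in the SQuAD evaluation script.
    pred_toks = prediction.strip().lower().split()
    truth_toks = truth.strip().lower().split()
    common = Counter(pred_toks) & Counter(truth_toks)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_toks)
    recall = overlap / len(truth_toks)
    return 2 * precision * recall / (precision + recall)
```

Dataset-level F1/EM, as in the table above, is the average of these per-example scores over the evaluation set.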
## Additional information