Update README.md
README.md CHANGED

@@ -172,13 +172,14 @@ You can finetune this model on your own dataset.
## Benchmarks

### Retrieval

-Evaluated with [MIRACL-ja](https://huggingface.co/datasets/miracl/miracl), [JQaRA](https://huggingface.co/datasets/hotchpotch/JQaRA) and [MLDR-ja](https://huggingface.co/datasets/Shitao/MLDR).
+Evaluated with [MIRACL-ja](https://huggingface.co/datasets/miracl/miracl), [JQaRA](https://huggingface.co/datasets/hotchpotch/JQaRA), [JaCWIR](https://huggingface.co/datasets/hotchpotch/JaCWIR) and [MLDR-ja](https://huggingface.co/datasets/Shitao/MLDR).

-| model | size | MIRACL<br>Recall@5 | JQaRA<br>nDCG@10 | MLDR<br>nDCG@10 |
-|:--:|:--:|:--:|:--:|:----:|
-| mE5-base | 0.3B | 84.2 | 47.2 | 25.4 |
-| GLuCoSE | 0.1B | 53.3 | 30.8 | 25.2 |
+| model | size | MIRACL<br>Recall@5 | JQaRA<br>nDCG@10 | JaCWIR<br>MAP@10 | MLDR<br>nDCG@10 |
+|:--:|:--:|:--:|:--:|:--:|:----:|
+| [mE5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 0.3B | 84.2 | 47.2 | **85.3** | 25.4 |
+| [GLuCoSE](https://huggingface.co/pkshatech/GLuCoSE-base-ja) | 0.1B | 53.3 | 30.8 | 68.6 | 25.2 |
+| [ruri-base](https://huggingface.co/cl-nagoya/ruri-base) | 0.1B | 74.3 | 58.1 | 84.6 | **35.3** |
+| GLuCoSE v2 | 0.1B | **85.5** | **60.6** | **85.3** | 33.8 |

### JMTEB
Evaluated with [JMTEB](https://github.com/sbintuitions/JMTEB).

@@ -187,8 +188,8 @@ Evaluated with [JMTEB](https://github.com/sbintuitions/JMTEB).

| model | size | Class. | Ret. | STS. | Clus. | Pair. | Avg. |
|:--:|:--:|:--:|:--:|:----:|:-------:|:-------:|:------:|
-| mE5-base
-| GLuCoSE | 0.1B | **82.6** | 69.8 | 78.2 | 51.5 | **66.2** | 69.7 |
+| [mE5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 0.3B | 75.1 | 80.6 | 80.5 | **52.6** | 62.4 | 70.2 |
+| [GLuCoSE](https://huggingface.co/pkshatech/GLuCoSE-base-ja) | 0.1B | **82.6** | 69.8 | 78.2 | 51.5 | **66.2** | 69.7 |
| GLuCoSE v2 | 0.1B | 80.5 | **82.8** | **83.0** | 49.8 | 62.4 | **71.7** |
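For context on how retrieval numbers like those above are obtained, the sketch below scores query–passage pairs with the model via sentence-transformers; rankings from such scores feed metrics like Recall@5, nDCG@10 and MAP@10. It is a minimal illustration, not the benchmark harness: the checkpoint id `pkshatech/GLuCoSE-base-ja-v2` and the E5-style `query: ` / `passage: ` prefixes are assumptions to verify against the model card.

```python
# Minimal sketch of query-passage relevance scoring (not the evaluation harness).
# Assumptions: the checkpoint id and the "query: " / "passage: " prefixes.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("pkshatech/GLuCoSE-base-ja-v2")  # assumed repo id

queries = ["query: 日本で一番高い山は？"]
passages = [
    "passage: 富士山は日本で最も高い山である。",
    "passage: 琵琶湖は日本最大の湖である。",
]

# Encode and L2-normalize so the dot product equals cosine similarity.
q_emb = model.encode(queries, normalize_embeddings=True)
p_emb = model.encode(passages, normalize_embeddings=True)

scores = q_emb @ p_emb.T  # shape: (num_queries, num_passages)
print(scores)  # the more relevant passage should score higher
```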