---
license: apache-2.0
---

# TurBLiMP Evaluations

This dataset hosts the TurBLiMP evaluation results for the models in my [Turkish Model Zoo](https://huggingface.co/collections/stefan-it/turkish-language-models-653c3146fbe250285fccec0f).

More about the TurBLiMP benchmark:

> TurBLiMP is the first Turkish benchmark of linguistic minimal pairs, designed to evaluate the linguistic abilities of monolingual and multilingual language models (LMs).
> This benchmark covers 16 core grammatical phenomena in Turkish, with 1,000 minimal pairs per phenomenon.
> Additionally, it incorporates experimental paradigms that examine model performance across different subordination strategies and word order variations.

I've modified the [original](https://github.com/ezgibasar/TurBLiMP/blob/297de13fb7a0ce524fe32e8b175c6b5255d66960/evaluation.py) evaluation script and extended it for the Turkish Model Zoo. My evaluation script can be found [here](own_evaluation.py).
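
The evaluation follows the usual minimal-pair setup: a model is counted as correct when it assigns the grammatical sentence of a pair a higher score than the ungrammatical one. Here is a minimal sketch of that comparison logic, with a stand-in scorer in place of a real masked language model (the sentence labels and scores below are invented for illustration):

```python
from typing import Callable, List, Tuple

def minimal_pair_accuracy(
    pairs: List[Tuple[str, str]],
    score: Callable[[str], float],
) -> float:
    """Fraction of pairs where the grammatical sentence scores higher.

    Each pair is (grammatical, ungrammatical); `score` should return a
    (pseudo-)log-likelihood, so higher means more plausible to the model.
    """
    correct = sum(1 for good, bad in pairs if score(good) > score(bad))
    return correct / len(pairs)

# Stand-in scores for illustration only; the real evaluation scores each
# sentence with a masked language model.
toy_scores = {
    "sentence A (grammatical)": -5.0,
    "sentence A (ungrammatical)": -9.0,
    "sentence B (grammatical)": -6.0,
    "sentence B (ungrammatical)": -4.0,
}
pairs = [
    ("sentence A (grammatical)", "sentence A (ungrammatical)"),
    ("sentence B (grammatical)", "sentence B (ungrammatical)"),
]
acc = minimal_pair_accuracy(pairs, toy_scores.__getitem__)
print(acc)  # → 0.5
```

Per-phenomenon accuracy is then just this quantity computed over that phenomenon's 1,000 pairs.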

# Results

After running the evaluation script, all results can be parsed with this [notebook](ParseEvaluations.ipynb) to print out a nice overview table:

| Phenomenon | [`dbmdz/electra-small-turkish-cased-generator`](https://huggingface.co/dbmdz/electra-small-turkish-cased-generator) | [`dbmdz/electra-base-turkish-cased-generator`](https://huggingface.co/dbmdz/electra-base-turkish-cased-generator) | [`dbmdz/electra-base-turkish-mc4-cased-generator`](https://huggingface.co/dbmdz/electra-base-turkish-mc4-cased-generator) | [`dbmdz/electra-base-turkish-mc4-uncased-generator`](https://huggingface.co/dbmdz/electra-base-turkish-mc4-uncased-generator) | [`dbmdz/bert-base-turkish-cased`](https://huggingface.co/dbmdz/bert-base-turkish-cased) | [`dbmdz/bert-base-turkish-uncased`](https://huggingface.co/dbmdz/bert-base-turkish-uncased) | [`dbmdz/bert-base-turkish-128k-cased`](https://huggingface.co/dbmdz/bert-base-turkish-128k-cased) | [`dbmdz/bert-base-turkish-128k-uncased`](https://huggingface.co/dbmdz/bert-base-turkish-128k-uncased) | [`dbmdz/distilbert-base-turkish-cased`](https://huggingface.co/dbmdz/distilbert-base-turkish-cased) | [`dbmdz/convbert-base-turkish-cased`](https://huggingface.co/dbmdz/convbert-base-turkish-cased) | [`dbmdz/convbert-base-turkish-mc4-cased`](https://huggingface.co/dbmdz/convbert-base-turkish-mc4-cased) | [`dbmdz/convbert-base-turkish-mc4-uncased`](https://huggingface.co/dbmdz/convbert-base-turkish-mc4-uncased) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Anaphor Agreement | 74.1 | 94.3 | 94.3 | 92.8 | 96.7 | 97.3 | 97.3 | 97.7 | 96.9 | 58.1 | 44.3 | 44.6 |
| Argument Str. Tran. | 86.6 | 99.6 | 99.4 | 98.7 | 99.7 | 99.6 | 99.8 | 99.1 | 97.5 | 51.9 | 58.1 | 51.3 |
| Argument Str. Ditr. | 79.3 | 96.1 | 95.5 | 95.2 | 99.8 | 96.1 | 96.1 | 96.1 | 95.4 | 64.6 | 58.6 | 64.5 |
| Binding | 70.7 | 96.2 | 91.4 | 89.6 | 99.9 | 98.5 | 97.7 | 99 | 93 | 89.1 | 49.4 | 78.4 |
| Determiners | 91.8 | 99.3 | 98.2 | 99.1 | 99.9 | 100 | 99 | 99.3 | 82.9 | 0 | 0 | 0 |
| Ellipsis | 10.6 | 49.7 | 46.3 | 49 | 87.4 | 73.6 | 96.6 | 87.5 | 13.6 | 54.7 | 57.8 | 67.9 |
| Irregular Forms | 98.7 | 97.9 | 99 | 99.8 | 98.8 | 100 | 99.9 | 99.6 | 94.1 | 82.9 | 86.6 | 95.2 |
| Island Effects | 39.1 | 35.3 | 41.8 | 44 | 49.4 | 39.8 | 60.9 | 51.2 | 47.4 | 96.7 | 99.4 | 100 |
| Nominalization | 90 | 96.6 | 97 | 95.4 | 97.4 | 97 | 98.9 | 97.4 | 95.6 | 55.2 | 59.2 | 60.6 |
| NPI Licensing | 90.9 | 96.1 | 95 | 98 | 98.2 | 97.6 | 97.2 | 95 | 92.1 | 82.1 | 95.6 | 71.9 |
| Passives | 100 | 91.2 | 93.6 | 91.6 | 82.2 | 78.1 | 84.4 | 81.3 | 98.8 | 100 | 100 | 99 |
| Quantifiers | 97.9 | 98 | 98 | 97.6 | 95.7 | 94.6 | 98 | 98.4 | 98.4 | 99 | 99 | 99 |
| Relative Clauses | 79.9 | 90.7 | 92 | 91.6 | 97.7 | 97.5 | 97 | 98.5 | 92 | 53.4 | 53.7 | 56.9 |
| Scrambling | 99.5 | 100 | 100 | 99.8 | 100 | 100 | 99.6 | 100 | 99.8 | 38.7 | 59.3 | 63.3 |
| Subject Agreement | 82.8 | 99 | 97.2 | 96.1 | 98.3 | 99.2 | 99.1 | 98.8 | 97 | 47.7 | 43.9 | 56.4 |
| Suspended Affixation | 97.5 | 99 | 99.1 | 98.8 | 100 | 100 | 100 | 100 | 100 | 25.4 | 12.8 | 23.2 |
| Model Average | 80.6 | 89.9 | 89.9 | 89.8 | 93.8 | 91.8 | **95.1** | 93.7 | 87.2 | 62.5 | 61.1 | 64.5 |
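
An overview table of this shape can be assembled from per-model accuracy dictionaries. Here is a minimal sketch, assuming the evaluation output has already been loaded into a phenomenon-to-accuracy mapping per model; the model names and numbers below are invented, and the notebook's actual file handling may differ:

```python
# Hypothetical per-model results: phenomenon -> accuracy in percent.
results = {
    "model-a": {"Anaphor Agreement": 96.7, "Binding": 99.9},
    "model-b": {"Anaphor Agreement": 97.7, "Binding": 99.0},
}

models = list(results)
phenomena = sorted(next(iter(results.values())))

# Build the Markdown table row by row.
header = "| Phenomenon | " + " | ".join(models) + " |"
sep = "|---" * (len(models) + 1) + "|"
rows = [
    f"| {p} | " + " | ".join(f"{results[m][p]:.1f}" for m in models) + " |"
    for p in phenomena
]
# Append the per-model average over all phenomena as the last row.
avg = [sum(results[m].values()) / len(results[m]) for m in models]
rows.append("| Model Average | " + " | ".join(f"{a:.1f}" for a in avg) + " |")

print("\n".join([header, sep, *rows]))
```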

# Summary

The [TurBLiMP paper](https://arxiv.org/abs/2506.13487) used the [`dbmdz/bert-base-turkish-128k-uncased`](https://huggingface.co/dbmdz/bert-base-turkish-128k-uncased) model, which showed strong performance.

My evaluation here shows that [`dbmdz/bert-base-turkish-128k-cased`](https://huggingface.co/dbmdz/bert-base-turkish-128k-cased) performs even better on the TurBLiMP benchmark.