Abstract
Multilingual language models have pushed the state of the art in cross-lingual NLP transfer. Most zero-shot cross-lingual transfer approaches, however, use one and the same massively multilingual transformer (e.g., mBERT or XLM-R) to transfer to all target languages, irrespective of their typological, etymological, and phylogenetic relations to other languages. In particular, readily available data and models of resource-rich sibling languages are often ignored. In this work, we empirically show, in a case study for Faroese – a low-resource language from a high-resource language family – that by leveraging phylogenetic information and departing from the ‘one-size-fits-all’ paradigm, one can improve cross-lingual transfer to low-resource languages. In particular, we leverage the abundant resources of other Scandinavian languages (i.e., Danish, Norwegian, Swedish, and Icelandic) for the benefit of Faroese. Our evaluation results show that we can substantially improve transfer performance to Faroese by exploiting data and models of closely related high-resource languages. Further, we release a new web corpus of Faroese, Faroese datasets for named entity recognition (NER) and semantic text similarity (STS), and new language models trained on all Scandinavian languages.
| Original language | English |
|---|---|
| Title | Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa) |
| Editors | Tanel Alumae, Mark Fishel |
| Publisher | University of Tartu Library |
| Publication date | 2023 |
| Pages | 728-737 |
| ISBN (electronic) | 9789916219997 |
| Status | Published - 2023 |
| Event | NoDaLiDa 2023: The 24th Nordic Conference on Computational Linguistics - Tórshavn, Faroe Islands, Denmark. Duration: 22 May 2023 → 24 May 2023. Conference number: 24. https://www.nodalida2023.fo/ |
Conference
| Conference | NoDaLiDa 2023 |
|---|---|
| Number | 24 |
| Location | Faroe Islands |
| Country/Territory | Denmark |
| City | Tórshavn |
| Period | 22/05/2023 → 24/05/2023 |
| Internet address | https://www.nodalida2023.fo/ |
Bibliographic note
Publisher copyright: © 2023 Association for Computational Linguistics.