Cross-Language Language Model (XLM)
A Cross-Language Language Model (XLM) is a language model that is jointly pretrained on text from multiple languages, so that a single model learns cross-lingual representations that transfer across languages.
- Example(s): XLM (Lample and Conneau, 2019), XLM-R (Conneau et al., 2020), mBERT (Devlin et al., 2018).
- See: Mono-Lingual Language Model, Masked Language Modeling.
References
2020
- (Conneau et al., 2020) ⇒ Alexis Conneau, Kartikay Khandelwal, Naman Goyal, Vishrav Chaudhary, Guillaume Wenzek, Francisco Guzmán, Edouard Grave, Myle Ott, Luke Zettlemoyer, and Veselin Stoyanov. (2020). “Unsupervised Cross-lingual Representation Learning at Scale.” In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, (ACL-2020).
- QUOTE: ... The goal of this paper is to improve cross-lingual language understanding (XLU), by carefully studying the effects of training unsupervised cross-lingual representations at a very large scale. We present XLM-R, a transformer-based multilingual masked language model pre-trained on text in 100 languages, which obtains state-of-the-art performance on cross-lingual classification, sequence labeling and question answering. Multilingual masked language models (MLM) like mBERT (Devlin et al., 2018) and XLM (Lample and Conneau, 2019) have pushed the state-of-the-art on cross-lingual understanding tasks by jointly pretraining large Transformer models (Vaswani et al., 2017) on many languages. These models allow for effective cross-lingual transfer, as seen in a number of benchmarks including cross-lingual natural language inference (Bowman et al., 2015; Williams et al., 2017; Conneau et al., 2018), question answering (Rajpurkar et al., 2016; Lewis et al., 2019), and named entity recognition (Pires et al., 2019; Wu and Dredze, 2019). However, all of these studies pre-train on Wikipedia, which provides a relatively limited scale especially for lower resource languages. ...
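The excerpt above describes pretraining a single masked language model on text from many languages. As a minimal usage sketch (assuming the Hugging Face transformers library and the publicly released xlm-roberta-base checkpoint, neither of which is part of this article), the example below loads such a cross-language model and fills masked tokens in two languages; it illustrates the shared cross-lingual representation space, not the pretraining procedure itself:
```python
from transformers import pipeline

# Load a publicly released cross-lingual masked language model.
# xlm-roberta-base is the XLM-R base checkpoint (Conneau et al., 2020),
# pretrained with the masked language modeling objective on text in ~100 languages.
fill_mask = pipeline("fill-mask", model="xlm-roberta-base")

# The same model completes masked tokens in different languages,
# since all languages share one vocabulary and one set of parameters.
print(fill_mask("The capital of France is <mask>."))      # English input
print(fill_mask("La capitale de la France est <mask>."))  # French input
```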