Cross-lingual transfer of sentiment classifiers
Author(s) - Marko Robnik-Šikonja, Kristjan Reba, Igor Mozetič
Publication year - 2021
Publication title - Slovenščina 2.0
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.165
H-Index - 1
ISSN - 2335-2736
DOI - 10.4312/slo2.0.2021.1.1-25
Subject(s) - computer science , natural language processing , artificial intelligence , word embeddings , vector space , transfer learning , language model , sentiment analysis , linguistics
Word embeddings represent words in a numeric space so that semantic relations between words are represented as distances and directions in the vector space. Cross-lingual word embeddings transform the vector spaces of different languages so that similar words are aligned. This is done either by mapping one language's vector space onto the vector space of another language or by constructing a joint vector space for multiple languages. Cross-lingual embeddings can be used to transfer machine learning models between languages, thereby compensating for insufficient data in less-resourced languages. We use cross-lingual word embeddings to transfer machine learning prediction models for Twitter sentiment between 13 languages. We focus on two transfer mechanisms that have recently shown superior transfer performance. The first mechanism trains models on the joint numerical space shared by many languages, as implemented in the LASER library. The second mechanism uses large pretrained multilingual BERT language models. Our experiments show that the transfer of models between similar languages is sensible, even with no target-language data. The performance of cross-lingual models obtained with multilingual BERT and with the LASER library is comparable, and the differences are language-dependent. Transfer with CroSloEngual BERT, pretrained on only three languages, is superior on these and some closely related languages.
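
The two transfer mechanisms described in the abstract can be sketched in code. Below is a minimal sketch of the LASER-based approach: sentences from different languages are embedded into one shared vector space, so a classifier trained on source-language embeddings can be applied directly to target-language embeddings. It assumes the `laserembeddings` Python package (models downloaded beforehand with `python -m laserembeddings download-models`) and scikit-learn; the toy sentences and labels below are illustrative placeholders, not the paper's Twitter corpus.

```python
# Zero-shot cross-lingual sentiment transfer via the joint LASER space:
# train on English embeddings, predict on Slovene embeddings.
from laserembeddings import Laser
from sklearn.linear_model import LogisticRegression

laser = Laser()

# Source-language (English) training data -- placeholders for real tweets.
train_texts = ["I love this!", "This is terrible.",
               "Great news today.", "What an awful day."]
train_labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

# Target-language (Slovene) test data, never seen during training.
test_texts = ["To je odlično!", "Zelo slaba izkušnja."]

# Both languages are mapped into the same 1024-dimensional LASER space.
X_train = laser.embed_sentences(train_texts, lang="en")
X_test = laser.embed_sentences(test_texts, lang="sl")

# The classifier never sees any target-language labels: zero-shot transfer.
clf = LogisticRegression(max_iter=1000).fit(X_train, train_labels)
print(clf.predict(X_test))
```

The second mechanism replaces fixed embeddings with fine-tuning of a pretrained multilingual BERT model on source-language data. A hedged outline using the Hugging Face `transformers` library (not necessarily the authors' exact setup) follows; the CroSloEngual BERT checkpoint published by the EMBEDDIA project could be substituted for the model name to reproduce the trilingual variant.

```python
# Outline of the mBERT mechanism: fine-tune on source-language sentiment
# data, then classify target-language text with the same model.
# Assumes `transformers` and PyTorch; the fine-tuning loop and the actual
# Twitter data are omitted for brevity.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=2
)

# ... fine-tune `model` here on source-language (e.g. English) tweets ...

# After fine-tuning, the same model scores text in any of mBERT's languages.
inputs = tokenizer("To je odlično!", return_tensors="pt")  # Slovene input
with torch.no_grad():
    logits = model(**inputs).logits
print("positive" if logits.argmax(-1).item() == 1 else "negative")
```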
