Open Access
Revisiting Multi-Domain Machine Translation
Author(s) - Minh Quang Pham, Josep-Maria Crego, François Yvon
Publication year - 2021
Publication title - Transactions of the Association for Computational Linguistics
Language(s) - English
Resource type - Journals
ISSN - 2307-387X
DOI - 10.1162/tacl_a_00351
Subject(s) - computer science , machine translation , machine learning , artificial intelligence , natural language processing , training set
When building machine translation systems, one often needs to make the best of heterogeneous sets of parallel data in training, and to robustly handle inputs from unexpected domains in testing. This multi-domain scenario has attracted a lot of recent work that falls under the general umbrella of transfer learning. In this study, we revisit multi-domain machine translation, with the aim of formulating the motivations for developing such systems and the associated expectations with respect to performance. Our experiments with a large sample of multi-domain systems show that most of these expectations are hardly met, and suggest that further work is needed to better analyze the current behaviour of multi-domain systems and to make them fully deliver on their promises.
