Chemformer: a pre-trained transformer for computational chemistry
Author(s) -
Ross Irwin,
Spyridon Dimitriadis,
Jiazhen He,
Esben Jannik Bjerrum
Publication year - 2021
Publication title -
Machine Learning: Science and Technology
Language(s) - English
Resource type - Journals
ISSN - 2632-2153
DOI - 10.1088/2632-2153/ac3ffb
Subject(s) - cheminformatics , discriminative model , computer science , transformer , machine learning , benchmark , artificial intelligence , data mining , bioinformatics
Transformer models coupled with the simplified molecular-input line-entry system (SMILES) have recently proven to be a powerful combination for solving challenges in cheminformatics. These models, however, are often developed specifically for a single application and can be very resource-intensive to train. In this work we present the Chemformer model—a Transformer-based model which can be quickly applied to both sequence-to-sequence and discriminative cheminformatics tasks. Additionally, we show that self-supervised pre-training can improve performance and significantly speed up convergence on downstream tasks. On direct synthesis and retrosynthesis prediction benchmark datasets we publish state-of-the-art results for top-1 accuracy. We also improve on existing approaches for a molecular optimisation task and show that Chemformer can optimise on multiple discriminative tasks simultaneously. Models, datasets and code will be made available after publication.
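To make the abstract's "Transformer + SMILES" pairing concrete, the sketch below shows a minimal regex-based SMILES tokenizer and a deterministic token-masking step of the kind used in BART-style self-supervised pre-training. The token pattern, the `<MASK>` symbol, and the masking schedule are illustrative assumptions for this sketch, not Chemformer's actual implementation (which masks random spans during pre-training).

```python
import re

# Illustrative token pattern: bracketed atoms, two-letter elements,
# ring-bond labels, then single-character atoms and bond/branch symbols.
# This is an assumption for the sketch, not Chemformer's tokenizer.
SMILES_TOKEN_PATTERN = re.compile(
    r"(\[[^\]]+\]|Br|Cl|Si|@@|%\d{2}|[BCNOPSFIbcnops]|[0-9=#$()+\-/\\.@])"
)


def tokenize_smiles(smiles: str) -> list:
    """Split a SMILES string into chemically meaningful tokens."""
    return SMILES_TOKEN_PATTERN.findall(smiles)


def mask_tokens(tokens, mask_every=3, mask_token="<MASK>"):
    """Replace every `mask_every`-th token with a mask symbol.

    Deterministic here so the example is reproducible; real
    self-supervised pre-training masks random token spans and trains
    the model to reconstruct the original sequence.
    """
    return [mask_token if i % mask_every == 0 else t
            for i, t in enumerate(tokens)]


aspirin = "CC(=O)Oc1ccccc1C(=O)O"
tokens = tokenize_smiles(aspirin)
masked = mask_tokens(tokens)
```

The pre-training task then amounts to a sequence-to-sequence problem: the encoder reads the masked token list and the decoder is trained to emit the unmasked original, after which the same encoder-decoder can be fine-tuned on tasks such as retrosynthesis prediction.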