Open Access
DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome
Author(s) -
Yanrong Ji,
Zhihan Zhou,
Han Liu,
Ramana V. Davuluri
Publication year - 2021
Publication title -
Bioinformatics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.599
H-Index - 390
eISSN - 1367-4811
pISSN - 1367-4803
DOI - 10.1093/bioinformatics/btab083
Subject(s) - encoder , computer science , transformer , genome , dna , natural language processing , genetics , biology , gene , engineering , electrical engineering , operating system , voltage
Deciphering the language of non-coding DNA is one of the fundamental problems in genome research. The gene regulatory code is highly complex due to the existence of polysemy and distant semantic relationships, which previous informatics methods often fail to capture, especially in data-scarce scenarios.
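The analogy to natural language rests on treating DNA as a sequence of tokens. DNABERT is known to represent a sequence as overlapping k-mers (k = 3 to 6) before feeding it to the Transformer encoder; the sketch below illustrates that tokenization step only (the function name `kmer_tokenize` is illustrative, not from the paper's code):

```python
def kmer_tokenize(seq: str, k: int = 6) -> list[str]:
    """Split a DNA sequence into overlapping k-mer tokens.

    Each position i yields the substring seq[i:i+k], so a sequence
    of length L produces L - k + 1 tokens, mirroring the sliding-window
    k-mer vocabulary used by DNABERT-style models.
    """
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]


# Example: an 8-bp sequence yields three overlapping 6-mers.
print(kmer_tokenize("ATGCGTAC", k=6))  # ['ATGCGT', 'TGCGTA', 'GCGTAC']
```

Overlapping (rather than disjoint) k-mers preserve local context around every base, which is one way a model can pick up the "polysemy" of regulatory elements whose meaning depends on surrounding sequence.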
