Open Access
Improving representations of genomic sequence motifs in convolutional networks with exponential activations
Author(s) - Peter K. Koo, Matt Ploenzke
Publication year - 2021
Publication title - Nature Machine Intelligence
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 4.894
H-Index - 16
ISSN - 2522-5839
DOI - 10.1038/s42256-020-00291-x
Subject(s) - interpretability, convolutional neural network, computer science, artificial intelligence, robustness (evolution), exponential growth, sequence motif, machine learning, pattern recognition (psychology), mathematics, biology, dna, genetics, gene, mathematical analysis
Deep convolutional neural networks (CNNs) trained on regulatory genomic sequences tend to build representations in a distributed manner, making it a challenge to extract learned features that are biologically meaningful, such as sequence motifs. Here we perform a comprehensive analysis on synthetic sequences to investigate the role that CNN activations play in model interpretability. We show that applying an exponential activation to first-layer filters consistently leads to interpretable and robust representations of motifs compared with other commonly used activations. Strikingly, we demonstrate that better test performance does not necessarily imply more interpretable representations with attribution methods. We find that CNNs with exponential activations significantly improve the efficacy of recovering biologically meaningful representations with attribution methods. We demonstrate that these results generalise to real DNA sequences across several in vivo datasets. Together, this work demonstrates how a small modification to existing CNNs, i.e. using exponential activations in the first layer, can significantly improve the robustness and interpretability of learned representations, both directly in convolutional filters and indirectly with attribution methods.
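For concreteness, below is a minimal sketch of the modification the abstract describes: a first convolutional layer whose outputs pass through an exponential activation, with the rest of the network left standard. The framework (PyTorch) and the hyperparameters (filter count, kernel size, pooling) are illustrative assumptions for this sketch, not the authors' published configuration.

```python
import torch
import torch.nn as nn

class ExpActivation(nn.Module):
    """Exponential activation: f(x) = exp(x)."""
    def forward(self, x):
        return torch.exp(x)

class MotifCNN(nn.Module):
    """Toy 1D CNN for one-hot DNA of shape (batch, 4, length).

    Only the first layer uses the exponential activation, as the
    abstract describes; filter count and kernel size are illustrative.
    """
    def __init__(self, num_filters=32, kernel_size=19, num_classes=1):
        super().__init__()
        self.conv1 = nn.Conv1d(4, num_filters, kernel_size, padding="same")
        self.act1 = ExpActivation()          # exponential first-layer activation
        self.pool = nn.AdaptiveMaxPool1d(1)  # global max pool over sequence
        self.fc = nn.Linear(num_filters, num_classes)

    def forward(self, x):
        z = self.act1(self.conv1(x))
        z = self.pool(z).squeeze(-1)         # (batch, num_filters)
        return self.fc(z)

# Usage: score a batch of 8 sequences of length 200
x = torch.randn(8, 4, 200)  # stand-in for one-hot-encoded DNA
logits = MotifCNN()(x)
print(logits.shape)  # torch.Size([8, 1])
```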
