Open Access
Improving deep convolutional neural networks with mixed maxout units
Author(s) -
Haitao Zhao,
Fu-xian Liu,
Long-yue Li
Publication year - 2017
Publication title -
PLOS ONE
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.99
H-Index - 332
ISSN - 1932-6203
DOI - 10.1371/journal.pone.0180049
Subject(s) - pooling , convolutional neural network , computer science , feature (linguistics) , artificial intelligence , subspace topology , pattern recognition (psychology) , deep learning , machine learning , feature learning , artificial neural network , philosophy , linguistics
Motivated by two observations about maxout-unit-based deep Convolutional Neural Networks (CNNs), namely that non-maximal features are discarded and that pooling over the feature-mapping subspace alone is insufficient, we present a mixed variant of the recently introduced maxout unit, which we call the mixout unit. Specifically, we compute the exponential (softmax) probabilities of the feature mappings obtained by applying different convolutional transformations to the same input, and then compute the expected value of the mappings weighted by those probabilities. Moreover, we introduce a Bernoulli distribution to balance the maximum value against the expected value of the feature-mapping subspace. Finally, we design a simple model to verify the pooling ability of mixout units, and a mixout-unit-based Network-in-Network (NiN) model to analyze the feature learning ability of mixout models. The results suggest that the proposed units improve pooling ability and that mixout models achieve better feature learning and classification performance.
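The mixing rule described in the abstract can be sketched as follows. This is a minimal NumPy illustration of the idea as stated there (softmax-weighted expectation over k candidate feature mappings, gated against the maximum by a Bernoulli draw); the function name, the mixing probability `p`, and the use of a single gate per call are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def mixout(features, p=0.5, rng=None):
    """Mixed maxout over k candidate feature mappings.

    features: array of shape (k, ...) holding k feature mappings produced
        by applying k different convolutional transformations to one input.
    p: assumed Bernoulli parameter; with probability p the unit outputs the
        elementwise maximum, otherwise the softmax-weighted expectation.
    """
    rng = np.random.default_rng() if rng is None else rng

    # Exponential (softmax) probabilities over the k mappings,
    # shifted by the max for numerical stability.
    e = np.exp(features - features.max(axis=0, keepdims=True))
    probs = e / e.sum(axis=0, keepdims=True)

    expected = (probs * features).sum(axis=0)  # expectation under softmax weights
    maximum = features.max(axis=0)             # ordinary maxout output

    # Bernoulli gate balancing the maximum against the expected value.
    return maximum if rng.random() < p else expected
```

With `p=1` the unit reduces to a standard maxout (pure max); with `p=0` it always returns the softmax expectation, so non-maximal features still contribute to the output.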
