Open Access
COMPUTATION COMPLEXITY OF DEEP RELU NEURAL NETWORKS IN HIGH-DIMENSIONAL APPROXIMATION
Author(s) -
Dinh Dũng,
Van Kien Nguyen,
Mai Xuan Thao
Publication year - 2021
Publication title - Journal of Computer Science and Cybernetics (Vietnam Academy of Science and Technology)
Language(s) - English
Resource type - Journals
eISSN - 2815-5939
pISSN - 1813-9663
DOI - 10.15625/1813-9663/37/3/15902
Subject(s) - unit cube , computation , artificial neural network , smoothness , dimension , deep neural networks , mathematics , algorithm , computer science , combinatorics , discrete mathematics , artificial intelligence , mathematical analysis
The purpose of the present paper is to study the computation complexity of deep ReLU neural networks for approximating functions in Hölder-Nikol'skii spaces of mixed smoothness $H_\infty^\alpha(\mathbb{I}^d)$ on the unit cube $\mathbb{I}^d:=[0,1]^d$. In this context, for any function $f\in H_\infty^\alpha(\mathbb{I}^d)$, we explicitly construct nonadaptive and adaptive deep ReLU neural networks whose output approximates $f$ with a prescribed accuracy $\varepsilon$, and we prove dimension-dependent bounds, explicit in $d$ and $\varepsilon$, for the computation complexity of this approximation, characterized by the size and the depth of the network. Our results show the advantage of the adaptive method of approximation by deep ReLU neural networks over the nonadaptive one.
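The abstract does not reproduce the paper's explicit constructions or bounds. As a minimal, hypothetical one-dimensional illustration of the underlying principle (a shallow ReLU network exactly realizes a piecewise-linear interpolant, so a prescribed accuracy $\varepsilon$ for a Lipschitz target, i.e. $\alpha = 1$, costs on the order of $\varepsilon^{-1}$ hidden units), one might write the following Python sketch. The function pwl_relu_net, its parameters, and the choice of target are illustrative assumptions, not the paper's mixed-smoothness, $d$-dimensional construction.

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def pwl_relu_net(f, n):
    # One-hidden-layer ReLU network realizing the piecewise-linear
    # interpolant of f on the uniform grid {0, 1/n, ..., 1} of [0, 1].
    # (Hypothetical helper for illustration; not the paper's construction.)
    xs = np.linspace(0.0, 1.0, n + 1)
    ys = f(xs)
    slopes = (ys[1:] - ys[:-1]) * n          # slope on each subinterval
    # The interpolant equals f(0) + sum_k c_k * relu(x - x_k), where c_k
    # is the slope change at knot x_k (and c_0 is the initial slope).
    coeffs = np.concatenate(([slopes[0]], np.diff(slopes)))
    knots = xs[:-1]

    def net(x):
        # Hidden layer: one ReLU unit per knot; linear output layer.
        hidden = relu(np.subtract.outer(np.atleast_1d(x), knots))
        return ys[0] + hidden @ coeffs

    return net

# Illustrative usage: a grid of width eps gives uniform accuracy O(eps)
# for a Lipschitz target, using O(1/eps) hidden units.
eps = 1e-3
n = int(np.ceil(1.0 / eps))
net = pwl_relu_net(np.sin, n)
x = np.linspace(0.0, 1.0, 2000)
print(np.max(np.abs(net(x) - np.sin(x))))   # well below eps

This nonadaptive sketch uses one fixed grid for the whole domain; the adaptive method favored by the paper's results would instead allocate approximation resources according to the local behavior of $f$, which is where the complexity advantage reported in the abstract arises.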
