Open Access
Rademacher Complexity in Simplex/l Set
Author(s) - YenShen Lu
Publication year - 2021
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1827/1/012145
Subject(s) - artificial neural network , computer science , set (abstract data type) , path (computing) , simplex , computational complexity theory , sample complexity , artificial intelligence , theoretical computer science , algorithm , mathematics , combinatorics , computer network , programming language
When a neural network is very large, computing a bound for it becomes difficult, so a “size-independent” bound is what is sought here. This paper follows the path of “Size-Independent Sample Complexity of Neural Networks” and tries to obtain a better expression for the Rademacher complexity of neural networks.
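For context (this is the standard learning-theory definition, not quoted from the paper itself), the empirical Rademacher complexity of a function class $\mathcal{F}$ on a sample $S = (x_1, \ldots, x_m)$ is

\[
\hat{\mathcal{R}}_S(\mathcal{F}) \;=\; \mathbb{E}_{\sigma}\!\left[\, \sup_{f \in \mathcal{F}} \frac{1}{m} \sum_{i=1}^{m} \sigma_i\, f(x_i) \right],
\]

where the $\sigma_i$ are i.i.d. Rademacher variables, uniform on $\{-1, +1\}$. “Size-independent” bounds, in the sense of the cited work, control this quantity through norms of the weight matrices rather than through the width or depth of the network.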
