Deep learning: Computational aspects
Author(s) - Nicholas Polson, Vadim Sokolov
Publication year - 2020
Publication title - Wiley Interdisciplinary Reviews: Computational Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.693
H-Index - 38
eISSN - 1939-0068
pISSN - 1939-5108
DOI - 10.1002/wics.1500
Subject(s) - stochastic gradient descent , artificial intelligence , computer science , deep learning , statistical inference , exploratory data analysis , machine learning , inference , artificial neural network , latent variable , gradient descent , data mining , mathematics , statistics
In this article, we review computational aspects of deep learning (DL). DL uses network architectures consisting of hierarchical layers of latent variables to construct predictors for high‐dimensional input–output models. Training a DL architecture is computationally intensive, and efficient linear algebra libraries are key to both training and inference. Stochastic gradient descent (SGD) optimization and batch sampling are used to learn from massive datasets. This article is categorized under: Statistical Learning and Exploratory Methods of the Data Sciences > Deep Learning; Statistical Learning and Exploratory Methods of the Data Sciences > Modeling Methods; Statistical Learning and Exploratory Methods of the Data Sciences > Neural Networks.
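
The abstract's reference to SGD with batch sampling can be illustrated with a short sketch. Below is a minimal NumPy implementation of mini-batch SGD for a least-squares predictor, assuming a shuffled pass over the data each epoch; the function and variable names are illustrative and do not come from the article.

```python
import numpy as np

def sgd(X, y, lr=0.01, batch_size=32, epochs=10, seed=0):
    """Illustrative mini-batch SGD for least-squares regression."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        idx = rng.permutation(n)              # shuffle indices once per epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Gradient of the mean squared error on this mini-batch
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)
            w -= lr * grad                    # gradient step
    return w

# Usage: recover a known weight vector from noisy synthetic data
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=1000)
print(sgd(X, y))  # approximately [1.0, -2.0, 0.5]
```

The same batching pattern scales to deep networks, where each mini-batch gradient is computed by backpropagation through the layered architecture rather than by the closed-form expression above.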
