Open Access
Effective Pruning of Binary Activation Neural Networks
Author(s) -
William Severa,
Ryan Dellana,
Craig M. Vineyard
Publication year - 2020
Publication title - OSTI OAI (U.S. Department of Energy Office of Scientific and Technical Information)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.1145/3407197.3407201
Subject(s) - pruning, computer science, enhanced data rates for GSM evolution, inference, artificial intelligence, deep neural networks, binary number, artificial neural network, deep learning, machine learning, edge device, mathematics, cloud computing, operating system, arithmetic, agronomy, biology
Deep learning networks have become a vital tool for image and data processing tasks in deployed and edge applications. Resource constraints, particularly low power budgets, have motivated methods and devices for efficient on-edge inference. Two promising methods are reduced-precision communication networks (e.g., binary activation spiking neural networks) and weight pruning. In this paper, we provide a preliminary exploration of combining these two methods, specifically in-training weight pruning of Whetstone networks, to achieve deep networks with both sparse weights and binary activations.
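To illustrate the general idea the abstract describes, the sketch below combines a hard binary activation (the limit of Whetstone's sharpening process) with magnitude-based weight pruning in plain NumPy. This is a minimal, hypothetical illustration of the two techniques side by side, not the paper's actual Whetstone implementation or training procedure; the function names and the 50% sparsity target are assumptions for the example.

```python
import numpy as np

def binary_activation(x):
    # Hard-threshold activation: outputs are 0 or 1, as in a fully
    # sharpened (binary-communication) network.
    return (x > 0).astype(np.float32)

def magnitude_prune_mask(weights, sparsity):
    # Zero out the fraction `sparsity` of weights with the smallest
    # magnitudes; returns a 0/1 mask of the same shape.
    k = int(weights.size * sparsity)
    if k == 0:
        return np.ones_like(weights)
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return (np.abs(weights) > threshold).astype(weights.dtype)

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16)).astype(np.float32)

# Prune half the weights, then run a layer with binary outputs.
mask = magnitude_prune_mask(W, 0.5)
W_pruned = W * mask

x = rng.normal(size=(4, 8)).astype(np.float32)
h = binary_activation(x @ W_pruned)  # sparse weights, binary activations
```

In an in-training setting, a mask like this would be reapplied after each weight update so pruned connections stay at zero while the surviving weights continue to adapt.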
