TY - GEN
T1 - SpikeDyn: A Framework for Energy-Efficient Spiking Neural Networks with Continual and Unsupervised Learning Capabilities in Dynamic Environments
T2 - 58th ACM/IEEE Design Automation Conference, DAC 2021
AU - Putra, Rachmad Vidya Wicaksana
AU - Shafique, Muhammad
N1 - Funding Information:
This work was partly supported by the Indonesia Endowment Fund for Education (LPDP) Program from the Ministry of Finance, Indonesia. This work was also jointly supported by the NYUAD Center for Interacting Urban Networks (CITIES), funded by Tamkeen under the NYUAD Research Institute Award CG001 and by the Swiss Re Institute under the Quantum Cities™ initiative, and the Center for CyberSecurity (CCS), funded by Tamkeen under the NYUAD Research Institute Award G1104.
Publisher Copyright:
© 2021 IEEE.
PY - 2021/12/5
Y1 - 2021/12/5
AB - Spiking Neural Networks (SNNs) bear the potential for efficient unsupervised and continual learning because of their biological plausibility, but their complexity still poses a serious research challenge for enabling their energy-efficient design in resource-constrained scenarios (e.g., embedded systems and IoT-Edge devices). We propose SpikeDyn, a comprehensive framework for energy-efficient SNNs with continual and unsupervised learning capabilities in dynamic environments, covering both the training and inference phases. This is achieved through multiple diverse mechanisms: 1) reduction of neuronal operations, by replacing the inhibitory neurons with direct lateral inhibition; 2) a memory- and energy-constrained SNN model search algorithm that employs analytical models to estimate the memory footprint and energy consumption of different candidate SNN models and selects a Pareto-optimal SNN model; and 3) a lightweight continual and unsupervised learning algorithm that employs adaptive learning rates, an adaptive membrane threshold potential, weight decay, and a reduction of spurious updates. Our experimental results show that, for a network with 400 excitatory neurons, SpikeDyn reduces energy consumption on average by 51% for training and by 37% for inference, compared to the state-of-the-art. Due to the improved learning algorithm, SpikeDyn provides on average a 21% accuracy improvement over the state-of-the-art for classifying the most recently learned task, and an 8% improvement on average for the previously learned tasks.
KW - complexity
KW - continual learning
KW - embedded systems
KW - energy-efficiency
KW - SNNs
KW - Spiking neural networks
KW - unsupervised learning
UR - http://www.scopus.com/inward/record.url?scp=85119434845&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85119434845&partnerID=8YFLogxK
U2 - 10.1109/DAC18074.2021.9586281
DO - 10.1109/DAC18074.2021.9586281
M3 - Conference contribution
AN - SCOPUS:85119434845
T3 - Proceedings - Design Automation Conference
SP - 1057
EP - 1062
BT - 2021 58th ACM/IEEE Design Automation Conference, DAC 2021
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 5 December 2021 through 9 December 2021
ER -