Event-driven implementation of deep spiking convolutional neural networks for supervised classification using the SpiNNaker neuromorphic platform
Description
Neural networks have enabled great advances in recent times, due mainly to improved parallel computing capabilities that, in accordance with Moore's Law, have reduced the time needed to learn the parameters of complex, multi-layered neural architectures. However, with silicon technology reaching its physical limits, new computing paradigms are needed to increase the power efficiency of learning algorithms, especially for dealing with deep spatio-temporal knowledge in embedded applications. With the goal of mimicking the brain's power efficiency, new hardware architectures such as the SpiNNaker board have been built. Furthermore, recent works have shown that networks using spiking neurons as learning units can match classical neural networks in supervised tasks. In this paper, we show that state-of-the-art models for both the MNIST and the event-based NMNIST digit recognition datasets can be implemented on neuromorphic hardware. We use two approaches: directly converting a classical neural network to its spiking version, and training a spiking network from scratch. In both cases, software simulations and implementations on a SpiNNaker 103 machine were performed. Numerical results approaching the state of the art in digit recognition are presented, and a new method to decrease the spike rate needed for the task is proposed, allowing a significant reduction in the number of spikes (up to 34 times for a fully connected architecture) while preserving the accuracy of the system. With this method, we provide new insights into the capability of networks of spiking neurons to efficiently encode spatio-temporal information.
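To make the first approach concrete, below is a minimal NumPy sketch of the general ANN-to-SNN conversion idea (rate coding with data-based weight normalization and reset-by-subtraction integrate-and-fire neurons). The toy weights, calibration batch, and function names are illustrative assumptions, not the paper's exact procedure or its SpiNNaker/PyNN implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" ReLU layer; in practice the weights come from a trained ANN.
W = rng.normal(0.0, 0.5, size=(10, 784))   # 784 inputs -> 10 output units
x = rng.random((100, 784))                 # calibration batch for normalization

# Data-based weight normalization: scale weights so the largest ReLU
# activation seen on the calibration batch corresponds to at most
# one spike per timestep in the converted layer.
relu_out = np.maximum(0.0, x @ W.T)
W_snn = W / relu_out.max()

def simulate_if_layer(inp, weights, timesteps=100, threshold=1.0):
    """Rate-coded integrate-and-fire layer: the input is applied as a
    constant current, membrane potentials integrate it, and a spike is
    emitted (with reset-by-subtraction) whenever the threshold is crossed."""
    v = np.zeros(weights.shape[0])
    spike_counts = np.zeros(weights.shape[0])
    for _ in range(timesteps):
        v += weights @ inp
        fired = v >= threshold
        spike_counts += fired
        v[fired] -= threshold
    return spike_counts / timesteps        # output firing rates

# The output firing rates approximate the normalized ReLU activations.
rates = simulate_if_layer(x[0], W_snn)
print("SNN output rates :", np.round(rates, 3))
print("ReLU activations :", np.round(np.maximum(0.0, W_snn @ x[0]), 3))
```

On neuromorphic hardware such as SpiNNaker, the same converted weights would instead be loaded into spiking populations (e.g. via PyNN), with inputs delivered as spike trains rather than constant currents.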
Funding
- Consejo Nacional de Ciencia y Tecnología (México) FC2016-1961
- European Union's Horizon 2020 No 824164 HERMES
- Ministerio de Ciencia, Innovación y Universidades TEC2015-63884-C2-1-P
Additional details
- URL: https://idus.us.es/handle//11441/102108
- URN: urn:oai:idus.us.es:11441/102108
- Origin repository: USE