Energy-Efficient Neuromorphic Classifiers
Authors
Abstract
Neuromorphic engineering combines the architectural and computational principles of systems neuroscience with semiconductor electronics, with the aim of building efficient and compact devices that mimic the synaptic and neural machinery of the brain. The energy consumptions promised by neuromorphic engineering are extremely low, comparable to those of the nervous system. Until now, however, the neuromorphic approach has been restricted to relatively simple circuits and specialized functions, thereby obfuscating a direct comparison of their energy consumption to that used by conventional von Neumann digital machines solving real-world tasks. Here we show that a recent technology developed by IBM can be leveraged to realize neuromorphic circuits that operate as classifiers of complex real-world stimuli. Specifically, we provide a set of general prescriptions to enable the practical implementation of neural architectures that compete with state-of-the-art classifiers. We also show that the energy consumption of these architectures, realized on the IBM chip, is typically two or more orders of magnitude lower than that of conventional digital machines implementing classifiers with comparable performance. Moreover, the spike-based dynamics display a trade-off between integration time and accuracy, which naturally translates into algorithms that can be flexibly deployed for either fast and approximate classifications, or more accurate classifications at the mere expense of longer running times and higher energy costs. This work finally proves that the neuromorphic approach can be efficiently used in real-world applications and has significant advantages over conventional digital devices when energy consumption is considered.
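The trade-off between integration time and accuracy mentioned in the abstract can be illustrated with a toy spike-count readout. The sketch below is a hypothetical rate-based approximation written purely for illustration (the function name, firing rates, time step, and window lengths are assumptions, not the architecture realized on the IBM chip): with short windows the noisy spike counts of similar classes often overlap, while longer windows make the argmax decision reliable at the cost of latency and energy.

import numpy as np

def classify_by_spike_count(rates_hz, t_window, dt=0.001, rng=None):
    # Spike-count readout: the predicted class is the output neuron that
    # fires the most spikes within the integration window t_window (s).
    # Spiking is approximated as a Bernoulli process with probability
    # rate*dt per time step (a crude stand-in for Poisson spiking).
    rng = np.random.default_rng() if rng is None else rng
    rates_hz = np.asarray(rates_hz, dtype=float)
    n_steps = max(1, int(round(t_window / dt)))
    spikes = rng.random((n_steps, rates_hz.size)) < rates_hz * dt
    counts = spikes.sum(axis=0)
    return int(np.argmax(counts)), counts

# Longer integration windows average out spike-count noise, so accuracy
# improves at the cost of longer running time and more emitted spikes.
rates = [40.0, 45.0, 55.0, 42.0]   # hypothetical rates; class 2 is "correct"
for window in (0.01, 0.05, 0.5):   # seconds
    label, counts = classify_by_spike_count(rates, window)
    print(f"T = {window:5.2f} s -> class {label}, spike counts {counts.tolist()}")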
Similar articles
Backpropagation for Energy-Efficient Neuromorphic Computing
Solving real world problems with embedded neural networks requires both training algorithms that achieve high performance and compatible hardware that runs in real time while remaining energy efficient. For the former, deep learning using backpropagation has recently achieved a string of successes across many domains and datasets. For the latter, neuromorphic chips that run spiking neural netwo...
Finding a roadmap to achieve large neuromorphic hardware systems
Neuromorphic systems are gaining increasing importance in an era where CMOS digital computing techniques are reaching physical limits. These silicon systems mimic extremely energy efficient neural computing structures, potentially both for solving engineering applications as well as understanding neural computation. Toward this end, the authors provide a glimpse at what the technology evolution...
Neuromorphic Hardware As Database Co-Processors
Today’s databases excel at processing data using fairly simple operators but are not efficient at executing operators which include pattern matching, speech recognition or other cognitive tasks. The only way to use such operators in data processing today is to simulate spiking neural networks. Neuromorphic hardware is supposed to become ubiquitous in complementing traditional computational infr...
25-1: Pattern Classification with Memristive Crossbar Circuits
Neuromorphic pattern classifiers were implemented, for the first time, using transistor-free integrated crossbar circuits with bilayer metal-oxide memristors. 10×6- and 10×8-crosspoint neuromorphic networks were trained in-situ using a Manhattan-Rule algorithm to separate a set of 3×3 binary images: into 3 classes using the batch-mode training, and into 4 classes using the stochastic-mode trainin...
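The Manhattan-Rule training mentioned in this entry changes every weight by a fixed step whose direction is given only by the sign of the error gradient, so the hardware only needs uniform programming pulses. The following single-layer sketch is an assumption-laden illustration (the tanh response, step size, and toy data are mine, not the circuit-level procedure of the cited work).

import numpy as np

def manhattan_rule_epoch(W, X, targets, step=0.01):
    # One batch-mode epoch of Manhattan-Rule training for a single-layer
    # perceptron: each weight moves by a fixed amount in the direction of
    # the sign of the accumulated error gradient, mimicking identical
    # programming pulses applied to memristive crosspoints.
    outputs = np.tanh(X @ W.T)          # (n_samples, n_outputs) soft responses
    errors = targets - outputs          # per-sample output errors
    grad = errors.T @ X                 # gradient accumulated over the batch
    W += step * np.sign(grad)           # fixed-amplitude ("Manhattan") update
    return W

# Toy usage: 9-pixel (3x3) binary patterns, 3 output classes, +/-1 targets.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(30, 9)).astype(float)
targets = -np.ones((30, 3))
targets[np.arange(30), rng.integers(0, 3, size=30)] = 1.0
W = rng.normal(scale=0.1, size=(3, 9))
for _ in range(50):
    W = manhattan_rule_epoch(W, X, targets)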
Spin Neurons: A Possible Path to Energy-Efficient Neuromorphic Computers
Recent years have witnessed growing interest in the field of brain-inspired computing based on neural-network architectures. In order to translate the related algorithmic models into powerful, yet energy-efficient cognitive-computing hardware, computing-devices beyond CMOS may need to be explored. The suitability of such devices to this field of computing would strongly depend upon how closely ...
Journal: Neural Computation
Volume 28, Issue 10
Publication date: 2016