Results 1 - 7 of 7
1.
Nat Mater; 21(2): 195-202, 2022 Feb.
Article in English | MEDLINE | ID: mdl-34608285

ABSTRACT

Neuromorphic computing aims at the realization of intelligent systems able to process information similarly to our brain. Brain-inspired computing paradigms have been implemented in crossbar arrays of memristive devices; however, this approach does not emulate the topology and the emergent behaviour of biological neuronal circuits, where the principle of self-organization regulates both structure and function. Here, we report on in materia reservoir computing in a fully memristive architecture based on self-organized nanowire networks. Thanks to the functional synaptic connectivity with nonlinear dynamics and fading memory properties, the designless nanowire complex network acts as a network-wide physical reservoir able to map spatio-temporal inputs into a feature space that can be analysed by a memristive resistive switching memory read-out layer. Computing capabilities, including recognition of spatio-temporal patterns and time-series prediction, show that the emergent memristive behaviour of nanowire networks allows in materia implementation of brain-inspired computing paradigms characterized by a reduced training cost.
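The reservoir idea described here can be sketched in software: a fixed nonlinear recurrent map with fading memory turns an input time series into a feature vector that a simple read-out layer can classify. This is an illustrative emulation (random weights, tanh nonlinearity — my assumptions), not the paper's physical nanowire network:

```python
import math
import random

# Minimal software sketch of reservoir computing. The paper's reservoir is
# a self-organized nanowire network; here a small random recurrent map
# stands in for it. All parameters are illustrative.
random.seed(0)

N = 20                                                     # reservoir size
W_in = [random.uniform(-1, 1) for _ in range(N)]           # input weights
W = [[random.uniform(-0.1, 0.1) for _ in range(N)]         # small recurrent
     for _ in range(N)]                                    # weights -> fading memory

def reservoir_features(u_seq):
    """Map an input time series into the reservoir's final state vector."""
    x = [0.0] * N
    for u in u_seq:
        x = [math.tanh(W_in[i] * u + sum(W[i][j] * x[j] for j in range(N)))
             for i in range(N)]
    return x

# Two different temporal patterns map to separable feature vectors,
# which a simple (e.g. memristive) read-out layer could then classify.
f1 = reservoir_features([1, 0, 1, 0])
f2 = reservoir_features([0, 1, 0, 1])
separation = sum((a - b) ** 2 for a, b in zip(f1, f2))
```

The small recurrent weights keep the state bounded and make old inputs fade, which is the "fading memory" property the abstract attributes to the nanowire network.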


Subject(s)
Nanowires , Neural Networks, Computer , Brain , Neurons/physiology , Nonlinear Dynamics
2.
Nat Commun; 12(1): 5806, 2021 Oct 4.
Article in English | MEDLINE | ID: mdl-34608133

ABSTRACT

Tree-based machine learning techniques, such as Decision Trees and Random Forests, are top performers in several domains as they do well with limited training datasets and offer improved interpretability compared to Deep Neural Networks (DNNs). However, these models are difficult to optimize for fast inference at scale without accuracy loss on von Neumann architectures due to non-uniform memory access patterns. Recently, we proposed a novel analog content addressable memory (CAM) based on emerging memristor devices for fast look-up table operations. Here, we propose for the first time to use the analog CAM as an in-memory computational primitive to accelerate tree-based model inference. We demonstrate an efficient mapping algorithm leveraging the new analog CAM capabilities such that each root-to-leaf path of a Decision Tree is programmed into a row. This new in-memory compute concept enables few-cycle model inference, dramatically increasing throughput by 10³× over conventional approaches.
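The mapping described here — each root-to-leaf path becomes one CAM row of per-feature intervals, checked in parallel at inference time — can be sketched as follows. The example tree and all values are illustrative, not from the paper:

```python
# Hedged sketch of mapping a decision tree onto an analog CAM: each
# root-to-leaf path becomes one row of per-feature [low, high) intervals,
# and inference is a single lookup across all rows.

# Tree: if x0 < 0.5 -> "A"; elif x1 < 0.3 -> "B"; else -> "C"
INF = float("inf")
cam_rows = [
    # (one interval per feature, leaf label)
    ([(-INF, 0.5), (-INF, INF)], "A"),
    ([(0.5, INF), (-INF, 0.3)], "B"),
    ([(0.5, INF), (0.3, INF)], "C"),
]

def cam_match(x):
    """Return the label of the row whose intervals all contain x."""
    for intervals, label in cam_rows:  # hardware evaluates rows in parallel
        if all(lo <= xi < hi for xi, (lo, hi) in zip(x, intervals)):
            return label

label = cam_match([0.7, 0.1])  # follows the x0 >= 0.5, x1 < 0.3 path
```

Because the tree's paths are mutually exclusive, exactly one row matches per input, which is what lets the hardware replace a sequential root-to-leaf traversal with one parallel match.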

3.
Front Neurosci; 15: 709053, 2021.
Article in English | MEDLINE | ID: mdl-34489628

ABSTRACT

One of the main goals of neuromorphic computing is the design and implementation of systems capable of dynamic evolution with respect to their own experience. In biology, synaptic scaling is the homeostatic mechanism that keeps the frequency of neural spikes within stable boundaries for improved learning activity. To introduce such a control mechanism in a hardware spiking neural network (SNN), we present here a novel artificial neuron based on phase change memory (PCM) devices capable of internal regulation via homeostatic and plastic phenomena. We experimentally show that this mechanism increases the robustness of the system, thus optimizing multi-pattern learning under spike-timing-dependent plasticity (STDP). It also improves the continual learning capability of hybrid supervised-unsupervised convolutional neural networks (CNNs), in terms of both resilience and accuracy. Furthermore, the use of neurons capable of self-regulating their firing responsivity as a function of the PCM internal state enables the design of dynamic networks. In this scenario, we propose to use the PCM-based neurons to design bio-inspired recurrent networks for autonomous decision making in navigation tasks. The agent relies on neuronal spike-frequency adaptation (SFA) to explore the environment via penalties and rewards. Finally, we show that the conductance drift of the PCM devices, in contrast to its detrimental role in neural network accelerators, can improve the overall energy efficiency of neuromorphic computing by implementing bio-plausible active forgetting.
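The homeostatic regulation and spike-frequency adaptation described above can be illustrated with a simple leaky integrate-and-fire model whose threshold rises after each spike and relaxes back. This is a behavioral sketch with made-up parameters, not the paper's PCM circuit:

```python
# Illustrative sketch of spike-frequency adaptation: a leaky
# integrate-and-fire neuron whose firing threshold rises after every
# spike (in hardware, a PCM state change) and decays back to baseline,
# keeping the firing rate within stable bounds under constant drive.

def run_neuron(inputs, v_leak=0.9, theta0=1.0, theta_step=0.5,
               theta_decay=0.95):
    v, theta, spikes = 0.0, theta0, []
    for t, i_in in enumerate(inputs):
        v = v_leak * v + i_in            # leaky integration
        if v >= theta:
            spikes.append(t)
            v = 0.0                      # reset membrane
            theta += theta_step          # homeostatic threshold increase
        theta = theta0 + (theta - theta0) * theta_decay  # relax to baseline
    return spikes

adapted = run_neuron([0.6] * 50)                    # with adaptation
fixed   = run_neuron([0.6] * 50, theta_step=0.0)    # fixed threshold
# Under the same constant input, the adapting neuron fires fewer spikes.
```

The same mechanism, read in reverse, gives the "active forgetting" the abstract mentions: a slowly drifting internal state continuously reshapes the neuron's responsivity.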

4.
Nat Commun; 11(1): 1638, 2020 Apr 2.
Article in English | MEDLINE | ID: mdl-32242006

ABSTRACT

A content-addressable memory compares an input search word against all rows of stored words in an array in a highly parallel manner. While providing very powerful functionality for many pattern-matching and search applications, it suffers from large area, cost, and power consumption, which limits its use. Past improvements have used memristors to replace the static random-access memory cell in conventional designs, but they employ similar schemes based only on binary or ternary states for storage and search. We propose a new analog content-addressable memory concept and circuit that overcomes these limitations by utilizing the analog conductance tunability of memristors. Our analog content-addressable memory stores data within the programmable conductance and can take either analog or digital search values as input. Experimental demonstrations, scaled simulations, and analysis show that our analog content-addressable memory can reduce area and power consumption, enabling not only the acceleration of existing applications but also new computing application areas.
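The functional difference from a binary/ternary CAM can be sketched in a few lines: each analog cell stores a continuous range (in hardware, encoded in memristor conductances), and a row matches only if every analog input falls inside its cell's range. Stored ranges and inputs below are illustrative:

```python
# Behavioral sketch of an analog CAM search. Each cell holds a continuous
# [low, high] range rather than a 0/1/don't-care state; a row matches when
# all input values fall inside their cells' ranges.

rows = [
    [(0.0, 0.3), (0.5, 1.0)],   # row 0: two analog cells
    [(0.3, 0.7), (0.0, 0.5)],   # row 1
]

def search(word):
    """Return indices of all matching rows (hardware checks them in parallel)."""
    return [r for r, cells in enumerate(rows)
            if all(lo <= v <= hi for v, (lo, hi) in zip(word, cells))]

matches = search([0.4, 0.2])  # analog input word
```

A conventional CAM would need many binary rows to approximate each continuous range, which is where the area and power savings come from.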

5.
Sci Adv; 6(5): eaay2378, 2020 Jan.
Article in English | MEDLINE | ID: mdl-32064342

ABSTRACT

Machine learning has been attracting attention in recent years as a tool to process the big data generated by the ubiquitous sensors used in daily life. High-speed, low-energy computing machines are in demand to enable real-time artificial-intelligence processing of such data. These requirements challenge current metal-oxide-semiconductor technology, which is limited by Moore's law approaching its end and by the communication bottleneck of the conventional computing architecture. Novel computing concepts, architectures, and devices are thus strongly needed to accelerate data-intensive applications. Here, we show that a cross-point resistive memory circuit with a feedback configuration can train traditional machine learning algorithms, such as linear regression and logistic regression, in just one step by computing the pseudoinverse matrix of the data within the memory. One-step learning is further supported by simulations of the prediction of housing prices in Boston and the training of a two-layer neural network for MNIST digit recognition.
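The "one-step" claim rests on a standard identity: for linear regression, the optimal weights are the pseudoinverse applied to the targets, w = (XᵀX)⁻¹Xᵀy, which the circuit computes physically. A tiny worked example (my own toy data) in software:

```python
# One-step linear regression via the pseudoinverse, computed here in
# software; the paper's cross-point circuit obtains the same result
# physically in a single step. Data are illustrative.

X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]   # bias column + one feature
y = [1.0, 3.0, 5.0]                        # exactly fits y = 1 + 2x

# Normal equations: A = X^T X (2x2), b = X^T y
A = [[sum(r[i] * r[j] for r in X) for j in range(2)] for i in range(2)]
b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(2)]

# Closed-form 2x2 solve of A w = b
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
w = [(A[1][1] * b[0] - A[0][1] * b[1]) / det,
     (A[0][0] * b[1] - A[1][0] * b[0]) / det]
# w recovers [1.0, 2.0], i.e. intercept 1 and slope 2, with no iteration.
```

No gradient descent loop appears anywhere: the entire "training" is one algebraic evaluation, which is exactly what the in-memory circuit exploits.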

6.
Proc Natl Acad Sci U S A; 116(10): 4123-4128, 2019 Mar 5.
Article in English | MEDLINE | ID: mdl-30782810

ABSTRACT

Conventional digital computers can execute advanced operations by a sequence of elementary Boolean functions of 2 or more bits. As a result, complicated tasks such as solving a linear system or a differential equation require a large number of computing steps and extensive use of memory units to store individual bits. To accelerate the execution of such advanced tasks, in-memory computing with resistive memories provides a promising avenue, thanks to analog data storage and physical computation in the memory. Here, we show that a cross-point array of resistive memory devices can directly solve a system of linear equations or find the matrix eigenvectors. These operations are completed in just one step, thanks to physical computing with Ohm's and Kirchhoff's laws and to the negative feedback connection in the cross-point circuit. Algebraic problems are demonstrated in hardware and applied to classical computing tasks, such as ranking webpages and solving the Schrödinger equation in one step.
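The physical principle maps cleanly onto linear algebra: Ohm's law gives I = G·V per device, and Kirchhoff's current law sums currents at each node, so the feedback circuit settles at the voltages V satisfying G·V = I. A small numerical stand-in (illustrative conductances and currents, solved in software rather than by circuit settling):

```python
# Software analogue of the cross-point linear-system solver: the circuit's
# steady state obeys G.V = I, where G is the memristor conductance matrix
# and I the injected currents. Here we solve a 2x2 instance directly.

G = [[2.0, 1.0], [1.0, 3.0]]   # conductance matrix (the stored "A")
I = [5.0, 10.0]                # injected currents (right-hand side "b")

# Closed-form 2x2 solve, standing in for the one-step circuit settling
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
V = [(G[1][1] * I[0] - G[0][1] * I[1]) / det,
     (G[0][0] * I[1] - G[1][0] * I[0]) / det]
# V = [1.0, 3.0]: check 2*1 + 1*3 = 5 and 1*1 + 3*3 = 10.
```

A digital computer reaches V through many multiply-accumulate steps; the circuit reaches it in one settling transient, which is the acceleration claimed above.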

7.
Sci Adv; 4(9): eaat4752, 2018 Sep.
Article in English | MEDLINE | ID: mdl-30214936

ABSTRACT

The human brain is a complex integrated spatiotemporal system, where space (which neuron fires) and time (when a neuron fires) both carry information to be processed by cognitive functions. To parallel the energy efficiency and computing functionality of the brain, methodologies operating over both the space and time domains are thus essential. Implementing spatiotemporal functions within nanoscale devices capable of synaptic plasticity would contribute a significant step toward constructing a large-scale neuromorphic system that emulates the computing and energy performances of the human brain. We present a neuromorphic approach to brain-like spatiotemporal computing using resistive switching synapses. To process the spatiotemporal spike pattern, time-coded spikes are reshaped into exponentially decaying signals that are fed to a McCulloch-Pitts neuron. Recognition of spike sequences is demonstrated after supervised training of a multiple-neuron network with resistive switching synapses. Finally, we show that, due to the sensitivity to precise spike timing, the spatiotemporal neural network is able to mimic the sound azimuth detection of the human brain.
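The preprocessing step described here — time-coded spikes reshaped into exponentially decaying signals, then thresholded by a McCulloch-Pitts neuron — can be sketched directly. Weights, time constant, and spike times below are my own illustrative choices:

```python
import math

# Sketch of the spatiotemporal scheme: each input spike leaves an
# exponentially decaying trace, and a McCulloch-Pitts neuron thresholds
# the weighted sum of the traces at read-out time, so the output depends
# on precise spike timing, not just on which inputs fired.

def trace(spike_time, t, tau=5.0):
    """Exponentially decaying signal left by a spike (zero before it fires)."""
    return math.exp(-(t - spike_time) / tau) if t >= spike_time else 0.0

def mcculloch_pitts(spike_times, weights, t_read, threshold=1.0):
    total = sum(w * trace(ts, t_read) for w, ts in zip(weights, spike_times))
    return 1 if total >= threshold else 0

# Same inputs fire in both cases; only the relative timing differs.
out_sync   = mcculloch_pitts([9.0, 9.5], [0.8, 0.8], t_read=10.0)  # near-coincident
out_spread = mcculloch_pitts([0.0, 9.5], [0.8, 0.8], t_read=10.0)  # far apart
```

This timing sensitivity is the same property the abstract exploits for sound-azimuth detection, where the interaural delay between two spikes carries the spatial information.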


Subject(s)
Models, Neurological , Neural Networks, Computer , Synapses/physiology , Algorithms , Brain/physiology , Humans , Neurons/physiology , Sound , Spatio-Temporal Analysis