Neuromorphic Computing denotes a broad area of research that aims to achieve physical means of information processing inspired by biological brains. As such, neuromorphic systems are envisaged as the ideal substrate for implementing artificial neural network concepts. With the rapid pace of development in Deep Learning, the synergy between neuromorphic hardware and neural network research is fundamental to obtaining intelligent systems that can exploit the full potential of learning efficiently.
This talk gives a broad overview of the possibilities of such a synergy. First, we will briefly explore the fundamental differences between neuromorphic and traditional computing; then we will focus on concepts, algorithms, and neural architectures that are amenable to neuromorphic implementation.
2. About me
● Researcher at the Department of Computer Science,
University of Pisa
● Machine Learning, Deep Learning, Neural Networks,
Dynamical Systems
○ Reservoir Computing
○ Deep Randomized Neural Networks
○ Learning in Structured Domains
● IEEE Task Forces
○ Chair of the IEEE Task Force on Reservoir Computing
○ Vice-Chair of the IEEE Task Force on Randomization-Based
Neural Networks and Learning Systems
● Workshops, Tutorials
○ DL in Unconventional Neuromorphic Hardware (IJCNN-21)
○ ML for irregular time-series (ECML PKDD-21)
○ Deep Randomized Neural Networks (AAAI-21)
gallicch@di.unipi.it
5. “Neuromorphic Computing”
Mead, Carver. "Neuromorphic electronic systems." Proceedings
of the IEEE 78.10 (1990): 1629-1636.
[Plot: publication records per year for TOPIC: “Neuromorphic Computing”]
6. Motivations
● The computational brain
● Use custom hardware to implement
neurobiologically inspired systems
● The success of Machine and Deep
Learning
Von Neumann, John. The Computer and the Brain. Yale University Press, 1958.
8. Von Neumann
● Memory is de-localized
● Von Neumann bottleneck
● Moore’s law
● Koomey’s law
[Diagram: Von Neumann architecture — CPU (ALU + CU) connected to Memory, with input and output]
9. Energy efficiency
Marr, Bo, et al. "Scaling energy per operation via an asynchronous pipeline." IEEE Transactions on Very Large Scale Integration (VLSI) Systems 21.1 (2012): 147-151.
12. Quantifying the carbon emissions of ML
Lacoste, Alexandre, et al. "Quantifying the carbon emissions of machine learning." arXiv preprint arXiv:1910.09700 (2019).
https://mlco2.github.io/impact/
15. Motivations
Schuman, Catherine D., et al. "A survey of neuromorphic computing and neural networks in hardware." arXiv preprint arXiv:1705.06963 (2017).
16. vs the Brain…
≈30 PFlops at 10 MW (supercomputer) vs 20 W (brain)
memory and computing are co-located
10¹¹ neurons, 10¹⁵ synapses
≈10,000 synapses/neuron
17. How can we achieve such a goal?
Deep Learning + Physical Devices
19. Elements of Deep Learning
neuron = aggregation + non-linearity
[Diagram: neuron with inputs 𝑥₁, 𝑥₂, …, 𝑥ₙ, weights 𝑤₁, 𝑤₂, …, 𝑤ₙ, summation ∑, and output 𝑦]
synapses = valves for information
spiking neurons: 𝜏ₘ 𝑑𝑢/𝑑𝑡 = −𝑢(𝑡) + 𝑅𝐼(𝑡)
firing-rate neurons: 𝑦 = 𝜎(𝒘ᵀ𝒙)
Maass, Wolfgang. "Networks of spiking neurons: the third generation of neural network models." Neural Networks 10.9 (1997): 1659-1671.
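The two neuron models can be sketched in a few lines of Python (all parameter values here are illustrative, not taken from the talk): a leaky integrate-and-fire neuron integrated with forward Euler, and a firing-rate neuron computing 𝑦 = 𝜎(𝒘ᵀ𝒙).

```python
import numpy as np

# Leaky integrate-and-fire neuron: tau_m * du/dt = -u(t) + R*I(t),
# integrated with forward Euler; a spike is emitted when u crosses
# the threshold, after which u is reset.
tau_m, R, dt = 20.0, 1.0, 1.0        # illustrative units
u_th, u_reset = 1.0, 0.0
u, spikes = 0.0, []
I = 1.5                              # constant input current
for t in range(100):
    u += dt / tau_m * (-u + R * I)   # one Euler step of the membrane ODE
    if u >= u_th:
        spikes.append(t)
        u = u_reset
print("spike times:", spikes)

# Firing-rate neuron: y = sigma(w^T x) with a logistic nonlinearity
w = np.array([0.5, -0.3, 0.8])
x = np.array([1.0, 2.0, 0.5])
y = 1.0 / (1.0 + np.exp(-(w @ x)))
print("rate output:", round(y, 3))
```

Because 𝑅𝐼 = 1.5 exceeds the threshold, the membrane potential charges up and fires periodically; the firing-rate neuron instead produces a single graded output.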
20. Elements of Deep Learning
[Diagram: deep network with stacked layers classifying an image as “cat”; weights trained from the gradients 𝛿𝐿/𝛿𝑤]
Feed-forward: 𝑥 = 𝜎(𝑉𝑢)
Recurrent: 𝑥ₜ = 𝜎(𝑉𝑢ₜ + 𝑊𝑥ₜ₋₁)
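The two update rules can be contrasted in a short sketch (random weights stand in for trained parameters; tanh is an illustrative choice of nonlinearity):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = np.tanh                      # nonlinearity (illustrative choice)

V = rng.normal(0, 0.3, (5, 3))       # input weights
W = rng.normal(0, 0.3, (5, 5))       # recurrent weights

# Feed-forward layer: x = sigma(V u), one static transformation
u = rng.normal(size=3)
x_ff = sigma(V @ u)

# Recurrent layer: x_t = sigma(V u_t + W x_{t-1}) over a sequence;
# the state x carries memory of the past inputs
U = rng.normal(size=(10, 3))         # input sequence of length 10
x = np.zeros(5)
for u_t in U:
    x = sigma(V @ u_t + W @ x)
print(x_ff.shape, x.shape)
```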
21. Neuromorphic chip: CMOS with Memristors
● neurons implemented in CMOS
● the flowing information is
electrical current
● synapses implemented as
memristors
○ nanoscale resistors
○ non-volatile analog conductance
states
[Diagram: memristive crossbar; input voltages 𝑉ᵢ applied on the rows, output currents 𝐼ⱼ collected on the columns]
𝐼ⱼ = ∑ᵢ 𝐺ᵢⱼ𝑉ᵢ
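Mathematically, the crossbar read-out 𝐼ⱼ = ∑ᵢ 𝐺ᵢⱼ𝑉ᵢ is a matrix-vector product, which a small NumPy sketch makes explicit (conductance values are random placeholders):

```python
import numpy as np

# A memristive crossbar sums currents by Kirchhoff's law:
# I_j = sum_i G_ij * V_i, i.e. a matrix-vector product where the
# conductance matrix G plays the role of the synaptic weight matrix.
rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, (4, 3))    # conductances (non-negative): 4 rows, 3 columns
V = np.array([0.2, -0.1, 0.4, 0.3])  # input voltages on the rows

I = G.T @ V                          # output currents on the columns

# Same result written out element-wise, to mirror the formula
I_check = np.array([sum(G[i, j] * V[i] for i in range(4)) for j in range(3)])
assert np.allclose(I, I_check)
print(I)
```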
22. Neuromorphic chip: Spintronics
● magnetic nano-neurons
● synapses implemented as radiowaves
Torrejon, Jacob, et al. "Neuromorphic computing with nanoscale
spintronic oscillators." Nature 547.7664 (2017): 428-431.
Locatelli, Nicolas, Vincent Cros, and Julie Grollier. "Spin-torque
building blocks." Nature materials 13.1 (2014): 11-20.
23. Neuromorphic chip: Photonics
● neurons implemented by optical resonators
● the flow of information is light
● synapses implemented by meshes of interferometers or by the
transmission of optical waveguides
24. Neuromorphic chip: Photonics
De Marinis, Lorenzo, et al.
"Photonic neural networks: a
survey." IEEE Access 7 (2019):
175827-175841.
25. Neuromorphic chip: Photonics
Moughames, Johnny, et al. "3D
printed multimode-splitters for
photonic interconnects." Optical
Materials Express 10.11 (2020):
2952-2961.
27. Mechanical systems
● Neural Networks implemented by physical bodies or soft robots
Hauser, Helmut, et al. "Towards a theoretical foundation for
morphological computation with compliant bodies." Biological
cybernetics 105.5 (2011): 355-370.
Nakajima, Kohei, et al. "Information processing via
physical soft body." Scientific reports 5.1 (2015): 1-11.
28. Biological systems
● Neural Networks implemented on in vitro biological
components
Tanaka, Gouhei, et al. "Recent advances in physical reservoir computing:
A review." Neural Networks 115 (2019): 100-123.
Obien, Marie Engelene J., et al. "Revealing neuronal function through
microelectrode array recordings." Frontiers in neuroscience 8 (2015): 423.
Hafizovic, Sadik, et al. "A CMOS-based microelectrode
array for interaction with neuronal cultures." Journal of
neuroscience methods 164.1 (2007): 93-106.
31. Direct Feedback Alignment
● Biological implausibility
of BP: e.g., it requires
symmetric weights
● Disconnect the
feedback path from the
forward path
● Use random weights to
carry the teaching
signal through a
teaching path
● Effective in training
very deep (>100 layers) nets
Nøkland, Arild. "Direct feedback alignment provides learning in
deep neural networks." Advances in neural information processing
systems. 2016.
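A minimal sketch of the DFA update on a toy XOR task (layer sizes, learning rate, and initialization are illustrative choices, not taken from Nøkland's paper):

```python
import numpy as np

# Direct Feedback Alignment sketch: fixed random feedback matrices B
# carry the output error straight to each hidden layer, replacing the
# symmetric W^T path of backprop.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]]).T   # (2, 4)
Y = np.array([[0., 1., 1., 0.]])                           # XOR targets

sizes = [2, 16, 16, 1]
W = [rng.normal(0, 0.5, (sizes[i + 1], sizes[i])) for i in range(3)]
B = [rng.normal(0, 0.5, (sizes[i + 1], sizes[-1])) for i in range(2)]

sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))

lr, losses = 0.5, []
for step in range(3000):
    h1 = sigmoid(W[0] @ X)               # forward pass
    h2 = sigmoid(W[1] @ h1)
    y  = sigmoid(W[2] @ h2)
    e = y - Y                            # output error
    losses.append(float(np.mean(e ** 2)))
    d3 = e * y * (1 - y)                 # true gradient at the output layer
    d2 = (B[1] @ e) * h2 * (1 - h2)      # DFA signal: random feedback, not W^T
    d1 = (B[0] @ e) * h1 * (1 - h1)
    W[2] -= lr * d3 @ h2.T / 4
    W[1] -= lr * d2 @ h1.T / 4
    W[0] -= lr * d1 @ X.T / 4
print(f"MSE: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Note how d2 and d1 use the fixed random matrices B instead of the transposed forward weights: the feedback path is fully disconnected from the forward path.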
32. Direct Feedback Alignment
Launay, Julien, et al. "Direct
feedback alignment scales to
modern deep learning tasks and
architectures." NeurIPS 2020
33. E-prop
● BPTT is biologically unrealistic
○ e.g., it would require physical
propagation of errors
backwards in time (i.e., it is not
local)
● Basic idea: factorize the
gradients as
𝑑𝐿/𝑑𝑊ⱼᵢ = ∑ₜ 𝐿ⱼ(𝑡) 𝑒ⱼᵢ(𝑡)
● Eligibility traces 𝑒ⱼᵢ(𝑡): what a
synapse remembers of its
activation history
Bellec, Guillaume, et al. "Biologically inspired alternatives
to backpropagation through time for learning in recurrent
neural nets." arXiv preprint arXiv:1901.09049 (2019).
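For a single leaky neuron the factorization is exact and can be computed forward in time, as in this minimal sketch (scalar weight, MSE-style learning signal; all values illustrative):

```python
import numpy as np

# e-prop idea on a single leaky neuron u(t) = alpha*u(t-1) + w*x(t):
# the eligibility trace e(t) = du(t)/dw is itself a leaky accumulation
# of the inputs, computable forward in time with no BPTT. The total
# gradient is then sum_t L(t) * e(t), with L(t) the learning signal.
alpha, w = 0.9, 0.5
rng = np.random.default_rng(0)
x = rng.normal(size=50)               # input signal
target = 0.3
u, e, grad = 0.0, 0.0, 0.0
for t in range(50):
    u = alpha * u + w * x[t]
    e = alpha * e + x[t]              # eligibility trace: du/dw, updated locally
    L = u - target                    # learning signal (dLoss/du for 0.5*(u-target)^2)
    grad += L * e                     # accumulate dLoss/dw online
print("online gradient estimate:", round(grad, 4))
```

For this linear-in-𝑤 case the online estimate coincides with the exact gradient one would obtain by backpropagating through time.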
34. Equilibrium Propagation
Scellier, Benjamin, and Yoshua Bengio. "Equilibrium
propagation: Bridging the gap between energy-based
models and backpropagation." Frontiers in
computational neuroscience 11 (2017): 24.
Laydevant, Jérémie, et al. "Training Dynamical Binary
Neural Networks with Equilibrium
Propagation." Proceedings of the IEEE/CVF
Conference on Computer Vision and Pattern
Recognition. 2021.
35. Sparse Connections
● Synaptic connectivity
influences the dimension
of the representations
● Sparse patterns of
connectivity can be used
to maximize the dimension
● Match degrees of
connectivity in anatomical
observations
Litwin-Kumar, Ashok, et al.
"Optimal degrees of synaptic
connectivity." Neuron 93.5
(2017): 1153-1164.
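As an illustration of the idea (not the paper's exact model), one can measure how representation dimension, here the participation ratio of the covariance eigenvalues, varies with the in-degree K of a random expansion layer:

```python
import numpy as np

# Participation-ratio dimension of a random expansion layer as a
# function of the number K of synaptic inputs per neuron.
# dim = (sum_i lambda_i)^2 / sum_i lambda_i^2 over covariance eigenvalues.
rng = np.random.default_rng(0)

def representation_dim(K, n_in=50, n_out=400, n_samples=2000):
    X = rng.normal(size=(n_samples, n_in))
    W = np.zeros((n_in, n_out))
    for j in range(n_out):                       # each neuron samples K inputs
        idx = rng.choice(n_in, size=K, replace=False)
        W[idx, j] = rng.normal(size=K)
    H = np.maximum(X @ W, 0.0)                   # ReLU responses
    lam = np.clip(np.linalg.eigvalsh(np.cov(H.T)), 0.0, None)
    return lam.sum() ** 2 / (lam ** 2).sum()

dims = {K: representation_dim(K) for K in (1, 5, 25, 50)}
for K, d in dims.items():
    print(f"K={K:2d}  dim={d:6.1f}")
```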
37. Neural ODEs
● Replace the forward Euler
with a SOTA ODE solver
● Both the forward (i.e.,
inference) and the
backward pass (i.e.,
training) can be computed
by a call to an ODE solver
Chen, Ricky TQ, et al. "Neural
ordinary differential equations." arXiv
preprint arXiv:1806.07366 (2018).
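The Euler-vs-solver correspondence can be sketched by integrating the same vector field with forward Euler and with a hand-rolled classic RK4 solver (random weights stand in for a trained network):

```python
import numpy as np

# A residual update h <- h + f(h) is one forward-Euler step of the ODE
# dh/dt = f(h). The Neural ODE view integrates the same vector field
# with a higher-order solver.
rng = np.random.default_rng(0)
W = rng.normal(0, 0.5, (4, 4))
f = lambda h: np.tanh(W @ h)
h0 = rng.normal(size=4)

def euler(h, n):                     # forward Euler, n steps over t in [0, 1]
    dt = 1.0 / n
    for _ in range(n):
        h = h + dt * f(h)
    return h

def rk4(h, n):                       # classic 4th-order Runge-Kutta, same step count
    dt = 1.0 / n
    for _ in range(n):
        k1 = f(h); k2 = f(h + dt/2*k1); k3 = f(h + dt/2*k2); k4 = f(h + dt*k3)
        h = h + dt/6 * (k1 + 2*k2 + 2*k3 + k4)
    return h

ref = euler(h0, 10000)               # fine-grained reference solution
err_euler = np.linalg.norm(euler(h0, 10) - ref)
err_rk4   = np.linalg.norm(rk4(h0, 10) - ref)
print("Euler, 10 steps:", err_euler)
print("RK4,   10 steps:", err_rk4)
```

With the same 10 steps, the higher-order solver tracks the trajectory far more accurately, which is the point of swapping Euler for a modern solver.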
38. Stability
● Forward propagation can be seen as the Euler discretization of
ℎ′(𝑡) = tanh(𝑊ℎ(𝑡)), with ℎ(0) = ℎ₀,
over a time interval 𝑡 ∈ [0, 𝑇]
● The ODE is stable if maxᵢ 𝑅𝑒(𝜆ᵢ(𝐽(𝑡))) ≤ 0
● BUT you want 𝑅𝑒(𝜆ᵢ(𝐽(𝑡))) ≈ 0
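The condition can be checked numerically (a sketch; the antisymmetric parameterization 𝑊 = 𝐴 − 𝐴ᵀ, which pins the eigenvalues to the imaginary axis, is borrowed from the stable-RNN literature rather than from the talk):

```python
import numpy as np

# Stability check for h'(t) = tanh(W h(t)):
# Jacobian J = diag(1 - tanh^2(W h)) @ W; the ODE is locally stable
# when max_i Re(lambda_i(J)) <= 0. An antisymmetric W gives purely
# imaginary eigenvalues, i.e. Re(lambda) ~ 0 -- the critical regime.
rng = np.random.default_rng(0)
A = rng.normal(0, 1.0, (6, 6))
W = A - A.T                          # antisymmetric construction
h = rng.normal(size=6)

J = np.diag(1 - np.tanh(W @ h) ** 2) @ W
lam = np.linalg.eigvals(J)
print("max Re(lambda):", np.max(lam.real))
```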