These slides are shared on behalf of Prof. Jörg Conradt, director of the Neuroscientific System Theory (NST) group at the Technical University of Munich (TUM).
https://www.nst.ei.tum.de/team/jorg-conradt/
Traditional "frame-based" cameras offer high-resolution, crisp images at the cost of significant amounts of data that must be processed for scene understanding. So-called "event-based vision sensors", inspired by the human eye, respond asynchronously to local changes in light intensity and thus offer high-speed, low-latency visual perception at drastically reduced data rates. Digital Product School of UnternehmerTUM is delighted that our invited speaker introduced this novel sensing technology and embedded event-based vision sensors, and outlined possible application areas ranging from autonomous robotics to wearable electronics.
Neuro vision: Prof. Conradt @ Digital Product School of UnternehmerTUM
1. Event-based high-speed vision for technical applications
http://www.nst.ei.tum.de
Lukas Everding, Mohsen Firouzi, Michael Lutter, Florian Mirus, Nicolai Waniek, Zied Tayeb
Dr. Cristian Axenie, Viviane Ghaderi, Ph.D., Dr. Christoph Richter
Jörg Conradt
Neuroscientific System Theory
ECE Competence Center NeuroEngineering
Chair of Automatic Control Engineering
Technische Universität München
2. Neuromorphic Engineering
Neuro ("to do with neurons", i.e. neurally inspired) + morphic ("structure or form")
Discovering key principles by which brains work
and implementing them in technical systems
that intelligently interact with their environment.
Examples:
• Analog VLSI circuits to implement neural processing
• Sensors, such as silicon retinae or cochleae
• Distributed adaptive control, e.g. CPG-based locomotion
• Massively parallel self-growing computing architectures
• Neuro-electronic implants, rehabilitation
3. Computation in Brains ... and in Machines
[Figure: number of neurons per organism on a log scale from 10^0 to 10^12: Sponge, C. elegans, Zebrafish, Honeybee, Frog, Mouse, Cat, Chimpanzee, Elephant, Human]
Getting to know your Brain
• 1.3 kg, about 2% of body weight
• 10^11 neurons
• Neuron growth: 250,000 / min (early pregnancy)
  ... but also loss, ~1 neuron / second
"Operating Mode" of Neurons
• Analog leaky integration in the soma
• Digital pulses (spikes) along neurites
• 10^14 stochastic synapses
• Typical operating "frequency":
  ≤ 100 Hz, typically ~10 Hz, asynchronous
Getting to know your Computer's CPU
• 50 g, largely irrelevant
• 10^10 transistors (SPARC M7, 2015)
• Ideally no modification over lifetime
"Operating Mode" of CPUs
• Digital Boolean logic processing
• Digital signal propagation
• Reliable remote storage of data
• Typical operating frequency:
  several GHz, synchronous
4. http://www.nst.ei.tum.de
Real-Time Neuronal Information Processing in Closed-Loop Systems
• Event-Based Neuromorphic Vision
• Information Processing in Distributed Neuronal Circuits
• Neuromorphic Real-Time Computing and Control
• Self-Construction and Organization of Neuronal Circuits
5. The Neuromorphic Sensors Group
Institute of Neuroinformatics, ETH / University of Zurich
Retina-Inspired Dynamic Vision Sensor (DVS128)
Sensor Features
• Pixels signal temporal changes of illumination ("events")
• Asynchronous updates (no image frames)
• Few microseconds latency, up to 1 Mevents/sec
Advantages
• Sparse data stream (~200 kEvents/sec)
• Ultra-low response time (~15 µs)
• Local contrast adjustment per pixel:
  ~120 dB dynamic illumination range
[Figure: conventional video camera frame vs. DVS reconstruction; individual events plotted by pixel y over time in ms; a code sketch of the event format follows below]
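To make the event-based data format concrete, here is a minimal Python sketch. The names and structure are illustrative assumptions, not the actual eDVS/DVS128 driver API: each event carries a pixel coordinate, a polarity, and a microsecond timestamp, and frames only ever arise if we render events for human viewing.

```python
# A minimal sketch, assuming illustrative names (NOT the real driver API).
from dataclasses import dataclass

import numpy as np


@dataclass
class DVSEvent:
    x: int         # pixel column, 0..127 on a DVS128
    y: int         # pixel row, 0..127
    polarity: int  # +1 = local intensity increase, -1 = decrease
    t_us: int      # timestamp in microseconds


def accumulate(events, resolution=128, window_us=10_000):
    """Render the last `window_us` microseconds of events into a frame-like
    image. This is only for visualization; the sensor never produces frames,
    and downstream processing consumes the events one by one."""
    img = np.zeros((resolution, resolution))
    if not events:
        return img
    t_end = events[-1].t_us
    for ev in events:
        if t_end - ev.t_us <= window_us:
            img[ev.y, ev.x] += ev.polarity
    return img
```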
6. Available DVS Sensors
USB sensors: DVS128, DAVIS240, DAVIS640
Embedded version with improved size and weight, developed at NST, for real-time embedded vision and autonomous motor control
[Photos: sensors shown at scale, 50 mm / 80 mm]
Conradt, J.: Event-Based Vision for Autonomous Robotics. IEEE Intl. Conf. on Robotics and Biomimetics (ROBIO), 2015, p. 1858-63.
10. Engineering (Student) Fun with eDVS
• High-speed balancing of miniature poles (pencils)
• Mobile robot chain
• Tracking 'unpredictable' motion
• Oculomotor control: saccades and smooth pursuit
Impact: multiple MA and BE theses, project practicals, several conference papers and live demonstrations, best demo and best poster awards.
11. Application Example: Tracking an Active Object
Müller et al., IEEE ROBIO 2011
Target hardware (26 mm x 36 mm): target marker LED, rechargeable battery, microcontroller for the PWM signal, power switch.
Per-event tracking algorithm (a code sketch follows below):
• A 128x128 memory holds the previous event time per pixel; for the pixel of the current event, t_M = 4600 µs.
• A new eDVS event arrives with time t_E = 5612 µs at position P_E = (3, 2).
• The target LED blinks at a known rate, so the expected inter-event interval is Δt_exp = 1000 µs.
• Time deviation: Δt_E = t_E - t_M = 1012 µs, hence t_D = Δt_E - Δt_exp = 12 µs.
• A weighting function G_T maps the time deviation t_D (in µs) to a relative weight w_T.
• Position deviation to the currently tracked position P_T = (7, 3): P_D = Σ (P_E - P_T)² = 17.
• A weighting function G_P maps the position deviation P_D (in pixels) to a relative weight w_P.
• Position update: P_{T+1} = (1 - η(w_T + w_P))·P_T + η(w_T + w_P)·P_E.
[Figure: ON-event counts over pixel x / pixel y showing the tracked LED]
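Here is a compact Python sketch of that per-event update. Assumptions not given on the slide: G_T and G_P are taken to be Gaussian, and the widths SIGMA_T_US, SIGMA_P_PX and the gain ETA are made-up values, not parameters from the paper.

```python
# Sketch of the per-event LED tracker from this slide (Mueller et al.,
# IEEE ROBIO 2011); weighting shapes and parameter values are assumptions.
import numpy as np

RES = 128                # DVS128 resolution
DT_EXP_US = 1000.0       # expected inter-event interval of the blinking LED
SIGMA_T_US = 50.0        # width of temporal weighting G_T (assumption)
SIGMA_P_PX = 5.0         # width of spatial weighting G_P (assumption)
ETA = 0.1                # update gain eta (assumption)

last_event_time = np.zeros((RES, RES))  # t_M: previous event time per pixel
tracked_pos = np.array([64.0, 64.0])    # P_T: currently tracked position


def on_event(x, y, t_us):
    """Update the tracked position P_T from a single incoming event."""
    global tracked_pos
    dt_e = t_us - last_event_time[y, x]            # dt_E = t_E - t_M
    last_event_time[y, x] = t_us
    t_d = dt_e - DT_EXP_US                         # t_D = dt_E - dt_exp
    w_t = np.exp(-0.5 * (t_d / SIGMA_T_US) ** 2)   # G_T -> weight w_T
    p_e = np.array([float(x), float(y)])
    p_d = np.sum((p_e - tracked_pos) ** 2)         # P_D = sum (P_E - P_T)^2
    w_p = np.exp(-0.5 * p_d / SIGMA_P_PX ** 2)     # G_P -> weight w_P
    g = ETA * (w_t + w_p)                          # combined update gain
    tracked_pos = (1.0 - g) * tracked_pos + g * p_e  # P_{T+1}
    return tracked_pos
```

Events that match both the expected blink interval and the current track pull the estimate strongly; unrelated background events get near-zero weight and barely move it.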
14. SLAM for Embedded Event-Based Vision Systems
Event-Based Particle Filter Framework (sketched below)
• Every event is used to incrementally build a map
• Every event is used to incrementally update the quality estimate of each particle
→ computationally efficient framework
• Weikersdorfer, D.; Hoffmann, R.; Conradt, J.: Simultaneous Localization and Mapping for Event-Based Vision Systems. Intl. Conf. on Computer Vision Systems (ICVS), 2013, p. 133-42.
• Weikersdorfer, D.; Conradt, J.: Event-Based Particle Filtering for Robot Self-Localization. IEEE Intl. Conf. on Robotics and Biomimetics (IEEE-ROBIO), 2012, p. 866-70.
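A minimal Python sketch of the per-event particle-filter idea, assuming a placeholder map likelihood; the actual map representation, likelihood, and resampling scheme of Weikersdorfer et al. are not reproduced here.

```python
# Per-event particle weight update (sketch; map details are placeholders).
import numpy as np

N_PARTICLES = 100
poses = np.zeros((N_PARTICLES, 3))   # (x, y, heading) hypothesis per particle
weights = np.ones(N_PARTICLES) / N_PARTICLES
ALPHA = 0.05                         # per-event weight smoothing (assumption)


def map_likelihood(pose, event_xy):
    """Placeholder: how well this event matches the incrementally built map
    as seen from `pose` (the real system looks this up in the map)."""
    return 1.0


def on_event(event_xy):
    """Every single event updates both the map (omitted here) and the
    particle weights; there is no batching of events into frames."""
    global weights
    for i in range(N_PARTICLES):
        lik = map_likelihood(poses[i], event_xy)
        # One cheap multiply-add per particle per event is what makes the
        # framework computationally efficient.
        weights[i] = (1.0 - ALPHA) * weights[i] + ALPHA * lik
    weights /= weights.sum()
```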
15. SLAM for Embedded Event-Based Vision Systems
[Demo video: autonomous exploration run, played at 5x real time]
Hoffmann, R.; Weikersdorfer, D.; Conradt, J.: Autonomous Indoor Exploration with an Event-Based Visual SLAM System. ECMR, 2013, p. 38-43.
16. Autonomous Indoor Mini Quadrocopter
System Specifications
• 5 orthogonally mounted meDVS128 sensors
• On-board real-time vision processing in a µController
• Complete take-off weight ~44 g
• Autonomous flight for ~4 minutes
Miniaturized embedded DVS (meDVS)
• 18 x 18 x 11 mm, 2.3 g including optics
• 128 x 128 pixels @ 60° FOV
• 32-bit ARM µC (216 MHz, 512 KB SRAM)
• ~300 mW (2.8 V @ 108 mA at full load)
Complete on-board autonomous high-speed vision processing for real-time high-speed 3D SLAM
Applications
• First responder in search and rescue scenarios
• Autonomous exploration and mapping of indoor areas
Everding, L.; Conradt, J.: Persistent Real-Time Line Tracking Using Event-Based Dynamic Vision Sensors. Submitted to ICCV.
17. eDVS for Healthcare: SpinOff Prototype
A portable device to help the visually impaired:
• Hands-free
• Intuitive
• Low cost
• Light-weight
• Long battery lifetime
• Ears free for regular hearing
Patent: Assistive device and system to provide 3D spatial perception to the visually impaired by translating event-based visual information into 3D auditory scenes (filed 16.2.2015, published 25.8.2016)
Feedback: substantially increased autonomy, and therefore higher quality of life!
18. eDVS for Healthcare
[Diagram: left eDVS and right eDVS combined into a depth image]
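As a rough illustration of the direction, one point of the event-based depth image could be turned into a spatialized sound cue. The concrete mapping below (stereo pan from azimuth, pitch from elevation, loudness from proximity) is an assumption for the sake of the example, not the method claimed in the patent.

```python
# Illustrative only: map a depth-image point to a stereo audio cue.
import math


def depth_point_to_audio_cue(x, y, depth_m, resolution=128):
    """Turn one point of the event-based depth image into a stereo cue."""
    azimuth = (x / (resolution - 1)) * 2.0 - 1.0  # -1 (left) .. +1 (right)
    elevation = 1.0 - y / (resolution - 1)        # 0 (bottom) .. 1 (top)
    pitch_hz = 300.0 + 600.0 * elevation          # higher object -> higher pitch
    loudness = 1.0 / max(depth_m, 0.3)            # nearer object -> louder
    pan = (azimuth + 1.0) / 2.0                   # constant-power stereo panning
    left_gain = math.cos(pan * math.pi / 2.0) * loudness
    right_gain = math.sin(pan * math.pi / 2.0) * loudness
    return pitch_hz, left_gain, right_gain
```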
19. http://www.nst.ei.tum.de
Real-Time Neuronal Information Processing in Closed-Loop Systems
• Event-Based Neuromorphic Vision
• Information Processing in Distributed Neuronal Circuits
• Neuromorphic Real-Time Computing and Control
• Self-Construction and Organization of Neuronal Circuits