Sensor Fusion
Introduction and basics
Why?
• Representation
• Certainty
• Accuracy
• Completeness
Scenarios
• A single sensor cannot generate data at the desired granularity
• A single sensor has high uncertainty in its measurements
• A single sensor's precision is too low
• A sensor cannot directly measure the desired parameter
• A single sensor generates an incomplete picture of the situation
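The "high uncertainty" scenario above is commonly addressed by redundant (competitive) sensing: several identical sensors observe the same quantity and the readings are averaged, shrinking the uncertainty of the estimate by roughly the square root of the number of sensors. A minimal sketch in Python; the readings below are invented for illustration:

```python
import statistics

def fuse_by_averaging(readings):
    """Competitive fusion of redundant sensor readings by averaging.

    Averaging n independent readings, each with standard deviation
    sigma, yields an estimate with uncertainty roughly sigma / sqrt(n).
    """
    return statistics.mean(readings)

# Ten temperature sensors reading the same object (degrees Fahrenheit):
readings = [71.8, 72.3, 70.9, 72.1, 71.5, 72.6, 71.2, 72.0, 71.7, 72.4]
fused = fuse_by_averaging(readings)
```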
Data Fusion Strategies
Boudjemaa and Forbes
Fusion Type
• Fusion across sensors
• Fusion across attributes
• Fusion across domains
• Fusion across time
Durrant-Whyte
Configuration
• Complementary
• Competitive
• Cooperative
Dasarathy (1994)
Input/Output Characteristic
• Data In -> Data Out
• Data In -> Feature Out
• Feature In -> Feature Out
• Feature In -> Decision Out
• Decision In -> Decision Out
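The competitive configuration is often realized as an inverse-variance weighted combination of two estimates of the same quantity, which is also the measurement-update step of a Kalman filter. A minimal sketch (the numbers are illustrative, not from the slides):

```python
def fuse_two_estimates(x1, var1, x2, var2):
    """Inverse-variance weighted fusion of two estimates of the same
    quantity (the measurement-update step of a Kalman filter).

    Returns the fused estimate and its variance, which is always
    smaller than either input variance.
    """
    k = var1 / (var1 + var2)   # gain: trust the lower-variance source more
    x = x1 + k * (x2 - x1)     # fused estimate
    var = (1.0 - k) * var1     # fused (reduced) variance
    return x, var

# Example: a noisy IMU speed estimate fused with a wheel-encoder estimate.
x, var = fuse_two_estimates(4.8, 1.0, 5.2, 0.25)
```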
Common Representation Format
Spatial Alignment
• Converting local spatial data to a common coordinate system
• Image registration
Temporal Alignment
• Converting local observation times to a common time axis
• Dynamic Time Warping
Sensor Value Normalization
• Values and uncertainties are normalized to a common scale
• Binarization, Z-transform, Binning
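Of the normalization methods listed, the Z-transform is the simplest to illustrate: each sensor's values are shifted by their mean and scaled by their standard deviation, so that readings from sensors with different units or ranges become comparable. A minimal sketch:

```python
import statistics

def z_normalize(values):
    """Z-transform: map sensor values to a common scale with mean 0 and
    (sample) standard deviation 1, making heterogeneous sensors comparable."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [(v - mu) / sigma for v in values]

# Example: raw readings on an arbitrary scale.
z = z_normalize([10.0, 12.0, 14.0, 16.0, 18.0])
```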
Thank you
Author: Asit Deva
Date: 10 April 2019


Editor's Notes

  • #3 Representation – A sensor may only be able to take a measurement once every 5 minutes; we use 5 sensors, staggered, to obtain a value once every minute. Certainty – A sensor measures temperature with an uncertainty of +/- 2 °F; we use 10 sensors to measure the temperature at the same time and average the readings. Accuracy – An IMU measures speed but is not accurate on its own; we use a Kalman filter to fuse data from the IMU and a wheel encoder to obtain a more accurate estimate. Completeness – An ammeter gives the current drawn by the motor but not a complete picture; we add a temperature sensor to provide information about the operating environment.
  • #4 Boudjemaa and Forbes – Fusion Type. Fusion across sensors: e.g., 5 temperature sensors measuring the temperature of the same object. Fusion across attributes: different sensors measuring different parameters to deduce a final derived measurement. Fusion across domains: when designing a temperature scale, data from different domains are fused to obtain a scale that fits all domains. Fusion across time: current measurements fused with historical measurements. Durrant-Whyte – Configuration. Complementary: sensors of different types work independently to create a complete picture. Competitive: sensors measure the same property to help reduce uncertainty. Cooperative: sensors work independently to measure different aspects, generating new information that no single sensor could produce on its own. Dasarathy (1994) – Input/Output Characteristic. DaI-DaO (Data In -> Data Out): filtering and smoothing. DaI-FeO (Data In -> Feature Out): feature generation. FeI-FeO (Feature In -> Feature Out): feature fusion. FeI-DeO (Feature In -> Decision Out): decision generation. DeI-DeO (Decision In -> Decision Out): decision fusion.
  • #5 Common Representation Format. Conversion of all sensor observations to a common format is a basic requirement of any multi-sensor data fusion system: only after conversion to a common format are the sensor observations compatible, so that fusion can be performed. Based on the three types of information involved - spatial, temporal and value - the conversion comprises three functions. Spatial Alignment: transformation of the local spatial positions x to a common coordinate system; this involves geo-referencing the location and field-of-view of each sensor. Temporal Alignment: transformation of the local times t to a common time axis, often performed with a dynamic time warping algorithm. Sensor Value Normalization: the sensor values y and their uncertainties Δy are normalized to a common scale, often using robust statistics.
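The dynamic time warping algorithm mentioned for temporal alignment can be sketched as a small dynamic program over two 1-D sequences. This is the textbook formulation, not code from the deck; it returns the total alignment cost, which is 0 when one sequence is just a locally stretched copy of the other:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences.

    Aligns observations from sensors whose time axes are locally
    stretched or compressed relative to each other, by finding the
    minimum-cost monotone alignment between the two sequences.
    """
    n, m = len(a), len(b)
    inf = float("inf")
    # d[i][j] = cost of the best alignment of a[:i] with b[:j]
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the alignment by a match, an insertion, or a deletion
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```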