2. CONTENT
1. DATA FUSION METHODS
Bayes Theorem
Kalman Filter Algorithms & Extended Kalman Filter
Sequential Monte Carlo Methods
Structure of a Decentralised Data Fusion Node
2. Application
3. Case Study
3. HISTORY
The history of data fusion is strongly linked to developments in signal
processing, sensor technologies, and the demand for more accurate
and comprehensive data for a range of applications. While the idea of
merging data from several sources has existed historically in various
forms, data fusion techniques have developed and formalized over
time. A summary of significant turning points in the history of data
fusion is provided below.
4. • Early Developments (1950s - 1960s):
1. The origins of data fusion can be traced back to military applications, particularly in the context of radar systems
during the Cold War.
2. Early efforts focused on integrating data from multiple radars to improve target tracking and reduce false alarms.
1. Military Applications (1970s - 1980s):
1. Data fusion gained prominence in military and defense applications for intelligence gathering, surveillance, and
reconnaissance.
2. The integration of data from diverse sources, including radar, sonar, and infrared sensors, became essential for
enhancing situational awareness.
2. NASA's Data Fusion (1980s - 1990s):
1. NASA played a crucial role in advancing data fusion techniques for space exploration.
2. Applications included combining data from different spaceborne sensors for Earth observation, weather monitoring,
and planetary exploration.
3. Information Fusion Centers (1990s):
1. The establishment of information fusion centers in various sectors, including defense, homeland security, and law
enforcement, led to increased research and development in data fusion methodologies.
2. Fusion centers aimed to integrate information from various agencies and sensors to enhance overall threat
assessment and decision-making.
4. Technological Advancements (2000s - Present):
1. Advances in sensor technologies, communication networks, and computing power have significantly influenced the
field of data fusion.
2. Applications expanded beyond defense to include fields such as healthcare, transportation, environmental
monitoring, and smart cities.
5. Bayes Theorem
Bayes' theorem is a result of probability theory that describes the
probability of an event occurring given that another event has
occurred. Mathematically, Bayes' theorem states:

P(A|B) = P(B|A) P(A) / P(B)
6. Levels of Understanding
What is it saying?
- P(A) is the probability of event A occurring on its own, without any conditioning on another event.
- P(B) is the probability of event B occurring on its own, without any conditioning on another event.
- P(B | A) is the probability of event B given that event A has occurred.
• Why is it true?
P(A|B) = P(B|A) P(A) / P(B)
7. The Heart of Bayes Theorem
[Figure: P(hypothesis | evidence) shown as the fraction of all possibilities fitting the evidence that also fit the hypothesis.]
8. Example
You’ve been planning a picnic for your family. You’re trying to decide
whether to postpone due to rain. The chance of rain on any day is
15%. The morning of the picnic, it’s cloudy. The probability of it being
cloudy is 25% and on days where it rains, it’s cloudy in the morning
80% of the time.
Should you postpone the picnic?
Solution to example…
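The solution follows directly from Bayes' theorem with the numbers given above; a minimal check in Python:

```python
p_rain = 0.15               # P(rain) on any given day
p_cloudy = 0.25             # P(cloudy morning)
p_cloudy_given_rain = 0.80  # P(cloudy | rain)

# Bayes' theorem: P(rain | cloudy) = P(cloudy | rain) * P(rain) / P(cloudy)
p_rain_given_cloudy = p_cloudy_given_rain * p_rain / p_cloudy
print(round(p_rain_given_cloudy, 2))  # → 0.48
```

So the cloudy morning raises the chance of rain from the baseline 15% to 48%.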
9. Kalman Filter
When I first heard about the Kalman Filter in a Signal Processing class, I said to myself: "How hard can it be?".
This article is the result of a couple of days' work and reflects the slow
learning curve of a "mathematically challenged" person.
If you're humble enough to admit that you don't understand this stuff
completely, you'll find this material very enlightening.
So, enjoy it!
10. A Quick Insight
• As I mentioned earlier, it's nearly impossible to grasp the full meaning of
the Kalman Filter by starting from definitions and complicated equations (at
least for us mere mortals).
• For most cases, the state matrices drop out and we obtain the below
equation, which is much easier to start with.
11. [Simplified Kalman filter equation]
12. • Step-by-Step Guide
• Here's a simple step-by-step guide for a quick start to Kalman filtering.
• STEP 1 - Build a Model
• It's the most important step. First of all, you must be sure that the Kalman filtering conditions fit your
problem.
• As we remember, the two equations of the Kalman Filter are as follows:

x_k = A x_{k-1} + B u_k + w_{k-1}
z_k = H x_k + v_k

This means that each x_k (our signal value) may be evaluated by using a linear stochastic equation (the first one). Any x_k is a linear
combination of its previous value, a control signal u_k, and a process noise term (which may be hard to conceptualize). Remember
that, most of the time, there is no control signal u_k.
13. • The second equation tells us that any measurement value (whose accuracy we cannot be sure of) is a linear combination of
the signal value and the measurement noise. Both noises are considered to be Gaussian.
• The process noise and measurement noise are statistically independent.
• The entities A, B and H are, in general, matrices. But in most of our signal processing problems, we use models
in which these entities are just numeric values. As an additional simplification, while these values may change between
states, most of the time we can assume that they are constant.
• If we are pretty sure that our system fits this model (most systems do, by the way), the only thing left is to
estimate the mean and standard deviation of the noise terms w_{k-1} and v_k. We know that, in real life, no signal is purely
Gaussian, but we may assume it with some approximation.
• This is not a big problem, because we will see that the Kalman filtering algorithm tries to converge to correct
estimations even if the Gaussian noise parameters are poorly estimated.
• The only thing to keep in mind is: "The better you estimate the noise parameters, the better estimates you get."
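The noise-estimation step above can be sketched in Python: record readings from a sensor observing a known constant, treat the residuals as samples of the measurement noise v_k, and fit their mean and standard deviation (the data here is simulated, and the 0.1 noise level is a hypothetical choice):

```python
import random
import statistics

random.seed(0)

# Hypothetical setup: 1000 readings of a sensor observing a known constant,
# so that (reading - constant) is a sample of the measurement noise v_k.
residuals = [random.gauss(0.0, 0.1) for _ in range(1000)]

mean_v = statistics.fmean(residuals)  # near 0 if the noise is unbiased
std_v = statistics.stdev(residuals)   # estimate of the noise std deviation
R = std_v ** 2                        # measurement noise variance for the filter
```

Even rough values of R obtained this way are enough to start with, since the filter tolerates poorly estimated noise parameters.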
14. • STEP 2 - Start the Process
• If you succeeded in fitting your model into the Kalman Filter framework, the next step is to determine the necessary
parameters and your initial values.
• We have two distinct sets of equations: Time Update (prediction) and Measurement Update
(correction). Both equation sets are applied at each state k.
15. Time Update (prediction):
x^-_k = A x_{k-1} + B u_k
P^-_k = A P_{k-1} A^T + Q

Measurement Update (correction):
K_k = P^-_k H^T (H P^-_k H^T + R)^-1
x_k = x^-_k + K_k (z_k - H x^-_k)
P_k = (I - K_k H) P^-_k
16. • We should start from somewhere, such as k = 0. We should find or assume some initial state. Here, we
throw out some initial values. Let's assume the estimate of X0 = 0, and P0 = 1. Then why didn't we choose
P0 = 0, for example? It's simple: if we chose it that way, it would mean that there is no noise in the
environment, and that assumption would force all the consequent estimates of X at state k to remain zero
(staying at the initial state). So we choose P0 to be something other than zero.
• Let's write the Time Update and Measurement Update equations.
17. The above equation is one of the five Kalman filter equations. It is called the State Update Equation: the new estimate is the predicted estimate, corrected by the Kalman gain times the measurement residual.
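The whole loop can be sketched as a minimal scalar Kalman filter in Python, estimating a hypothetical constant value under the assumptions above (A = H = 1, no control input, X0 = 0, P0 = 1); the noise variances Q and R are illustrative choices, not values from the text:

```python
import random

random.seed(1)

# Scalar case: A = H = 1, no control input u_k.
true_value = 0.5   # hypothetical constant signal we are measuring
Q = 1e-5           # assumed process noise variance
R = 0.01           # assumed measurement noise variance (std dev 0.1)

x_est = 0.0        # initial estimate X0 = 0
P = 1.0            # initial P0 = 1 (nonzero, as explained above)

for _ in range(200):
    z = true_value + random.gauss(0.0, 0.1)  # noisy measurement
    # Time update (prediction): with A = 1 the estimate carries over
    x_pred = x_est
    P_pred = P + Q
    # Measurement update (correction)
    K = P_pred / (P_pred + R)          # Kalman gain
    x_est = x_pred + K * (z - x_pred)  # the State Update Equation
    P = (1 - K) * P_pred

print(x_est)  # converges toward the true value
```

Note how P shrinks from its initial value of 1 as measurements accumulate, which in turn drives the gain K down: the filter trusts its own estimate more and each new noisy measurement less.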
18. Sequential Monte Carlo Methods
• Sequential Monte Carlo (SMC) methods, also known as particle
filters, are a class of probabilistic algorithms used for state
estimation and inference in dynamic systems. These methods
provide a flexible and powerful framework for handling
nonlinear and non-Gaussian state spaces.
• Sequential Monte Carlo methods have become a crucial tool in
Bayesian filtering, providing a flexible and efficient approach for
tracking dynamic systems in a wide range of fields. Ongoing
research continues to address challenges and expand the
applicability of these methods.
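A minimal bootstrap particle filter illustrates the propagate-weight-resample cycle; the scalar random-walk model and all parameters below are hypothetical choices for the sketch, not from the text:

```python
import math
import random

random.seed(0)

N = 500                        # number of particles
proc_std, obs_std = 0.1, 0.2   # assumed process / observation noise levels

def likelihood(z, x):
    # Gaussian observation density p(z | x), up to a constant factor
    return math.exp(-0.5 * ((z - x) / obs_std) ** 2)

# Hypothetical data: a scalar random-walk state observed with noise
true_x, observations = 0.0, []
for _ in range(50):
    true_x += random.gauss(0.0, proc_std)
    observations.append(true_x + random.gauss(0.0, obs_std))

particles = [0.0] * N
for z in observations:
    # 1. Propagate each particle through the process model
    particles = [x + random.gauss(0.0, proc_std) for x in particles]
    # 2. Weight particles by how well they explain the measurement
    weights = [likelihood(z, x) for x in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # 3. Resample a new particle set in proportion to the weights
    particles = random.choices(particles, weights=weights, k=N)

estimate = sum(particles) / N  # posterior mean estimate of the final state
```

Because nothing here requires the model to be linear or the noise Gaussian, the same loop applies where the Kalman filter does not; the price is the computational cost of maintaining many particles.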
19. Application
1. Automotive Industry:
   1. Autonomous Vehicles: Combining data from sensors like cameras, radar,
   and lidar to enable self-driving cars to perceive and navigate their
   environment.
   2. Driver Assistance Systems: Integrating sensor data for features like adaptive
   cruise control, lane-keeping assistance, and collision avoidance.
2. Industrial Automation:
   1. Process Control: Integrating data from sensors to monitor and control
   industrial processes efficiently.
   2. Fault Detection and Diagnostics: Combining information from various
   sensors to identify and diagnose equipment failures in manufacturing plants.
20. Case Study
Suppose that a woman in her forties goes for a mammogram and receives bad news: a “positive”
mammogram. However, since not every positive result is real, what is the probability that she actually has
breast cancer? Given that the fraction of women in their forties who have breast cancer is 0.014 and the
probability that a woman who has breast cancer will get a positive result on a mammogram is 0.75. The
probability that a woman who does not have breast cancer will get a false positive on a mammogram is
0.1. Here, the suitable tool is Bayes' theorem, as we have prior knowledge of all the relevant probabilities.
• The various factors involved here are:
• 1. The fraction of women in their forties who have breast cancer is 0.014, which is about one in seventy.
The fraction who do not have breast cancer is therefore 1 - 0.014 = 0.986. These fractions are known as
the prior probabilities.
• 2. The probability that a woman who has breast cancer will get a positive result on a mammogram is
0.75. The probability that a woman who does not have breast cancer will get a false positive on a
mammogram is 0.1. These are known as the conditional probabilities.
• 3. Applying Bayes’ theorem, we can conclude that, among women who get a positive result, the fraction
who actually have breast cancer is (0.014 x 0.75) / ((0.014 x 0.75) + (0.986 x 0.1)) = 0.1, approximately.
That is, once we have seen the test result, the chance is about ninety per cent that it is a false positive.
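The arithmetic in step 3 can be checked with a few lines of Python:

```python
prior = 0.014        # P(cancer): fraction of women in their forties with breast cancer
sensitivity = 0.75   # P(positive | cancer)
false_pos = 0.10     # P(positive | no cancer)

# Total probability of a positive mammogram (law of total probability)
p_positive = prior * sensitivity + (1 - prior) * false_pos

# Bayes' theorem: P(cancer | positive)
posterior = prior * sensitivity / p_positive
print(round(posterior, 3))  # → 0.096
```

The posterior of roughly 0.1 confirms the conclusion above: about nine times out of ten, the positive result is a false positive.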