The document discusses exact inference in Bayesian networks. It begins by stating that the goal is to compute the sum-product expression of the inference formula efficiently. It then lists related topics covered in subsequent units, such as approximate inference algorithms. The document outlines the structure of the related lecture notes, which cover variable elimination, belief propagation, and junction trees for exact inference. It also provides references for further self-study on probabilistic inference in graphical models.
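As a minimal illustration of that sum-product computation, here is a variable-elimination sketch in Python on a three-node chain A → B → C, computing P(C) by eliminating A and then B. All probabilities are invented for illustration; this is not code from the slides.

```python
# P(C) = sum_A sum_B P(A) P(B|A) P(C|B), eliminated one variable at a time.
P_A = {True: 0.6, False: 0.4}
P_B_given_A = {True: {True: 0.7, False: 0.3},    # P(B=b | A=a)
               False: {True: 0.2, False: 0.8}}
P_C_given_B = {True: {True: 0.9, False: 0.1},
               False: {True: 0.5, False: 0.5}}

# Eliminate A: tau1(b) = sum_a P(a) * P(b|a)
tau1 = {b: sum(P_A[a] * P_B_given_A[a][b] for a in (True, False))
        for b in (True, False)}

# Eliminate B: P(c) = sum_b tau1(b) * P(c|b)
P_C = {c: sum(tau1[b] * P_C_given_B[b][c] for b in (True, False))
       for c in (True, False)}

print(P_C)  # a proper distribution over C
```

The key point of variable elimination is visible in `tau1`: the inner sum over A is computed once and reused, instead of re-summing over the full joint for every value of C.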
This document provides an overview of Bayesian networks and probabilistic graphical models (PGMs). It outlines the goals of learning how to build graphical models using graph theory and perform inference under uncertainty using probability theory. It also lists some example PGM models like Markov random fields, hidden Markov models, dynamic Bayesian networks, naive Bayes models, and applications in computer vision. Finally, it provides the table of contents and references for further self-study on PGMs and Bayesian networks.
The document discusses probabilistic inference over time using Bayesian networks. It introduces the concepts of temporal models and the four types of inference in such models: filtering, prediction, smoothing, and most likely explanation. It outlines the goals of learning uncertainty in temporal models and examining hidden Markov models, Kalman filtering, particle filtering, and dynamic Bayesian networks. The document provides an overview of its structure and references related background units on probabilistic graphical models and inference.
This document discusses approximate inference in Bayesian networks using sampling methods. It introduces random number generation, which is important for sampling algorithms. Random number generators in programming languages typically generate uniform random numbers, but different distributions are needed for sampling Bayesian networks. The document covers generating random numbers from univariate and multivariate distributions to estimate probabilities for approximate inference in Bayesian networks.
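A standard way to turn the uniform numbers a language's generator produces into draws from another distribution is inverse-transform sampling. A minimal sketch, assuming an exponential target distribution (the slides themselves may use different target distributions):

```python
# Inverse-transform sampling: if U ~ Uniform[0,1), then F_inv(U) has CDF F.
# For Exponential(lam), F_inv(u) = -ln(1 - u) / lam.
import math
import random

def sample_exponential(lam, rng):
    u = rng.random()                   # uniform in [0, 1)
    return -math.log(1.0 - u) / lam

rng = random.Random(0)                 # fixed seed for reproducibility
samples = [sample_exponential(2.0, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(mean)                            # should be close to 1/lam = 0.5
```

The same principle (transform uniforms through the inverse CDF, or compose univariate draws) underlies sampling from the conditional distributions of a Bayesian network.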
Artificial neural networks are used as learning algorithms in the field of artificial intelligence. These are 5th-semester, BS(CS)-level slides for the Artificial Intelligence course at the Federal Urdu University of Arts, Science and Technology.
This document defines and provides examples of different types of differential equations:
Ordinary differential equations depend on only one independent variable, while partial differential equations depend on two or more independent variables. The order of a differential equation is the highest derivative that occurs in the equation, and the degree is the power of the highest order derivative. Linear differential equations have dependent variables and derivatives of degree one and coefficients that do not depend on the dependent variable, while non-linear equations do not meet these criteria.
1) A differential equation contains an independent variable (x), a dependent variable (y), and the derivative of the dependent variable with respect to the independent variable (dy/dx).
2) The order of a differential equation refers to the highest order derivative present. For example, an equation containing dy/dx would be first order, while one containing d²y/dx² would be second order.
3) The degree of a differential equation refers to the highest power of the highest order derivative. For example, an equation containing (d²y/dx²) would be degree 1, while one containing (d²y/dx²)² would be degree 2.
4) There are several methods for solving first order differential equations.
The document discusses exact and non-exact differential equations. It defines an exact differential equation M dx + N dy = 0 as one where the partial derivatives ∂M/∂y and ∂N/∂x are equal. The solution of an exact equation is obtained by setting ∫M dx + ∫(terms of N not containing x) dy equal to a constant. A non-exact differential equation has unequal partial derivatives and requires an integrating factor to make it exact. Several methods for finding an integrating factor are presented, including the cases where it is a function of x alone or of y alone, and where the equation is homogeneous. Examples are provided to illustrate these concepts.
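The exactness condition ∂M/∂y = ∂N/∂x is easy to check numerically. A small sketch using central finite differences on the illustrative exact equation (2xy) dx + (x²) dy = 0 (the equation and test point are mine, not from the document):

```python
# Check dM/dy == dN/dx for M dx + N dy = 0 via central differences.
def ddy(f, x, y, h=1e-6):
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

def ddx(f, x, y, h=1e-6):
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

M = lambda x, y: 2 * x * y
N = lambda x, y: x ** 2

x0, y0 = 1.3, 0.7
print(ddy(M, x0, y0), ddx(N, x0, y0))   # both should be ~ 2 * x0 = 2.6
```

Here the equation is exact because both partials equal 2x; a non-exact equation would show a gap between the two printed values, signalling that an integrating factor is needed.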
The document is an introduction to ordinary differential equations prepared by Ahmed Haider Ahmed. It defines key terms like differential equation, ordinary differential equation, partial differential equation, order, degree, and particular and general solutions. It then provides methods for solving various types of first order differential equations, including separable, homogeneous, exact, linear, and Bernoulli equations. Specific examples are given to illustrate each method.
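As a quick numerical companion to these analytic methods, here is a sketch that integrates the separable equation dy/dx = xy with Euler's method and compares the result with the exact solution y = exp(x²/2). The equation and step count are illustrative choices, not taken from the document.

```python
import math

def euler(f, x0, y0, x_end, n):
    """Fixed-step Euler integration of y' = f(x, y) from x0 to x_end."""
    h = (x_end - x0) / n
    x, y = x0, y0
    for _ in range(n):
        y += h * f(x, y)
        x += h
    return y

# dy/dx = x * y, y(0) = 1  =>  y(x) = exp(x^2 / 2)
approx = euler(lambda x, y: x * y, 0.0, 1.0, 1.0, 10_000)
exact = math.exp(0.5)
print(approx, exact)
```

With 10,000 steps the Euler estimate agrees with the separable-equation solution to about four decimal places, which is a handy sanity check on any hand-derived solution.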
1. Exact inference in Bayesian networks is NP-hard in the worst case, so approximation techniques are needed for large networks.
2. Major approximation techniques include variational methods such as the mean-field approximation, sampling methods such as Markov chain Monte Carlo (MCMC), and bounded cutset conditioning.
3. Variational methods introduce variational parameters to minimize the distance between the approximate and true distributions. Sampling methods draw random samples to estimate probabilities. Bounded cutset conditioning breaks loops by instantiating subsets of variables.
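The sampling idea in point 3 can be sketched with rejection sampling on a two-node network A → B: draw (a, b) pairs from the prior, keep only those matching the evidence B = true, and estimate P(A = true | B = true) from the kept samples. All numbers are illustrative.

```python
import random

P_A = 0.3                                 # prior P(A = true)
P_B_given_A = {True: 0.8, False: 0.1}     # CPT for B

rng = random.Random(42)
kept, hits = 0, 0
for _ in range(200_000):
    a = rng.random() < P_A                # sample A from its prior
    b = rng.random() < P_B_given_A[a]     # sample B given A
    if b:                                 # keep samples matching evidence
        kept += 1
        hits += a
estimate = hits / kept
# Exact answer by Bayes' rule: 0.3*0.8 / (0.3*0.8 + 0.7*0.1) = 0.24/0.31
print(estimate)
```

Rejection sampling wastes the samples that miss the evidence; likelihood weighting and MCMC are refinements that avoid that waste on larger networks.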
This document introduces Bayesian networks and uncertainty inference with discrete variables. It discusses the goal of reviewing advanced statistical concepts like statistical inference and pattern recognition. The contents cover topics like acting under uncertainty, basic probability, marginal probability, inference using full joint distributions, independence, and Bayes' rule. Self-study materials on related topics are also referenced.
This document provides an overview of statistics concepts for image processing and pattern recognition. It reviews key statistical measures including histograms, measures of central tendency (mean, median, mode), variance, frequency distributions, covariance, correlation, and charts/graphs. The goal is to review basic statistics concepts that will be useful for subsequent units on uncertainty inference. Key concepts covered include histograms, probability density functions, measures of central tendency, variance as a measure of dispersion, and expected values.
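The measures listed above can be computed directly with Python's standard statistics module; a small sketch on made-up pixel-intensity data:

```python
import statistics

# Illustrative grayscale pixel intensities from a small image patch.
pixels = [12, 15, 15, 18, 20, 22, 22, 22, 30]

print(statistics.mean(pixels))       # arithmetic mean (central tendency)
print(statistics.median(pixels))     # middle value of the sorted data
print(statistics.mode(pixels))       # most frequent value
print(statistics.pvariance(pixels))  # population variance (dispersion)
```

A histogram of such intensities, normalized to sum to one, is exactly the empirical probability density function the unit connects these measures to.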
This document describes a unit on uncertainty inference using continuous distributions. It covers Bayesian networks and Gaussian distributions, including univariate, bivariate, and multivariate Gaussian distributions. The key concepts covered are the Gaussian distribution parameters of mean and covariance matrix, properties of Gaussian distributions like axis-aligned and spherical Gaussians, and applications like using Gaussians for noise modeling in images. Self-study references on statistics and artificial intelligence are also provided.
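The univariate case the unit starts from fits in a few lines; a sketch of the Gaussian density N(x; μ, σ²), with an illustrative evaluation at the peak:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Univariate normal density with mean mu and standard deviation sigma."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

# Peak of the standard normal is 1/sqrt(2*pi) ~ 0.3989.
print(gaussian_pdf(0.0, 0.0, 1.0))
```

The multivariate generalization replaces σ² with a covariance matrix Σ; the axis-aligned and spherical Gaussians mentioned above correspond to diagonal and scalar-multiple-of-identity Σ, respectively.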
This document provides an introduction to probability and probability distributions for Bayesian networks. It begins with a review of basic probability concepts like events, axioms of probability, and theorems derived from the axioms. It then discusses random variables, including discrete, continuous, and random vector variables. Examples of random variables in image processing and computer vision are provided. The document concludes with an overview of probability distributions as a set of probabilities assigned to a random variable or vector.
This document contains slides from a lecture on Bayes net structure learning given by Andrew W. Moore. The slides introduce Bayes net structure learning as an additional machine learning method. They cover scoring Bayes net structures based on a Bayesian Information Criterion, and searching over possible structures to find the one with the best score. The purpose is to teach students about learning the structure of Bayesian networks from data.
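The scoring idea can be sketched as log-likelihood minus a complexity penalty of (k/2)·ln N, where k is the number of free parameters and N the number of samples. The structures, likelihoods, and parameter counts below are invented for illustration; this is a generic BIC-style score, not Moore's exact formulation.

```python
import math

def bic_score(log_likelihood, num_params, num_samples):
    """BIC-style score: higher is better; complexity is penalized."""
    return log_likelihood - 0.5 * num_params * math.log(num_samples)

# A denser structure fits slightly better but pays a larger penalty
# (hypothetical numbers for two candidate structures on 1000 samples):
sparse = bic_score(log_likelihood=-1210.0, num_params=5, num_samples=1000)
dense = bic_score(log_likelihood=-1205.0, num_params=12, num_samples=1000)
print(sparse, dense)
```

Structure search then amounts to proposing edge additions, deletions, and reversals, and keeping whichever candidate scores highest.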
Bayesian Networks - A Brief Introduction, by Adnan Masood
- A Bayesian network is a graphical model that depicts probabilistic relationships among variables. It represents a joint probability distribution over variables in a directed acyclic graph with conditional probability tables.
- A Bayesian network consists of a directed acyclic graph whose nodes represent variables and edges represent probabilistic dependencies, along with conditional probability distributions that quantify the relationships.
- Inference using a Bayesian network allows computing probabilities like P(X|evidence) by taking into account the graph structure and probability tables.
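Computing P(X | evidence) from the graph and its tables can be illustrated by enumerating the full joint of a tiny Rain → WetGrass network and applying Bayes' rule. The CPT numbers are illustrative, not from the slides.

```python
# Two-node network: Rain -> WetGrass.
P_rain = 0.2
P_wet_given_rain = {True: 0.9, False: 0.3}

# Full joint: P(rain, wet) = P(rain) * P(wet | rain).
joint = {}
for rain in (True, False):
    for wet in (True, False):
        p_r = P_rain if rain else 1 - P_rain
        p_w = P_wet_given_rain[rain] if wet else 1 - P_wet_given_rain[rain]
        joint[(rain, wet)] = p_r * p_w

# Condition on the evidence WetGrass = True.
p_wet = joint[(True, True)] + joint[(False, True)]
p_rain_given_wet = joint[(True, True)] / p_wet
print(p_rain_given_wet)   # 0.18 / (0.18 + 0.24)
```

Enumerating the joint is exponential in the number of variables, which is exactly why the exact-inference units develop variable elimination and junction trees.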
The document discusses using Bayesian methods for image classification. It describes applying a k-nearest neighbors (KNN) algorithm to classify images into categories like motorcycles, bicycles, humans, and cars. A probabilistic version of KNN is developed that assigns probabilities of category membership based on neighboring classified examples. Bayesian analysis is then used to estimate hyperparameters of the model and make predictions, implemented via MCMC sampling.
This document presents a new high-dimensional neural network approach for constructing accurate and efficient interatomic potentials. The approach expands the total energy in terms of environment-dependent atom pairs. It discusses the advantages of using neural networks, which have a flexible functional form and do not require knowledge of the underlying functional form. The approach represents the total energy as a sum of atomic pair energies and uses symmetry functions as inputs to neural network subnets for each pair. It achieves accurate reproduction of total energies, forces and stresses for training structures.
Computer vision and pattern recognition algorithms are important for IoT applications like smart homes and healthcare that involve large camera networks. Academic expertise is needed for accuracy and efficiency, while industrial concerns focus on system integration, configuration and management. The presentation describes a large-scale video surveillance system using heterogeneous information fusion and visualization across a university campus. It also discusses implementing system self-awareness through fault, environment and context awareness, and presents methods for real-time camera anomaly detection.
This document discusses parallelizing computer vision algorithms using GPGPU computing. It begins with an introduction to multicore computing and GPUs. It explains that as CPU clock speeds can no longer increase due to power constraints, the industry has shifted to multicore CPUs and GPUs to continue improving performance. Computer vision algorithms are well-suited to parallelization on GPUs due to their massive data processing needs. The document reviews GPU architectures from Nvidia, Qualcomm, AMD, and ARM that can be used to accelerate computer vision. It also discusses parallel programming frameworks for GPUs like CUDA, OpenCL, and OpenACC.
The document discusses embedded computer vision and presents examples of embedded computer vision systems developed by Wang, Yuan-Kai and his team. It describes research in embedded computer vision using CPUs, DSPs and FPGAs. It also outlines challenges in embedded computer vision and provides examples of projects including an entertainment robot, vision sensor network, video surveillance system, and wearable camera.
This document discusses parallel computing with GPUs and CUDA. It begins by explaining that the multicore era requires parallel computing approaches. It then provides an overview of GPU architecture and programming with CUDA. Specific examples of using GPUs for image restoration, feature extraction, and video processing are mentioned.
The document is a presentation about monocular human pose estimation using Bayesian networks. It includes:
- An outline with sections on introduction, approach overview, model learning, pose estimation, feature extraction, experiments and conclusions.
- Discussion of applications of human motion capture such as animation, games, medical diagnosis and visual surveillance.
- Comparison of different sensor approaches for human pose estimation including active markers, passive markers and markerless methods using cameras.
- Description of the proposed approach which uses Bayesian networks to represent the articulated human body and estimate 2D and 3D joint positions through representation, learning and inference steps.
My slides for an academia talk about embedded vision, given in 2010. Some of our research results are also presented in this presentation.
A few slides contain Chinese characters.
This is a presentation for an academia talk about cloud computing for intelligent video surveillance, i.e. VSaaS, given in 2010. Some of our research results are also presented in this presentation.
This document discusses intelligent video surveillance and sousveillance. It covers topics such as video surveillance market trends, important crime cases solved using CCTV footage, and technology used in intelligent video surveillance systems. Computer vision algorithms are used to add intelligence to video surveillance, going beyond just monitoring to visual surveillance. The document also presents examples of intelligent surveillance applications and research from universities and companies.
This document provides an overview of wound healing, its functions, stages, mechanisms, factors affecting it, and complications.
A wound is a break in the integrity of the skin or tissues, which may be associated with disruption of the structure and function.
Healing is the body’s response to injury in an attempt to restore normal structure and functions.
Healing can occur in two ways: Regeneration and Repair
There are 4 phases of wound healing: hemostasis, inflammation, proliferation, and remodeling. This document also describes the mechanism of wound healing. Factors that affect healing include infection, uncontrolled diabetes, poor nutrition, age, anemia, the presence of foreign bodies, etc.
Complications of wound healing include infection, hyperpigmentation of the scar, contractures, and keloid formation.
Main Java [All of the Base Concepts].docx, by adhitya5119
This is part 1 of my Java learning journey. It contains custom methods, classes, constructors, packages, multithreading, try-catch blocks, finally blocks, and more.
Bangladesh Economic Review 2024 [Bangladesh Economic Review 2024 Bangla.pdf]: a complete Bangla e-book/PDF, with versions for computer, tablet, and smartphone, including a table of contents, a bookmark menu, and a hyperlink menu.
A very important book for all of us: it is an essential subject for the BCS, bank, and university admission exams and for any competitive examination, and it also contains the most recent data and statistics on Bangladesh.
As a citizen, you should know this information.
It is useful for the written BCS and bank exams, and will also be of great help to secondary and higher-secondary students.
Reimagining Your Library Space: How to Increase the Vibes in Your Library No ..., by Diana Rendina
Librarians are leading the way in creating future-ready citizens – now we need to update our spaces to match. In this session, attendees will get inspiration for transforming their library spaces. You’ll learn how to survey students and patrons, create a focus group, and use design thinking to brainstorm ideas for your space. We’ll discuss budget friendly ways to change your space as well as how to find funding. No matter where you’re at, you’ll find ideas for reimagining your space in this session.
This presentation covers the basics of PCOS, its pathology and treatment, as well as the Ayurvedic correlation of PCOS and the Ayurvedic line of treatment described in the classics.
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP, by RAHUL
This dissertation explores the particular circumstances of Mirzapur, a region located in the core of India. Mirzapur, with its varied terrains and abundant biodiversity, offers an optimal environment for investigating changes in vegetation cover dynamics. Our study utilizes advanced technologies such as GIS (Geographic Information Systems) and remote sensing to analyze the transformations that have taken place over the course of a decade.
The complex relationship between human activities and the environment has been the focus of extensive research and concern. As the global community grapples with swift urbanization, population expansion, and economic progress, the effects on natural ecosystems are becoming more evident. A crucial element of this impact is the alteration of vegetation cover, which plays a significant role in maintaining the ecological equilibrium of our planet.
Land serves as the foundation for all human activities and provides the necessary materials for these activities. As the most crucial natural resource, its utilization by humans results in different 'land uses,' which are determined by both human activities and the physical characteristics of the land.
The utilization of land is impacted by human needs and environmental factors. In countries like India, rapid population growth and the emphasis on extensive resource exploitation can lead to significant land degradation, adversely affecting the region's land cover.
Therefore, human intervention has significantly influenced land use patterns over many centuries, evolving its structure over time and space. In the present era, these changes have accelerated due to factors such as agriculture and urbanization. Information regarding land use and cover is essential for various planning and management tasks related to the Earth's surface, providing crucial environmental data for scientific, resource-management, and policy purposes, and for diverse human activities.
Accurate understanding of land use and cover is imperative for the development planning of any area. Consequently, a wide range of professionals, including earth system scientists, land and water managers, and urban planners, are interested in obtaining data on land use and cover changes, conversion trends, and other related patterns. The spatial dimensions of land use and cover support policymakers and scientists in making well-informed decisions, as alterations in these patterns indicate shifts in economic and social conditions. Monitoring such changes with the help of advanced technologies like remote sensing and Geographic Information Systems is crucial for coordinated efforts across different administrative levels.
Changes in vegetation cover refer to variations in the distribution, composition, and overall structure of plant communities across different temporal and spatial scales. These changes can occur naturally.
Strategies for Effective Upskilling is a presentation by Chinwendu Peace in a Your Skill Boost Masterclass organisation by the Excellence Foundation for South Sudan on 08th and 09th June 2024 from 1 PM to 3 PM on each day.
This slide is special for master students (MIBS & MIFB) in UUM. Also useful for readers who are interested in the topic of contemporary Islamic banking.
How to Setup Warehouse & Location in Odoo 17 InventoryCeline George
In this slide, we'll explore how to set up warehouses and locations in Odoo 17 Inventory. This will help us manage our stock effectively, track inventory levels, and streamline warehouse operations.
Chapter wise All Notes of First year Basic Civil Engineering.pptxDenish Jangid
Chapter wise All Notes of First year Basic Civil Engineering
Syllabus
Chapter-1
Introduction to objective, scope and outcome the subject
Chapter 2
Introduction: Scope and Specialization of Civil Engineering, Role of civil Engineer in Society, Impact of infrastructural development on economy of country.
Chapter 3
Surveying: Object Principles & Types of Surveying; Site Plans, Plans & Maps; Scales & Unit of different Measurements.
Linear Measurements: Instruments used. Linear Measurement by Tape, Ranging out Survey Lines and overcoming Obstructions; Measurements on sloping ground; Tape corrections, conventional symbols. Angular Measurements: Instruments used; Introduction to Compass Surveying, Bearings and Longitude & Latitude of a Line, Introduction to total station.
Levelling: Instrument used Object of levelling, Methods of levelling in brief, and Contour maps.
Chapter 4
Buildings: Selection of site for Buildings, Layout of Building Plan, Types of buildings, Plinth area, carpet area, floor space index, Introduction to building byelaws, concept of sun light & ventilation. Components of Buildings & their functions, Basic concept of R.C.C., Introduction to types of foundation
Chapter 5
Transportation: Introduction to Transportation Engineering; Traffic and Road Safety: Types and Characteristics of Various Modes of Transportation; Various Road Traffic Signs, Causes of Accidents and Road Safety Measures.
Chapter 6
Environmental Engineering: Environmental Pollution, Environmental Acts and Regulations, Functional Concepts of Ecology, Basics of Species, Biodiversity, Ecosystem, Hydrological Cycle; Chemical Cycles: Carbon, Nitrogen & Phosphorus; Energy Flow in Ecosystems.
Water Pollution: Water Quality standards, Introduction to Treatment & Disposal of Waste Water. Reuse and Saving of Water, Rain Water Harvesting. Solid Waste Management: Classification of Solid Waste, Collection, Transportation and Disposal of Solid. Recycling of Solid Waste: Energy Recovery, Sanitary Landfill, On-Site Sanitation. Air & Noise Pollution: Primary and Secondary air pollutants, Harmful effects of Air Pollution, Control of Air Pollution. . Noise Pollution Harmful Effects of noise pollution, control of noise pollution, Global warming & Climate Change, Ozone depletion, Greenhouse effect
Text Books:
1. Palancharmy, Basic Civil Engineering, McGraw Hill publishers.
2. Satheesh Gopi, Basic Civil Engineering, Pearson Publishers.
3. Ketki Rangwala Dalal, Essentials of Civil Engineering, Charotar Publishing House.
4. BCP, Surveying volume 1
हिंदी वर्णमाला पीपीटी, hindi alphabet PPT presentation, hindi varnamala PPT, Hindi Varnamala pdf, हिंदी स्वर, हिंदी व्यंजन, sikhiye hindi varnmala, dr. mulla adam ali, hindi language and literature, hindi alphabet with drawing, hindi alphabet pdf, hindi varnamala for childrens, hindi language, hindi varnamala practice for kids, https://www.drmullaadamali.com
How to Add Chatter in the odoo 17 ERP ModuleCeline George
In Odoo, the chatter is like a chat tool that helps you work together on records. You can leave notes and track things, making it easier to talk with your team and partners. Inside chatter, all communication history, activity, and changes will be displayed.
1. Bayesian Networks
Unit 6 Exact Inference
in Bayesian Networks
Wang, Yuan-Kai, 王元凱
ykwang@mails.fju.edu.tw
http://www.ykwang.tw
Department of Electrical Engineering, Fu Jen Univ.
輔仁大學電機工程系
2006~2011
Reference this document as:
Wang, Yuan-Kai, “Exact Inference in Bayesian Networks,”
Lecture Notes of Wang, Yuan-Kai, Fu Jen University, Taiwan, 2011.
Fu Jen University Department of Electrical Engineering Wang, Yuan-Kai Copyright
2. Bayesian Networks Unit - Exact Inference in BN p. 2
Goal of This Unit
• Learn to efficiently compute the sum-product of the inference formula
  P(X | E=e) ∝ Σ_{h∈H} Π_{i=1~n} P(Xi | Pa(Xi))
– Remember: enumeration and multiplication of all P(Xi|Pa(Xi)) are not efficient
– We will learn three other methods for exact inference
Related Units
• Background
– Probabilistic graphical model
• Next units
– Approximate inference algorithms
– Probabilistic inference over time
Self-Study References
• Chapter 14, Artificial Intelligence: A Modern Approach, 2nd ed., S. Russell & P. Norvig, Prentice Hall, 2003.
• The generalized distributive law, S. M. Aji and R. J. McEliece, IEEE Trans. on Information Theory, vol. 46, no. 2, 2000.
• Inference in Bayesian networks, B. D’Ambrosio, AI Magazine, 1999.
• Probabilistic inference in graphical models, M. I. Jordan & Y. Weiss.
Structure of Related Lecture Notes
Problem → PGM Representation: Unit 5: BN; Unit 9: Hybrid BN; Units 10~15: Naïve Bayes, MRF, HMM, DBN, Kalman filter
Data → Structure Learning and Parameter Learning: Units 16~: MLE, EM
Query → Inference: Unit 6: Exact inference; Unit 7: Approximate inference; Unit 8: Temporal inference
[Diagram: the burglary network (B, E → A → J, M) with CPTs P(B), P(E), P(A|B,E), P(J|A), P(M|A) as the running example]
Contents
1. Basics of Graph ………… 11
2. Sum-Product and Generalized Distributive Law ………… 20
3. Variable Elimination ………… 29
4. Belief Propagation ………… 96
5. Junction Tree ………… 157
6. Summary ………… 212
7. Implementation ………… 214
8. Reference ………… 215
Four Steps of Inference P(X|e)
• Step 1: Bayes theorem
  P(X | E=e) = P(X, E=e) / P(E=e) ∝ P(X, E=e)
• Step 2: Marginalization
  P(X, E=e) = Σ_{h∈H} P(X, E=e, H=h)
• Step 3: Conditional independence
  = Σ_{h∈H} Π_{i=1~n} P(Xi | Pa(Xi))
• Step 4: Sum-Product computation
  – Exact inference
  – Approximate inference
Five Types of Queries in Inference
• For a probabilistic graphical model G
• Given a set of evidence E = e
• Query the PGM with
  – P(e): Likelihood query
  – arg max P(e): Maximum likelihood query
  – P(X|e): Posterior belief query
  – arg max_x P(X=x|e): Maximum a posteriori (MAP) query (single query variable)
  – arg max_{x1…xk} P(X1=x1, …, Xk=xk|e): Most probable explanation (MPE) query
Brute Force Enumeration
• We can compute P(X|e) by enumerating the full joint distribution in O(K^N) time, where K = |Xi|
[Figure: the burglary network B, E → A → J, M]
• By using a BN, we can represent the joint distribution in O(N) space
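The O(K^N) cost of enumeration is easy to see in code. Below is a minimal brute-force sketch for the burglary network, assuming the standard textbook (Russell & Norvig) CPT values for this network; the function names are illustrative:

```python
from itertools import product

# CPTs of the burglary network (standard textbook values).
P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(A=T | B, E)
P_J = {True: 0.90, False: 0.05}                      # P(J=T | A)
P_M = {True: 0.70, False: 0.01}                      # P(M=T | A)

def joint(b, e, a, j, m):
    """Full joint probability via the BN's chain-rule factorization."""
    p = P_B[b] * P_E[e]
    p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p *= P_J[a] if j else 1 - P_J[a]
    p *= P_M[a] if m else 1 - P_M[a]
    return p

def query_burglary(j, m):
    """P(B=T | J=j, M=m) by brute-force enumeration over all hidden values."""
    num = sum(joint(True, e, a, j, m)
              for e, a in product((True, False), repeat=2))
    den = num + sum(joint(False, e, a, j, m)
                    for e, a in product((True, False), repeat=2))
    return num / den
```

For example, `query_burglary(True, True)` evaluates P(B=t | j, m) ≈ 0.284 by summing every completion of the hidden variables E and A for each value of B.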
Expression Tree of Enumeration: Repeated Computations
• P(b|j,m) = α Σ_E Σ_A P(b)P(E)P(A|b,E)P(j|A)P(m|A)
[Figure: the evaluation tree branches on E=e/¬e and then A=a/¬a; the subexpressions P(j|a)P(m|a) and P(j|¬a)P(m|¬a) are recomputed in every branch]
1. Basics of Graph
• Polytree
• Multiply connected networks
• Clique
• Markov network
• Chordal graph
• Induced width
Two Kinds of PGMs
• There are two kinds of
probabilistic graphical models
(PGMs)
– Singly connected network
• Polytree
– Multiply connected network
Singly Connected Networks (Polytree)
• Any two nodes are connected by at most one undirected path
• Theorem: inference in a polytree is linear in the size of the network
  – This assumes tabular CPT representation
[Figures: the burglary network (Burglary, Earthquake → Alarm → JohnCalls, MaryCalls) and a polytree over nodes A–H]
Multiply Connected Networks
• At least two nodes are connected by more than one undirected path
[Figure: Cloudy → Sprinkler, Rain → WetGrass]
Clique (1/2)
• A clique is a subgraph of an undirected graph that is complete and maximal
  – Complete: fully connected; every node connects to every other node
  – Maximal: not contained in any larger complete subgraph
Clique (2/2)
• Identify cliques
[Figure: an undirected graph over A–H whose cliques are ABD, ADE, ACE, DEF, CEG, EGH]
Markov Network (1/2)
• An undirected graph with
  – Hyper-nodes (multi-vertex nodes)
  – Hyper-edges (multi-vertex edges)
[Figure: a clique graph with hyper-nodes ABD, ADE, ACE, DEF, CEG, EGH]
Markov Network (2/2)
• Every hyper-edge e = (x1…xk) has a potential function f_e(x1…xk)
• The probability distribution is
  P(X1, …, Xn) = Z Π_{e∈E} f_e(x_{e1}, …, x_{ek})
  Z = 1 / Σ_{x1} … Σ_{xn} Π_{e∈E} f_e(x_{e1}, …, x_{ek})
[Example: for the hyper-edge between EGH and CEG, P(EGH, CEG) = Z Π_{e∈E} f_e(E, G, H, C)]
Chordal Graphs
• Elimination ordering ↔ undirected chordal graph
[Figure: the Asia network (V, S, T, L, A, B, X, D) and its chordal, triangulated version]
• Maximal cliques are factors in elimination
• Factors in elimination are cliques in the graph
• Complexity is exponential in the size of the largest clique in the graph
2. Sum-Product and
Generalized Distributive Law
P(X | E=e) ∝ Σ_{h∈H} Π_{i=1~n} P(Xi | Pa(Xi))
We obtain the formula because of two rules in probability theory:
  Sum Rule: P(x) = Σ_y P(x, y)
  Product Rule: P(x, y) = P(x|y) P(y)
The Sum-Product with
Generalized Distributive Law
P(X | E=e) ∝ Σ_{h∈H} Π_{i=1~n} P(Xi | Pa(Xi))
= Σ_{Xk} … Σ_{X1} Π_{i=1~k} P(Xi | Pa(Xi))
= Σ_{Xk} … Σ_{X1} P(X1 | Pa(X1)) ⋯ P(Xk | Pa(Xk))
= Σ_{Xk} P(Xk | Pa(Xk)) Π_t P(Xt | Xk, …) ⋯ Σ_{X1} P(X1 | Pa(X1)) Π_u P(Xu | X1, …)
Distributive Law for Sum-Product
(1/3)
• ax1 + ax2 = a(x1 + x2), so Σ_i a·xi = a·Σ_i xi
• Σ_i Σ_j xi·xj = (Σ_i xi)(Σ_j xj)
• Σ_i P(xi | xh) = Σ_i P(xi, xh) / P(xh) = P(xh) / P(xh) = 1
  – Variable i is eliminated
• Σ_i Σ_j P(xi) P(xj) = (Σ_i P(xi))(Σ_j P(xj))
• Σ_i Σ_j P(xi | xh) P(xj | xk) = (Σ_i P(xi | xh))(Σ_j P(xj | xk)) = f1(xh)·f2(xk)
Distributive Law for Sum-Product
(2/3)
• Σ_i Σ_j P(xi | xh) P(xj | xk) = (Σ_i P(xi | xh))(Σ_j P(xj | xk)) = f1(xh)·f2(xk)
• Σ_i Σ_j P(xi | xk) P(xj | xi) = Σ_i P(xi | xk)(Σ_j P(xj | xi)) = Σ_i P(xi | xk)·f(xi) = f(xk)
• P(b | j, m) = α Σ_e Σ_a P(b) P(e) P(a|b,e) P(j|a) P(m|a)
  = α P(b) Σ_e P(e) Σ_a P(a|b,e) P(j|a) P(m|a)
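This identity is exactly what makes variable elimination cheaper than enumeration: the naive double sum costs |x|·|y| multiplications, the factored form only one. A quick numeric check with illustrative values:

```python
# Illustrative numbers; any values work.
xs = [0.2, 0.5, 0.3]
ys = [0.1, 0.4, 0.5]

# Naive double sum: |x|*|y| multiplications.
naive = sum(x * y for x in xs for y in ys)

# Factored form (distributive law): one multiplication after two cheap sums.
factored = sum(xs) * sum(ys)

assert abs(naive - factored) < 1e-12
```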
Distributive Law for Sum-Product
(3/3)
ab + ac = a(b+c)
Distributive Law for Max-Product
• max(a·x1, a·x2) = a·max(x1, x2), so max_i a·xi = a·max_i xi
• max_i max_j xi·xj = (max_i xi)(max_j xj)
• max_i max_j P(xi) P(xj) = (max_i P(xi))(max_j P(xj))
  max_i max_j P(xi | xk) P(xj | xk) = (max_i P(xi | xk))(max_j P(xj | xk))
• arg max_i P(xi)
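The same factoring works in the max-product semiring, provided the common factors are nonnegative, which always holds for probabilities. A quick check with illustrative values:

```python
# Illustrative nonnegative numbers (probabilities are always >= 0,
# which is what lets max distribute over the product).
ps = [0.2, 0.5, 0.3]
qs = [0.1, 0.4, 0.5]

naive = max(p * q for p in ps for q in qs)   # max over all |p|*|q| products
factored = max(ps) * max(qs)                 # product of the two maxima

assert abs(naive - factored) < 1e-12
```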
Generalized Distributive Law (1/2)
Aji and McEliece,
2000
Generalized Distributive Law (2/2)
Aji and McEliece,
2000
• a + 0 = 0 + a = a
• a * 1 = 1 * a = a
• a*b + a*c = a*(b+c)
• max(a, 0) = max(0, a) = a (for a ≥ 0)
• max(a*b, a*c) = a*max(b, c)
Marginal to MAP : MAX Product
• Likelihood & posterior queries use the sum-product form; maximum likelihood & MAP queries replace the sums with max (max-product)
[Figure: a small network over x1…x5]
3. Variable Elimination
• Variable elimination improves the
enumeration algorithm by
– Eliminating repeated calculations
• Carry out summations right-to-left
–Bottom-up in the evaluation tree
• Storing intermediate results (factors) to
avoid re-computation
– Dropping irrelevant variables
Basic Idea
• Write the query in the form
  P(Xn, e) = Σ_{xk} … Σ_{x3} Σ_{x2} Π_i P(xi | pa_i)
• Iteratively
  – Move all irrelevant terms (constants) outside the innermost summation:
    Σ_i ai·b·c = b·c·(Σ_i ai)
  – Perform the innermost sum, getting a new term: a factor
  – Insert the new term into the product
An Example without Evidence (1/2)
[The sprinkler network: Cloudy → Sprinkler, Rain → WetGrass]
  P(C=T) = 0.5
  P(R=T|C): C=T 0.8, C=F 0.2
  P(S=T|C): C=T 0.1, C=F 0.5
  P(W=T|S,R): S=T,R=T 0.99; S=T,R=F 0.90; S=F,R=T 0.90; S=F,R=F 0.00
P(w) = Σ_{r,s,c} P(w|r,s) P(r|c) P(s|c) P(c)
     = Σ_{r,s} P(w|r,s) Σ_c P(r|c) P(s|c) P(c)
     = Σ_{r,s} P(w|r,s) f1(r,s)   ← factor f1(r,s)
An Example without Evidence (2/2)
Enumerate C for each (R, S) to form the products P(R|C) P(S|C) P(C) over all eight rows (R, S, C ∈ {T, F}), then sum out C:
  f1(R, S) = Σ_c P(R|c) P(S|c) P(c)
  f1(T,T) = 0.09, f1(T,F) = 0.41, f1(F,T) = 0.21, f1(F,F) = 0.29
A factor f1(r,s) may be
• A function
• A value
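With the CPTs of the sprinkler example, the factor f1 and the query P(w) can be computed directly; a minimal sketch (the dictionary representation is illustrative):

```python
# CPTs of the sprinkler network.
P_C = {True: 0.5, False: 0.5}
P_R = {True: 0.8, False: 0.2}        # P(R=T | C)
P_S = {True: 0.1, False: 0.5}        # P(S=T | C)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.00}   # P(W=T | S, R)

def f1(r, s):
    """f1(r, s) = sum_c P(r|c) P(s|c) P(c) -- C summed out once, up front."""
    total = 0.0
    for c in (True, False):
        pr = P_R[c] if r else 1 - P_R[c]
        ps = P_S[c] if s else 1 - P_S[c]
        total += pr * ps * P_C[c]
    return total

# P(w) = sum_{r,s} P(w|r,s) f1(r,s)
p_w = sum(P_W[(s, r)] * f1(r, s)
          for r in (True, False) for s in (True, False))
```

This gives f1(T,T) = 0.09, f1(T,F) = 0.41, f1(F,T) = 0.21, f1(F,F) = 0.29, and P(w) = 0.6471.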
An Example with Evidence (1/2)
Factors
An Example with Evidence (2/2)
[The burglary network (Burglary, Earthquake → Alarm → JohnCalls, MaryCalls) with its CPTs P(B), P(E), P(A|B,E), P(J|A), P(M|A)]
With evidence J = t, M = t, the evidence CPT columns become factors:
• fM(a) = <0.70, 0.01>
• fJ(a) = <0.90, 0.05>
• fA(a,b,e)
• fĀJM(b,e) = Σ_a fA(a,b,e)·fJ(a)·fM(a)
[Table: for each (b, e), the products fM(a)·fJ(a)·fA(a,b,e) are formed pointwise and then summed over a]
Basic Operations
• Summing out a variable from a
product of factors
– Move any irrelevant terms (constants)
outside the innermost summation
– Add up submatrices in pointwise
product of remaining factors
Variable Elimination Algorithm
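The algorithm can be sketched with exactly the two basic operations above, pointwise product and summing out. The following is a minimal illustrative implementation over boolean table factors, not the slide's original pseudocode:

```python
from itertools import product

class Factor:
    """A table factor over named boolean variables."""
    def __init__(self, names, table):
        self.vars = tuple(names)    # ordered variable names
        self.table = dict(table)    # assignment tuple -> value

    def __mul__(self, other):
        """Pointwise product, joining on shared variables."""
        names = self.vars + tuple(v for v in other.vars if v not in self.vars)
        table = {}
        for assign in product((True, False), repeat=len(names)):
            env = dict(zip(names, assign))
            table[assign] = (self.table[tuple(env[v] for v in self.vars)]
                             * other.table[tuple(env[v] for v in other.vars)])
        return Factor(names, table)

    def sum_out(self, var):
        """Marginalize var out of this factor."""
        i = self.vars.index(var)
        names = self.vars[:i] + self.vars[i + 1:]
        table = {}
        for assign, val in self.table.items():
            key = assign[:i] + assign[i + 1:]
            table[key] = table.get(key, 0.0) + val
        return Factor(names, table)

def variable_elimination(factors, order):
    """Sum out the variables in 'order'; return the product of what remains."""
    factors = list(factors)
    for var in order:
        related = [f for f in factors if var in f.vars]
        factors = [f for f in factors if var not in f.vars]
        prod = related[0]
        for f in related[1:]:
            prod = prod * f
        factors.append(prod.sum_out(var))
    result = factors[0]
    for f in factors[1:]:
        result = result * f
    return result
```

Running it on the sprinkler example with order C, S, R reproduces P(w) = 0.6471.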
Irrelevant Variables (1/2)
• Consider the query P(JohnCalls | Burglary = true)
  – P(J|b) = α P(b) Σ_e P(e) Σ_a P(a|b,e) P(J|a) Σ_m P(m|a)
  – The sum over m is identically 1: Σ_m P(m|a) = 1
  – M is irrelevant to the query
Irrelevant Variables (2/2)
• Theorem 1: for P(X|E),
  Y is irrelevant if Y ∉ Ancestors({X} ∪ E)
• In the example P(J|b)
  – X = JohnCalls, E = {Burglary}
  – Ancestors({X} ∪ E) = {Alarm, Earthquake}
  – so MaryCalls is irrelevant
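Theorem 1 suggests a simple preprocessing step: prune every variable outside Ancestors({X} ∪ E) before eliminating. A minimal sketch (the dict-of-parents representation is an assumption, not the slides' notation):

```python
def ancestors(parents, nodes):
    """All ancestors of 'nodes' in a DAG given as {child: [parents]}."""
    seen = set()
    stack = list(nodes)
    while stack:
        n = stack.pop()
        for p in parents.get(n, []):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def relevant(parents, query, evidence):
    """Variables kept for P(query | evidence): query, evidence, their ancestors."""
    keep = {query, *evidence}
    return keep | ancestors(parents, keep)
```

For the burglary network, `relevant({'A': ['B', 'E'], 'J': ['A'], 'M': ['A']}, 'J', {'B'})` keeps {J, B, A, E} and drops MaryCalls.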
Complexity
• Time and space cost of variable elimination are O(n·d^k)
  – n: no. of random variables
  – d: no. of discrete values
  – k: no. of parent nodes (k is critical)
• Polytrees: k is small, linear complexity
  – If k = 1, O(dn)
• Multiply connected networks:
  – O(n·d^k), k can be large
  – Can reduce 3SAT to variable elimination
    • NP-hard
  – Equivalent to counting 3SAT models
    • #P-complete, i.e. strictly harder than NP-complete problems
Pros and Cons
• Variable elimination is simple and efficient for a single query P(Xi | e)
• But it is less efficient if all the variables' posteriors are computed: P(X1 | e), …, P(Xk | e)
  – In a polytree network, one would need to issue O(n) queries costing O(n) each: O(n²)
• The junction tree algorithm extends variable elimination to compute posterior probabilities for all nodes simultaneously
3.1 An Example
• The Asia network
[Figure: Visit to Asia → Tuberculosis; Smoking → Lung Cancer, Bronchitis; Tuberculosis, Lung Cancer → Abnormality in Chest → X-Ray; Abnormality in Chest, Bronchitis → Dyspnea]
• We want to infer P(d)
• Need to eliminate: v, s, x, t, l, a, b
[Figure: the Asia network V, S, T, L, A, B, X, D]
• Initial factors:
  P(v,s,t,l,a,b,x,d) = P(v)P(s)P(t|v)P(l|s)P(b|s)P(a|t,l)P(x|a)P(d|a,b)
• “Brute force approach”:
  P(d) = Σ_x Σ_b Σ_a Σ_l Σ_t Σ_s Σ_v P(v,s,t,l,a,b,x,d)
• Complexity is exponential: O(K^N)
  – N: size of the graph, number of variables
  – K: number of states for each variable
• We want to infer P(d); still to eliminate: v, s, x, t, l, a, b
• Initial factors:
  P(v)P(s)P(t|v)P(l|s)P(b|s)P(a|t,l)P(x|a)P(d|a,b)
• Eliminate v: compute fv(t) = Σ_v P(v)P(t|v), giving
  fv(t)P(s)P(l|s)P(b|s)P(a|t,l)P(x|a)P(d|a,b)
• Note: fv(t) = P(t). In general, the result of elimination is not necessarily a probability term.
• We want to infer P(d); still to eliminate: s, x, t, l, a, b
• Current factors:
  fv(t)P(s)P(l|s)P(b|s)P(a|t,l)P(x|a)P(d|a,b)
• Eliminate s: compute fs(b,l) = Σ_s P(s)P(b|s)P(l|s), giving
  fv(t)fs(b,l)P(a|t,l)P(x|a)P(d|a,b)
• Summing on s results in fs(b,l)
  – A factor with two arguments
  – The result of elimination may be a function of several variables
• We want to infer P(d); still to eliminate: x, t, l, a, b
• Current factors:
  fv(t)fs(b,l)P(a|t,l)P(x|a)P(d|a,b)
• Eliminate x: compute fx(a) = Σ_x P(x|a), giving
  fv(t)fs(b,l)fx(a)P(a|t,l)P(d|a,b)
• Note: fx(a) = 1 for all values of a!
• We want to infer P(d); still to eliminate: t, l, a, b
• Current factors:
  fv(t)fs(b,l)fx(a)P(a|t,l)P(d|a,b)
• Eliminate t: compute ft(a,l) = Σ_t fv(t)P(a|t,l), giving
  fs(b,l)fx(a)ft(a,l)P(d|a,b)
• We want to infer P(d); still to eliminate: l, a, b
• Current factors:
  fs(b,l)fx(a)ft(a,l)P(d|a,b)
• Eliminate l: compute fl(a,b) = Σ_l fs(b,l)ft(a,l), giving
  fl(a,b)fx(a)P(d|a,b)
• We want to infer P(d); still to eliminate: a, b
• Current factors:
  fl(a,b)fx(a)P(d|a,b)
• Eliminate a: compute fa(b,d) = Σ_a fl(a,b)fx(a)P(d|a,b)
• Eliminate b: compute fb(d) = Σ_b fa(b,d)
• Different elimination ordering: a, b, x, t, v, s, l
• Initial factors:
  P(v)P(s)P(t|v)P(l|s)P(b|s)P(a|t,l)P(x|a)P(d|a,b)
• Intermediate factors in this order:
  ga(l,t,d,b,x,s,v), gb(l,t,d,x,s,v), gx(l,t,d,s,v), gt(l,d,s,v), gv(l,d,s), gs(l,d), gl(d)
• In the previous order:
  fv(t), fs(b,l), fx(a), ft(a,l), fl(a,b), fa(b,d), fb(d)
• Both need n = 7 steps, but each step has a different computation size
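The effect of the ordering on factor sizes can be checked by tracking scopes alone. A minimal sketch; note it records only the variables of the factors actually multiplied at each step, which is slightly different bookkeeping from the slide's g-terms:

```python
def elimination_scopes(scopes, order):
    """For each eliminated variable, the scope of the new factor it creates.

    'scopes' lists the variables of each initial factor (one tuple per CPT)."""
    factors = [set(s) for s in scopes]
    created = []
    for var in order:
        touched = [f for f in factors if var in f]
        merged = set().union(*touched) - {var}
        factors = [f for f in factors if var not in f] + [merged]
        created.append((var, tuple(sorted(merged))))
    return created

# Scopes of the Asia network's CPTs:
# P(v), P(s), P(t|v), P(l|s), P(b|s), P(a|t,l), P(x|a), P(d|a,b).
asia = [('v',), ('s',), ('t', 'v'), ('l', 's'), ('b', 's'),
        ('a', 't', 'l'), ('x', 'a'), ('d', 'a', 'b')]
```

With the order v, s, x, t, l, a, b no created factor has more than two variables, whereas eliminating a first immediately creates a five-variable factor.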
Short Summary
• Variable elimination is a sequence of rewriting operations
• Computation depends on
  – Number of variables n
    • Each elimination step removes one variable, so we need n elimination steps
  – Size of factors
    • Affected by the order of elimination
    • Discussed in sub-section 3.2
Dealing with Evidence (1/7)
[Figure: the Asia network V, S, T, L, A, B, X, D]
• How do we deal with evidence?
• Suppose we get evidence V = t, S = f, D = t
• We want to compute P(L, V = t, S = f, D = t)
Dealing with Evidence (2/7)
• We start by writing the factors:
  P(v)P(s)P(t|v)P(l|s)P(b|s)P(a|t,l)P(x|a)P(d|a,b)
• Since we know that V = t, we don’t need to eliminate V
• Instead, we can replace the factors P(V) and P(T|V) with
  fP(V) = P(V = t),  fP(T|V)(T) = P(T | V = t)
• These “select” the appropriate parts of the original factors given the evidence
• Note that fP(V) is a constant, and thus does not appear in elimination of other variables
Dealing with Evidence (3/7)
• Given evidence V = t, S = f, D = t
• Compute P(L, V = t, S = f, D = t)
• Initial factors, after setting evidence:
  fP(v) fP(s) fP(t|v)(t) fP(l|s)(l) fP(b|s)(b) P(a|t,l) P(x|a) fP(d|a,b)(a,b)
Dealing with Evidence (4/7)
• Given evidence V = t, S = f, D = t; compute P(L, V = t, S = f, D = t)
• Initial factors, after setting evidence:
  fP(v) fP(s) fP(t|v)(t) fP(l|s)(l) fP(b|s)(b) P(a|t,l) P(x|a) fP(d|a,b)(a,b)
• Eliminating x, we get
  fP(v) fP(s) fP(t|v)(t) fP(l|s)(l) fP(b|s)(b) P(a|t,l) fx(a) fP(d|a,b)(a,b)
Dealing with Evidence (5/7)
• Given evidence V = t, S = f, D = t; compute P(L, V = t, S = f, D = t)
• After setting evidence and eliminating x, eliminating t gives
  fP(v) fP(s) fP(l|s)(l) fP(b|s)(b) ft(a,l) fx(a) fP(d|a,b)(a,b)
Dealing with Evidence (6/7)
• Given evidence V = t, S = f, D = t; compute P(L, V = t, S = f, D = t)
• After eliminating x and t, eliminating a gives
  fP(v) fP(s) fP(l|s)(l) fP(b|s)(b) fa(b,l)
Dealing with Evidence (7/7)
• Given evidence V = t, S = f, D = t; compute P(L, V = t, S = f, D = t)
• After eliminating x, t, and a, eliminating b gives
  fP(v) fP(s) fP(l|s)(l) fb(l)
Complexity (1/2)
• Suppose in one elimination step we compute
  fx(y1, …, yk) = Σ_x f'x(x, y1, …, yk)
  f'x(x, y1, …, yk) = Π_{i=1}^{m} fi(x, y_{i,1}, …, y_{i,li})
• This requires (|X|: no. of discrete values of X)
  – m·|X|·Π_i |Yi| multiplications
    • For each value of x, y1, …, yk, we do m multiplications
  – |X|·Π_i |Yi| additions
    • For each value of y1, …, yk, we do |X| additions
Complexity (2/2)
• One elimination step requires
  – m·|X|·Π_i |Yi| multiplications
  – |X|·Π_i |Yi| additions
  – O(|X|·Π_i |Yi|), m is a constant (neglected)
  – Or O(d^k) if |X| = |Yi| = d, where k is the no. of parent nodes
• Time and space cost are O(n·d^k)
  – n: no. of random variables
  – d: no. of discrete values
  – k: no. of parent nodes
• Complexity is exponential in the number of parents k
3.2 Order of Elimination
• How to select “good” elimination orderings in order to reduce complexity?
  1. Start by understanding variable elimination via the graph we are working with
  2. Then reduce the problem of finding a good ordering to a well-understood graph-theoretic operation
Undirected Graph Conversion (1/2)
• At each stage of variable elimination, we have an algebraic term that we need to evaluate
• This term is of the form
  P(x1, …, xk) = Σ_{y1} … Σ_{yn} Π_i fi(Zi)
  where the Zi are sets of variables
Undirected Graph Conversion (2/2)
• Draw a graph with an undirected edge X–Y whenever
  – X and Y are arguments of some common factor
  – that is, whenever X, Y are both in some Z_i
Example
• Consider the “Asia” example
• The initial factors are
P(v) P(s) P(t|v) P(l|s) P(b|s) P(a|t,l) P(x|a) P(d|a,b)
• The corresponding undirected graph:
  [Figure: the Asia DAG and its undirected version, over nodes V, S, T, L, A, B, X, D]
• In the first step this graph is just the moralized graph
Variable Elimination
Change of Graph
P(v) P(s) P(t|v) P(l|s) P(b|s) P(a|t,l) P(x|a) P(d|a,b)
• Now we eliminate t, getting
  P(v) P(s) P(l|s) P(b|s) P(x|a) P(d|a,b) · f_t(v,a,l)
• The corresponding change in the graph: nodes V, L, A become a clique
  [Figure: the graph before and after eliminating T]
Example (1/6)
• Want to compute P(L,V=t,S=f,D=t)
• Moralizing
  [Figure: the Asia DAG and its moralized graph]
Example (2/6)
• Want to compute P(L,V=t,S=f,D=t)
• Moralizing
• Setting evidence
  [Figure: moralized graph with evidence nodes V, S, D marked]
Example (3/6)
• Want to compute P(L,V=t,S=f,D=t)
• Moralizing
• Setting evidence
• Eliminating x
  – New factor f_x(A)
  [Figure: graph after removing X]
Example (4/6)
• Want to compute P(L,V=t,S=f,D=t)
• Moralizing
• Setting evidence
• Eliminating x
• Eliminating a
  – New factor f_a(b,t,l): a clique in the reduced undirected graph
  [Figure: graph after removing A; B, T, L form a clique]
Example (5/6)
• Want to compute P(L,V=t,S=f,D=t)
• Moralizing
• Setting evidence
• Eliminating x
• Eliminating a
• Eliminating b
  – New factor f_b(t,l): a clique in the reduced undirected graph
  [Figure: graph after removing B]
Example (6/6)
• Want to compute P(L,V=t,S=f,D=t)
• Moralizing
• Setting evidence
• Eliminating x
• Eliminating a
• Eliminating b
• Eliminating t
  – New factor f_t(l)
  [Figure: graph after removing T]
Elimination and Clique (1/2)
• We can eliminate a variable X by
  1. For all Y, Z such that Y–X and Z–X, add an edge Y–Z
  2. Remove X and all edges adjacent to it
• This procedure creates a clique that contains all the neighbors of X
• After step 1 we have a clique that corresponds to the intermediate factor (before marginalization)
• The cost of the step is exponential in the size of this clique: the d^k factor in O(n · d^k)
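The two-step procedure above can be sketched on an adjacency-set representation; the function name and dictionary encoding are illustrative choices, with the moralized Asia graph as input:

```python
# Sketch of graph-side elimination: step 1 connects all neighbors of X
# (creating a clique), step 2 removes X and its edges.
from itertools import combinations

def eliminate_node(adj, x):
    nbrs = adj.pop(x)
    for y, z in combinations(nbrs, 2):  # step 1: add fill-in edges Y--Z
        adj[y].add(z)
        adj[z].add(y)
    for y in nbrs:                      # step 2: remove X's remaining edges
        adj[y].discard(x)
    return len(nbrs) + 1                # size of the clique {X} ∪ neighbors

# Moralized "Asia" graph over V, S, T, L, A, B, X, D.
adj = {'V': {'T'}, 'S': {'L', 'B'}, 'T': {'V', 'A', 'L'},
       'L': {'S', 'T', 'A'}, 'A': {'T', 'L', 'X', 'D', 'B'},
       'B': {'S', 'A', 'D'}, 'X': {'A'}, 'D': {'A', 'B'}}
size = eliminate_node(adj, 'T')
print(size, sorted(adj['V']))  # -> 4 ['A', 'L']  (V gains fill-in edges to A, L)
```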
Elimination and Clique (2/2)
• The process of eliminating nodes from
an undirected graph gives us a clue to
the complexity of inference
• To see this, we will examine the graph
that contains all of the edges we added
during the elimination
• The resulting graph is always chordal
Example (1/7)
• Want to compute P(D)
• Moralizing
  [Figure: the Asia DAG and its moralized graph]
Example (2/7)
• Want to compute P(D)
• Moralizing
• Eliminating v
  – Multiply to get f'_v(v,t)
  – Result f_v(t)
  [Figure: graph after removing V]
Example (3/7)
• Want to compute P(D)
• Moralizing
• Eliminating v
• Eliminating x
  – Multiply to get f'_x(a,x)
  – Result f_x(a)
  [Figure: graph after removing X]
Example (4/7)
• Want to compute P(D)
• Moralizing
• Eliminating v
• Eliminating x
• Eliminating s
  – Multiply to get f'_s(l,b,s)
  – Result f_s(l,b)
  [Figure: graph after removing S; L and B become connected]
Example (5/7)
• Want to compute P(D)
• Moralizing
• Eliminating v
• Eliminating x
• Eliminating s
• Eliminating t
  – Multiply to get f'_t(a,l,t)
  – Result f_t(a,l)
  [Figure: graph after removing T]
Example (6/7)
• Want to compute P(D)
• Moralizing
• Eliminating v
• Eliminating x
• Eliminating s
• Eliminating t
• Eliminating l
  – Multiply to get f'_l(a,b,l)
  – Result f_l(a,b)
  [Figure: graph after removing L]
Example (7/7)
• Want to compute P(D)
• Moralizing
• Eliminating v
• Eliminating x
• Eliminating s
• Eliminating t
• Eliminating l
• Eliminating a, b
  – Multiply to get f'_a(a,b,d)
  – Result f(d)
  [Figure: graph after removing A and B]
Induced Graphs
• The resulting graph is the induced graph (for this particular ordering)
  [Figure: the induced graph of the Asia network under the ordering above]
• Main property:
– Every maximal clique in the induced graph
corresponds to an intermediate factor in the
computation
– Every factor stored during the process is a subset of
some maximal clique in the graph
• These facts are true for any variable
elimination ordering on any network
Induced Width (Treewidth)
• The size k of the largest clique in the induced graph is
  – an indicator of the complexity of variable elimination
• w = k − 1 is called
  – the induced width (treewidth) of the graph
  – according to the specified ordering
• Finding a good ordering for a graph is equivalent to finding an ordering that achieves the minimal induced width of the graph
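A minimal sketch of measuring the induced width of a given ordering, using the clique-size-minus-one convention above; the function name and adjacency-set encoding are illustrative:

```python
# The induced width of an ordering is the largest neighbor set encountered
# while eliminating in that order (the clique {x} ∪ nbrs has one more node).
from itertools import combinations

def induced_width(graph, order):
    adj = {v: set(n) for v, n in graph.items()}  # work on a copy
    width = 0
    for x in order:
        nbrs = adj.pop(x)
        width = max(width, len(nbrs))            # = clique size - 1
        for y, z in combinations(nbrs, 2):       # add fill-in edges
            adj[y].add(z)
            adj[z].add(y)
        for y in nbrs:
            adj[y].discard(x)
    return width

chain = {'A': {'B'}, 'B': {'A', 'C'}, 'C': {'B', 'D'}, 'D': {'C'}}
print(induced_width(chain, ['A', 'B', 'C', 'D']))  # -> 1 (chains have treewidth 1)
print(induced_width(chain, ['B', 'C', 'A', 'D']))  # -> 2 (a worse ordering)
```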
Treewidth
• Low treewidth
  – Chains: w = 1
  – Trees (no loops): w = 1
• High treewidth
  – n×n grid (N = n×n nodes): w = O(n) = O(√N)
  – Loopy graphs (e.g. the ALARM medical diagnosis network): w = no. of parents
• Finding the treewidth is NP-hard [Arnborg85]
  [Figure: example chain, tree, n×n grid, and the ALARM network]
Complexity
• Time and space cost of variable elimination are O(n · d^k)
  – n: no. of random variables
  – d: no. of discrete values per variable
  – k: size of the largest clique = treewidth + 1 (w + 1)
• Polytrees: k is small, so inference is linear
  – If k = 1, O(d · n)
• Multiply connected networks: O(n · d^k) with large k
  – 3SAT can be reduced to variable elimination: NP-hard
  – Equivalent to counting 3SAT models: #P-complete, i.e. strictly harder than NP-complete problems
Elimination on Trees (1/3)
• Suppose we have a tree
  – A network where each variable has at most one parent
• Then all the factors involve at most two variables: treewidth = 1
• The moralized graph is also a tree
  [Figure: a tree network over A, B, C, D, E, F, G and its moralization]
Elimination on Trees (2/3)
• We can maintain the tree structure by eliminating leaf (extreme) variables of the tree
  [Figure: the tree shrinking as leaves are eliminated]
Elimination on Trees (3/3)
• Formally, for any tree there is an elimination ordering with treewidth = 1
• Theorem: inference on trees is linear in the number of variables: O(d · n)
Exercise: Variable Elimination
• Network structure: smart → prepared ← study; smart, prepared, fair → pass
• Priors: p(smart) = .8, p(study) = .6, p(fair) = .9
• p(prep | smart, study):
               smart   ¬smart
     study      .9       .7
     ¬study     .5       .1
• p(pass | smart, prep, fair):
            smart,prep  smart,¬prep  ¬smart,prep  ¬smart,¬prep
     fair      .9          .7           .7           .2
     ¬fair     .1          .1           .1           .1
• Query: what is the probability that a student studied, given that they pass the exam?
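A hedged check of the exercise by brute-force enumeration of P(study | pass). Assumptions: the placement of the negated rows/columns in the CPTs was reconstructed (the slide's negation marks were lost in extraction), and all variable and helper names here are illustrative.

```python
# Enumerate all 2^5 assignments of (smart, study, prep, fair, pass),
# weight each by the product of CPT entries, and marginalize.
from itertools import product

p_smart, p_study, p_fair = 0.8, 0.6, 0.9
# P(prep = 1 | smart, study), keyed by (smart, study)
p_prep = {(1, 1): 0.9, (0, 1): 0.7, (1, 0): 0.5, (0, 0): 0.1}
# P(pass = 1 | smart, prep, fair), keyed by (smart, prep, fair)
p_pass = {(1, 1, 1): 0.9, (1, 0, 1): 0.7, (0, 1, 1): 0.7, (0, 0, 1): 0.2,
          (1, 1, 0): 0.1, (1, 0, 0): 0.1, (0, 1, 0): 0.1, (0, 0, 0): 0.1}

def bern(p, v):  # P(binary variable = v) given P(= 1) = p
    return p if v else 1 - p

joint = {}       # P(study, pass), summing out smart, prep, fair
for sm, st, pr, fa, ps in product((0, 1), repeat=5):
    w = (bern(p_smart, sm) * bern(p_study, st) * bern(p_fair, fa)
         * bern(p_prep[(sm, st)], pr) * bern(p_pass[(sm, pr, fa)], ps))
    joint[(st, ps)] = joint.get((st, ps), 0.0) + w

p_study_given_pass = joint[(1, 1)] / (joint[(0, 1)] + joint[(1, 1)])
print(round(p_study_given_pass, 4))
```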
Variable Elimination Algorithm
• Let X_1, …, X_m be an ordering of the non-query variables, so the query is
  Σ_{X_1} Σ_{X_2} ⋯ Σ_{X_m} Π_j P(X_j | Parents(X_j))
• For i = m, …, 1
  – Leave in the summation for X_i only factors mentioning X_i
  – Multiply those factors, getting a factor that contains a number for each value of the variables mentioned, including X_i
  – Sum out X_i, getting a factor f that contains a number for each value of the variables mentioned, not including X_i
  – Replace the multiplied factors in the summation with f
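The loop above can be sketched as follows, assuming binary variables and representing each factor as a (variables, table) pair; all names are illustrative:

```python
# Minimal variable elimination over binary variables. A factor is
# (vars, table): vars is a tuple of names, table maps value-tuples to floats.
from itertools import product

def multiply(f, g):
    fv, ft = f
    gv, gt = g
    vs = fv + tuple(v for v in gv if v not in fv)
    table = {}
    for vals in product((0, 1), repeat=len(vs)):  # binary variables assumed
        env = dict(zip(vs, vals))
        table[vals] = (ft[tuple(env[v] for v in fv)] *
                       gt[tuple(env[v] for v in gv)])
    return vs, table

def sum_out(f, x):
    fv, ft = f
    vs = tuple(v for v in fv if v != x)
    table = {}
    for vals, p in ft.items():
        key = tuple(val for val, name in zip(vals, fv) if name != x)
        table[key] = table.get(key, 0.0) + p
    return vs, table

def variable_elimination(factors, order):
    for x in order:
        rel = [f for f in factors if x in f[0]]       # factors mentioning x
        factors = [f for f in factors if x not in f[0]]
        prod = rel[0]
        for f in rel[1:]:
            prod = multiply(prod, f)                   # multiply them together
        factors.append(sum_out(prod, x))               # sum out x, put f back
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    return result

# Tiny chain A -> B with P(A) and P(B|A); eliminating A yields P(B).
pa = (('A',), {(0,): 0.4, (1,): 0.6})
pba = (('A', 'B'), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
print(variable_elimination([pa, pba], ['A'])[1])  # P(B) ≈ {(0,): 0.48, (1,): 0.52}
```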
3.3 General Graphs
• If the graph is not a polytree
  – More general networks, usually loopy networks
• Can we do inference in loopy networks by variable elimination?
  – If the network has a cycle, the treewidth for any ordering is greater than 1
  – The complexity is high, so VE becomes an impractical algorithm
Example (1/2)
• Eliminating A, B, C, D, E, …
• The resulting graph is chordal with treewidth 2
  [Figure: the loopy network over A, B, C, D, E, F, G, H and four elimination snapshots]
Example (2/2)
• Eliminating H, G, E, C, F, D, B, A
• The resulting graph is chordal with treewidth 3
  [Figure: four elimination snapshots for this ordering]
Find Good Elimination Order
in General Graph
Theorem:
• Finding an ordering that minimizes the
treewidth is NP-Hard
However,
• There are reasonable heuristics for finding a "relatively" good ordering
• There are provable approximations to the best treewidth
• If the graph has a small treewidth, there are algorithms that find an ordering achieving it in polynomial time
Heuristics for
Finding an Elimination Order
• Since the elimination order is NP-hard to optimize,
• it is common to apply greedy search techniques [Kjaerulff90]
• At each iteration, eliminate the node that would result in the smallest
  – number of fill-in edges [min-fill]
  – resulting clique weight [min-weight] (weight of a clique = product of the number of states per node in the clique)
• There are also approximation algorithms [Amir01]
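A sketch of the min-fill rule under the same adjacency-set encoding as before (ties broken arbitrarily; the function names are illustrative):

```python
# Greedy min-fill: repeatedly eliminate the node whose elimination would
# add the fewest fill-in edges, adding those edges as we go.
from itertools import combinations

def fill_in_count(adj, x):
    return sum(1 for y, z in combinations(adj[x], 2) if z not in adj[y])

def min_fill_order(graph):
    adj = {v: set(n) for v, n in graph.items()}  # work on a copy
    order = []
    while adj:
        x = min(adj, key=lambda v: fill_in_count(adj, v))  # greedy choice
        order.append(x)
        nbrs = adj.pop(x)
        for y, z in combinations(nbrs, 2):  # add the fill-in edges
            adj[y].add(z)
            adj[z].add(y)
        for y in nbrs:
            adj[y].discard(x)
    return order

# Moralized "Asia" graph over V, S, T, L, A, B, X, D.
asia = {'V': {'T'}, 'S': {'L', 'B'}, 'T': {'V', 'A', 'L'},
        'L': {'S', 'T', 'A'}, 'A': {'T', 'L', 'X', 'D', 'B'},
        'B': {'S', 'A', 'D'}, 'X': {'A'}, 'D': {'A', 'B'}}
print(min_fill_order(asia))  # ties are broken arbitrarily
```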
Factorization in Loopy Networks
• Probabilistic models with no loop are tractable: factorizable
  Σ_a Σ_b Σ_c Σ_d P(a,x) P(b,x) P(c,x) P(d,x)
  = [Σ_a P(a,x)] · [Σ_b P(b,x)] · [Σ_c P(c,x)] · [Σ_d P(d,x)]
  [Figure: star-shaped model with x connected to a, b, c, d]
• Probabilistic models with a loop are not tractable: not factorizable
  Σ_a Σ_b Σ_c Σ_d P(a,b,c,d,x)
  [Figure: loopy model over a, b, c, d, x]
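A small numeric check of the factorization claim above: with no loop, the four nested sums distribute into a product of four independent sums. The random toy potentials and variable names are illustrative only.

```python
# Compare the naive 3^4-term nested sum with the factorized product of
# four 3-term sums; with no loop they are equal (up to float rounding).
import random

random.seed(0)
vals = range(3)
# Four edge potentials f_a(., x), ..., f_d(., x) evaluated at a fixed x.
fs = [{u: random.random() for u in vals} for _ in range(4)]

naive = sum(fs[0][a] * fs[1][b] * fs[2][c] * fs[3][d]
            for a in vals for b in vals for c in vals for d in vals)
factored = 1.0
for f in fs:
    factored *= sum(f.values())  # one small sum per factor
print(abs(naive - factored) < 1e-9)  # -> True
```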
Short Summary
• Variable elimination
– Actual computation is done in elimination
step
– Computation depends on order of
elimination
– Very sensitive to topology
– Space cost is of the same order as time cost
• Complexity
– Polytrees: Linear time
– General graphs: NP-hard
4. Belief Propagation
• Also called
– Message passing
– Pearl’s algorithm
• Subsections
– 4.1 Message passing in simple chains
– 4.2 Message passing in trees
– 4.3 BP Algorithm
– 4.4 Message passing in general graphs
What’s Wrong with VarElim
• Often we want to query all hidden nodes
• Variable elimination takes O(N² · d^k) time to compute P(X_i | e) for all (hidden) nodes X_i
• Message passing algorithms can do this in O(N · d^k) time
Repeated Variable Elimination Leads to Redundant Calculations
[Figure: chain X1 → X2 → X3 with observations Y1, Y2, Y3]
P(x_1 | y_{1:3}) ∝ P(x_1) P(y_1|x_1) Σ_{x_2} P(x_2|x_1) P(y_2|x_2) Σ_{x_3} P(x_3|x_2) P(y_3|x_3)
P(x_2 | y_{1:3}) ∝ P(y_2|x_2) Σ_{x_1} P(x_2|x_1) P(x_1) P(y_1|x_1) Σ_{x_3} P(x_3|x_2) P(y_3|x_3)
P(x_3 | y_{1:3}) ∝ P(y_3|x_3) Σ_{x_2} P(x_3|x_2) P(y_2|x_2) Σ_{x_1} P(x_1) P(y_1|x_1) P(x_2|x_1)
• O(N² K²) time to compute all N marginals
Belief Propagation
• Belief propagation (BP) operates by sending
beliefs/messages between nearby variables in
the graphical model
• It works like variable elimination
4.1 Message Passing
in Simple Chains
X1 ... Xk ... Xn
• Likelihood query (query without evidence)
– P(X1), P(Xn), P(Xk)
– P(Xj , Xk)
• Posterior query (query with evidence)
– P(X1|Xn), P(Xn|X1),
– P(Xk|X1), P(Xk|Xn),
– P(X1|Xk), P(Xn|Xk),
– P(Xk|Xj)
• Maximum A Posteriori (MAP) query
  – arg max_{x_k} P(X_k = x_k | X_j)
Sum-Product of the Simple Chain
(1/2)
X1 ... Xk ... Xn
P(X_k) = Σ_{X_1 ⋯ X_{k−1}} Σ_{X_{k+1} ⋯ X_n} P(X_1, …, X_k, …, X_n)
       = Σ_{X_1 ⋯ X_{k−1}} Σ_{X_{k+1} ⋯ X_n} Π_i P(X_i | Pa(X_i))
       = Σ_{X_1 ⋯ X_{k−1}} Σ_{X_{k+1} ⋯ X_n} P(X_n|X_{n−1}) ⋯ P(X_k|X_{k−1}) ⋯ P(X_2|X_1) P(X_1)
       = [Σ_{X_{k−1}} P(X_k|X_{k−1}) ⋯ Σ_{X_2} P(X_3|X_2) Σ_{X_1} P(X_2|X_1) P(X_1)]
         · [Σ_{X_{k+1}} P(X_{k+1}|X_k) ⋯ Σ_{X_n} P(X_n|X_{n−1})]
Sum-Product of the Simple Chain
(2/2)
X1 ... Xk ... Xn
P(X_k | X_j) ∝ P(X_j, X_k) = Σ_{{X_i : 1 ≤ i ≤ n, i ≠ j, k}} P(X_1, …, X_n)
  = Σ_{{X_i : i ≠ j, k}} Π_{i=1..n} P(X_i | Pa(X_i))
  = Σ_{{X_i : i ≠ j, k}} P(X_n|X_{n−1}) ⋯ P(X_k|X_{k−1}) ⋯ P(X_2|X_1) P(X_1)
4.1.1 Likelihood Query
• P(Xn) or P(xn) : Forward passing
X1 X2 X3 ... Xn
• P(X1) or P(x1) : Backward passing
X1 X2 X3 ... Xn
• P(Xk) or P(xk) : Forward-Backward passing
X1 X2 ... Xk ... Xn
Forward Passing (1/6)
A → B → C → D → E
• P(e)
  P(e) = Σ_d Σ_c Σ_b Σ_a P(a) P(b|a) P(c|b) P(d|c) P(e|d)
       = Σ_d P(e|d) Σ_c P(d|c) Σ_b P(c|b) Σ_a P(a) P(b|a)
Forward Passing (2/6)
A → B → C → D → E
• We can perform the innermost summation, producing the message m_{A→B}(b):
  P(e) = Σ_d P(e|d) Σ_c P(d|c) Σ_b P(c|b) Σ_a P(a) P(b|a)
       = Σ_d P(e|d) Σ_c P(d|c) Σ_b P(c|b) P(b)
• This summation is exactly
  – a variable elimination step
  – We say we send the CPT P(b) to compute the next innermost summation
  – The sent CPT P(b) is called a belief, or message:
    m_{A→B}(b) = P(b) = Σ_a P(a) P(b|a) = Σ_a f(a,b)
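The forward pass above can be sketched as repeated one-step elimination; the chain CPT values and the helper name are illustrative:

```python
# Forward message passing on the chain A -> B -> C -> D -> E: each step
# computes m_next(j) = sum_i m_prev(i) * P(next = j | prev = i).
def forward(prior, cpts):
    msg = list(prior)
    for cpt in cpts:                     # cpt[i][j] = P(next = j | prev = i)
        msg = [sum(msg[i] * cpt[i][j] for i in range(len(msg)))
               for j in range(len(cpt[0]))]
    return msg

# Illustrative binary chain with the same CPT on every link.
p_a = [0.3, 0.7]
step = [[0.9, 0.1], [0.2, 0.8]]
p_e = forward(p_a, [step] * 4)           # four links: A->B, B->C, C->D, D->E
print([round(v, 5) for v in p_e])        # -> [0.57863, 0.42137]
```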