Agenda
01 Why do you need PGMs?
02 What is a PGM?
03 Bayesian Networks
04 Markov Random Fields
05 Use Cases
06 Belief Networks & Markov Random Fields
07 PGMs & Neural Networks
Why do you need Probabilistic Graphical Models?
Probabilistic Graphical Models are rich frameworks for encoding
probability distributions over complex domains.
01 Compact Graphical Representation
PGMs are frameworks used to create and represent compact graphical models of complex real-world scenarios.
02 Intuitive Diagrams of Complex Relationships
PGMs give us intuitive diagrams of complex relationships between stochastic variables.
03 Convenient from a Computational Aspect
PGMs are also convenient from a computational point of view, since we already have algorithms for working with graphs and statistics.
04 Dynamic Simulation of Models
Using PGMs we can simulate the dynamics of industrial systems, create models, and much more.
What is a Probabilistic Graphical Model?
Consider you have 4 binary (Yes/No) variables:
• WC — Spot in the World Cup (Yes/No)
• P — Performance in the Pre-WC Tour (Yes/No)
• G — Good Genetics (Yes/No)
• F — Good Form (Yes/No)
Components of a Graphical Model
• Nodes = Random Variables (WC, G, P, F)
• Edges = Inter-Nodal Dependencies
So, What is a PGM?
Break the name down: P — Probabilistic, G — Graphical, M — Models.
P — Probabilistic
The nature of the problems we generally want to solve, and the types of queries we want to make, are probabilistic because of uncertainty. Many factors contribute to that uncertainty.
G — Graphical
A graphical representation helps us visualise the problem better. We use graph theory to reduce the number of relevant combinations of all the participating variables, so that the high-dimensional probability distribution can be represented more compactly.
M — Models
A model is a declarative representation of a real-world scenario or problem that we want to analyse. It can be represented using any mathematical tool, such as a graph, or even simply an equation.
P G M — Putting it together
A Probabilistic Graphical Model (PGM) is a technique for compactly representing a joint distribution by exploiting dependencies between the random variables. It also allows us to do inference on joint distributions in a computationally cheaper way than traditional methods.
Probability
Suppose a bag of outcomes contains 5 As, 6 Bs and 6 Cs.
What is the probability of A?
Solution:
• Count all As and divide by the total number of possibilities.
• P(A) = #A / (#A + #B + #C)
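A minimal sketch of this count-and-divide in Python (the counts match the example above):

```python
from collections import Counter

# Bag of outcomes from the example: 5 As, 6 Bs, 6 Cs
outcomes = ["A"] * 5 + ["B"] * 6 + ["C"] * 6

counts = Counter(outcomes)
p_a = counts["A"] / len(outcomes)  # P(A) = #A / (#A + #B + #C)
print(p_a)                         # 5/17 ≈ 0.294
```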
Conditional Probability
Now suppose the As and Bs can overlap.
What is the probability of A & B?
Solution:
• For both to occur, B must occur when A is already happening.
• P(A&B) = P(A) * P(B|A) = P(B) * P(A|B)
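A quick numeric check of that product rule (the probability values below are made up for illustration):

```python
p_a = 0.4          # P(A), illustrative value
p_b_given_a = 0.5  # P(B|A), illustrative value
p_b = 0.25         # P(B), illustrative value

p_ab = p_a * p_b_given_a   # P(A&B) = P(A) * P(B|A)
p_a_given_b = p_ab / p_b   # consistent with P(A&B) = P(B) * P(A|B)
print(p_ab, p_a_given_b)   # 0.2 0.8
```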
Joint, Conditional and Marginal Distributions
• The Joint Probability Distribution describes how two or more variables are distributed simultaneously. To get a probability from the joint distribution of A and B, you would consider P(A=a and B=b).
• The Conditional Probability Distribution looks at how the probabilities of A are distributed given a certain value for B, P(A=a | B=b).
• The Marginal Probability Distribution is the one that results from summing (or integrating) over one variable to get the probability distribution of the other.
For example, the marginal probability distribution of A when A & B are related would be given by:
P(a) = ∫ P(a|b) P(b) db, integrating over B.
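For discrete variables the integral becomes a sum. A small numpy sketch (the joint table values are illustrative):

```python
import numpy as np

# Illustrative joint distribution P(A, B) for binary A (rows) and B (columns)
joint = np.array([[0.30, 0.10],
                  [0.20, 0.40]])

p_a = joint.sum(axis=1)    # marginal P(A): sum over B
p_b = joint.sum(axis=0)    # marginal P(B): sum over A
p_a_given_b = joint / p_b  # conditional P(A|B): divide each column by P(B)

print(p_a)                 # [0.4 0.6]
print(p_a_given_b)         # each column sums to 1
```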
Bayesian Networks

Bayesian Probability
We often want P(One Event | Another Event). We have seen earlier:
• P(A&B) = P(A) * P(B|A) = P(B) * P(A|B)
From here, we can isolate either P(B|A) or P(A|B) and compute it from simpler probabilities.
Bayes' Theorem
Rearranging the identity above gives:
• P(A|B) = P(B|A) * P(A) / P(B)
• P(B|A) = P(A|B) * P(B) / P(A)
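The same rearrangement in code, reusing the illustrative values from the product-rule sketch:

```python
p_a, p_b, p_b_given_a = 0.4, 0.25, 0.5  # illustrative values

p_a_given_b = p_b_given_a * p_a / p_b   # Bayes' Theorem
print(p_a_given_b)                      # 0.8, matching the earlier check
```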
Bayes Network
A Bayes network is a structure that can be represented as a Directed Acyclic Graph.
1. It allows a compact representation of the distribution via the chain rule for Bayes networks.
2. It encodes conditional independence relationships between random variables.
A DAG (Directed Acyclic Graph) is a finite directed graph with no directed cycles. For our four variables the DAG is G → P, F → P, P → WC, so the chain rule gives P(G, F, P, WC) = P(G) P(F) P(P | G, F) P(WC | P).
Bayes Network: Example
Each node carries a probability table:

Genes   P(Genes)
Good    0.2
Bad     0.8

In Form   P(Form)
Yes       0.7
No        0.3

P(Performance | Genes, Form):
Condition               Bad    Okay   Brilliant
Good Genes, Good Form   0.5    0.3    0.2
Good Genes, Bad Form    0.8    0.15   0.05
Bad Genes, Good Form    0.8    0.1    0.1
Bad Genes, Bad Form     0.9    0.08   0.02

P(Spot | Performance):
Condition               No Spot   Spot
Bad Performance         0.95      0.05
Okay Performance        0.8       0.2
Brilliant Performance   0.5       0.5
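A sketch of this network in Python using the pgmpy library (assuming a recent pgmpy is installed; the column ordering of the CPD tables follows pgmpy's parent-combination convention and is worth double-checking):

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD

# Structure: Genetics (G) and Form (F) influence Performance (P),
# which influences the World Cup spot (WC).
model = BayesianNetwork([("G", "P"), ("F", "P"), ("P", "WC")])

cpd_g = TabularCPD("G", 2, [[0.2], [0.8]])  # states 0=Good, 1=Bad
cpd_f = TabularCPD("F", 2, [[0.7], [0.3]])  # states 0=Yes (in form), 1=No
cpd_p = TabularCPD(                          # states 0=Bad, 1=Okay, 2=Brilliant
    "P", 3,
    [[0.5, 0.8, 0.8, 0.9],    # P(P=Bad | G,F) per parent combo GG/GF, GG/BF, BG/GF, BG/BF
     [0.3, 0.15, 0.1, 0.08],  # P(P=Okay | G,F)
     [0.2, 0.05, 0.1, 0.02]], # P(P=Brilliant | G,F)
    evidence=["G", "F"], evidence_card=[2, 2],
)
cpd_wc = TabularCPD(                         # states 0=No Spot, 1=Spot
    "WC", 2,
    [[0.95, 0.8, 0.5],   # P(WC=No | P)
     [0.05, 0.2, 0.5]],  # P(WC=Yes | P)
    evidence=["P"], evidence_card=[3],
)

model.add_cpds(cpd_g, cpd_f, cpd_p, cpd_wc)
assert model.check_model()  # validates structure and that each CPD sums to 1
```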
What should you think about?
• Does a spot in the WC team depend on Genetics?
• Does a spot in the WC team depend on Genetics if you know someone is in good form?
• Does a spot in the WC team depend on Genetics if you know the performance in the Pre-WC Tour?
How this works:
• Each node in the Bayes network has a CPD (Conditional Probability Distribution) associated with it.
• If the node has parents, the associated CPD represents P(value | parents' values).
• If a node has no parents, the CPD represents P(value), the unconditional probability of the value.
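With the pgmpy model sketched above, we can probe those questions with exact inference (a sketch; the printed numbers are whatever the CPDs imply):

```python
from pgmpy.inference import VariableElimination

infer = VariableElimination(model)

# Marginally, WC depends on G: influence flows along G -> P -> WC.
print(infer.query(["WC"], evidence={"G": 0}))  # G=0 is "Good" genes
print(infer.query(["WC"], evidence={"G": 1}))  # G=1 is "Bad" genes

# Conditioned on the performance P, genetics adds nothing about WC:
print(infer.query(["WC"], evidence={"P": 0, "G": 0}))
print(infer.query(["WC"], evidence={"P": 0, "G": 1}))  # same distribution
```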
Markov Random Fields
Undirected Graphical Models
Consider an undirected graph over the variables A, B, C, D, E.
• P(A,B,C,D,E) ∝ ϕ(A,B) ϕ(B,C) ϕ(B,D) ϕ(C,E) ϕ(D,E)
The probability distribution of the variables in the graph can be factorised into individual clique potential functions.
Cliques
• P(A,B,C,D,E) ∝ ϕ(A,B) ϕ(B,C) ϕ(B,D) ϕ(C,E) ϕ(D,E)
In general, P(X) = (1/Z) ∏_{c ∈ cliques(G)} ϕ_c(x_c), where the ϕ_c are the clique potential functions and Z is the normalising constant.
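A sketch of this factorisation with pgmpy's discrete factors (the potential values are made up; normalising divides by Z):

```python
from pgmpy.factors.discrete import DiscreteFactor

# Illustrative pairwise potentials over binary variables A..E
phi_ab = DiscreteFactor(["A", "B"], [2, 2], [1.0, 0.5, 0.5, 2.0])
phi_bc = DiscreteFactor(["B", "C"], [2, 2], [2.0, 1.0, 1.0, 2.0])
phi_bd = DiscreteFactor(["B", "D"], [2, 2], [1.0, 2.0, 2.0, 1.0])
phi_ce = DiscreteFactor(["C", "E"], [2, 2], [1.5, 1.0, 1.0, 1.5])
phi_de = DiscreteFactor(["D", "E"], [2, 2], [1.0, 3.0, 3.0, 1.0])

# Unnormalised joint: product of all clique potentials
joint = phi_ab * phi_bc * phi_bd * phi_ce * phi_de
joint.normalize()  # divides by Z so the 2**5 entries sum to 1
print(joint)
```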
The same distribution can also be grouped into larger (maximal) cliques:
• P(A,B,C,D,E) ∝ ϕ(A,B) ϕ(B,C,D) ϕ(C,D,E)
P(X) = (1/Z) ∏_{c ∈ cliques(G)} ϕ_c(x_c)
Markov Random Fields
• Paths between A and C:
  A–B–C
  A–B–D–E–C
• Any two subsets of variables are conditionally independent given a separating subset.
• Here {B,D}, {B,E} and {B,D,E} are separating subsets for A and C.
Use Cases

Applications of PGMs
• Google's search is built on a very simple graph algorithm called PageRank.
• Netflix, Amazon and Facebook all use PGMs to recommend what is best for you.
• PGMs can also apparently infer whether someone is a Modi supporter or a Kejriwal supporter.
• FiveThirtyEight is a company that makes predictions about American presidential polls using PGMs.
Belief Networks & Markov Random Fields
Bayes Nets as MRFs
Bayes network (A → B):  P(A,B) = P(A) * P(B|A)
MRF (A — B):            P(A,B) ∝ ϕ(A,B)
Bayes Nets as MRFs: Chains
Bayes network (A → B → C):  P(A,B,C) = P(A) P(B|A) P(C|B)
MRF (A — B — C):            P(A,B,C) ∝ ϕ(A,B) ϕ(B,C)
One valid assignment of potentials:
ϕ(A,B) ← P(A) P(B|A)
ϕ(B,C) ← P(C|B)
The parameterization is not unique.
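A quick numpy check that this assignment reproduces the chain joint (the CPD values are illustrative):

```python
import numpy as np

# Illustrative CPDs for binary A, B, C in the chain A -> B -> C
p_a = np.array([0.6, 0.4])            # P(A)
p_b_given_a = np.array([[0.7, 0.3],   # P(B|A=0)
                        [0.2, 0.8]])  # P(B|A=1)
p_c_given_b = np.array([[0.9, 0.1],   # P(C|B=0)
                        [0.5, 0.5]])  # P(C|B=1)

# Chain-rule joint: P(A,B,C) = P(A) P(B|A) P(C|B)
joint = p_a[:, None, None] * p_b_given_a[:, :, None] * p_c_given_b[None, :, :]

# MRF potentials: phi(A,B) <- P(A) P(B|A), phi(B,C) <- P(C|B)
phi_ab = p_a[:, None] * p_b_given_a
phi_bc = p_c_given_b
mrf_joint = phi_ab[:, :, None] * phi_bc[None, :, :]

assert np.allclose(joint, mrf_joint)  # same distribution; Z = 1 here
```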
Bayes Nets as MRFs: Shared Parents
Bayes network (B ← A → C):  P(A,B,C) = P(A) P(B|A) P(C|A)
MRF (B — A — C):            P(A,B,C) ∝ ϕ(A,B) ϕ(A,C)
ϕ(A,B) ← P(A) P(B|A)
ϕ(A,C) ← P(C|A)
Bayes Nets as MRFs: Shared Child
Bayes network (A → C ← B):  P(A,B,C) = P(A) P(B) P(C|A,B)
  A and B are dependent given C.
Pairwise MRF (A — C — B):   P(A,B,C) ∝ ϕ(A,C) ϕ(B,C)
  A and B are independent given C, so this MRF cannot capture the Bayes network's dependence structure.
Converting Bayes Nets to MRFs: Moralizing Parents
Going from directed to undirected:
• Moralize all co-parents: connect every pair of parents that share a child, so {A,B,C} becomes a single clique with potential ϕ(A,B,C).
• We lose the marginal independence of the parents in the process.
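pgmpy can perform this conversion for us; a sketch, assuming its `to_markov_model` method (which moralizes the DAG) is available in your version:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD

# v-structure A -> C <- B with illustrative CPDs
bn = BayesianNetwork([("A", "C"), ("B", "C")])
bn.add_cpds(
    TabularCPD("A", 2, [[0.5], [0.5]]),
    TabularCPD("B", 2, [[0.5], [0.5]]),
    TabularCPD("C", 2, [[0.9, 0.6, 0.6, 0.1],
                        [0.1, 0.4, 0.4, 0.9]],
               evidence=["A", "B"], evidence_card=[2, 2]),
)

mrf = bn.to_markov_model()  # moralization: co-parents A and B get connected
print(mrf.edges())          # includes the moral edge ('A', 'B')
```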
PGMs & Neural Networks

The Boyfriend Problem
• You have N common friends with the person you want to date.
• An N-bit vector {0,1,0,1,…} encodes your plan: 1 means a friend boosts you, 0 means no contact.
• Objective: find the set of friends you should ask to boost you, i.e. the best vector (a brute-force sketch follows below).
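Since the setup is abstract, here is a hedged brute-force sketch; `score` is a hypothetical stand-in for whatever model estimates the chance of a yes given which friends boost you:

```python
from itertools import product

N = 4  # number of common friends (small, so we can enumerate all 2**N vectors)

def score(vector):
    # Hypothetical scoring function: each boosting friend helps,
    # but asking too many friends starts to look desperate.
    boosts = sum(vector)
    return boosts - 0.3 * boosts ** 2

best = max(product([0, 1], repeat=N), key=score)
print(best)  # the bit vector of friends to ask
```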
Solution 1: Neural Networks
(Diagram: inputs from the N friends feed a hidden layer of M units, "More Friends / Happening Social Life", which produces a Yes/No output for "Me".)
But, why?
Solution 2: Probabilistic Graphical Models
(Diagram: "Me" in the centre; the impression is a hidden random variable; approvals from the N friends, a bit vector such as 0,1,0,1,0,0,1,1, are observations from N nodes; the date acceptance is the observable event.)
Model the impression as a hidden random variable, each friend's approval as an observation from one of N nodes, and the date acceptance as the observable event. Assuming the approvals are conditionally independent given the impression:
P(all approvals | impression) = ∏_i P(approval_i | impression)
Bayes' theorem then turns the per-friend terms P(approval from friend i | impression) into the quantity we want, P(impression | vector of all approvals).
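A minimal sketch of that posterior update, assuming a binary impression and made-up per-friend approval probabilities:

```python
import numpy as np

prior = np.array([0.5, 0.5])  # P(impression): [bad, good] (assumed uniform)
# P(approval_i = 1 | impression) for each of N = 4 friends (illustrative)
p_approve = np.array([[0.2, 0.1, 0.3, 0.2],   # given a bad impression
                      [0.9, 0.7, 0.8, 0.6]])  # given a good impression

approvals = np.array([1, 0, 1, 1])  # observed approval vector

# Likelihood: product over friends of P(approval_i | impression)
lik = np.prod(np.where(approvals == 1, p_approve, 1 - p_approve), axis=1)
posterior = prior * lik
posterior /= posterior.sum()  # normalise (Bayes' theorem)
print(posterior)              # P(impression | all approvals) ≈ [0.077, 0.923]
```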