Introduction Hidden Markov Random Field BFGS (Broyden, Fletcher, Goldfarb and Shanno) algorithm Experimental Results Conclusion & Perspective
1st Mediterranean Conference on Pattern Recognition and Artificial Intelligence
EL-Hachemi Guerrout, Samy Ait-Aoudia, Dominique Michelucci, Ramdane Mahiou
Hidden Markov Random Field model and BFGS algorithm for Brain Image Segmentation
LMCS Laboratory, ESI, Algeria & LE2I Laboratory, UB, France
1 Introduction
2 Hidden Markov Random Field
3 BFGS (Broyden, Fletcher, Goldfarb and Shanno) algorithm
4 Experimental Results
5 Conclusion & Perspective
Problem & Solution
1 Nowadays, we face a huge number of medical images
2 Manual analysis and interpretation has become a tedious task
3 Automatic image analysis and interpretation is a necessity
4 To simplify the representation of an image into items that are meaningful and easier to analyze, we need a segmentation method
Segmentation methods
Segmentation methods can be classified into four main categories:
1 Threshold-based methods
2 Region-based methods
3 Model-based methods
4 Classification methods
1 HMRF - Hidden Markov Random Field
2 etc.
We have chosen HMRF as the model to perform segmentation
Hidden Markov Random Field
1 HMRF provides an elegant way to model the segmentation problem
2 HMRF is a generalization of the Hidden Markov Model
3 Each pixel is seen as a realization of a Markov random variable
4 Each image is seen as a realization of a set or family of Markov random variables
Hidden Markov Random Field
The image to segment, y = {y_s}_{s∈S}, into K classes is a realization of Y:
1 Y = {Y_s}_{s∈S} is a family of random variables
2 y_s ∈ [0...255]
The image segmented into K classes, x = {x_s}_{s∈S}, is a realization of X:
1 X = {X_s}_{s∈S} is a family of random variables
2 x_s ∈ {1,...,K}
An example of segmentation into K = 4 classes
The goal of HMRF is to look for x*:

  x* = argmax_{x ∈ Ω} P[X = x | Y = y]
Hidden Markov Random Field
1 This elegant model leads to the optimization of an energy function

  Ψ(x, y) = Σ_{s∈S} [ ln(σ_{x_s}) + (y_s − μ_{x_s})² / (2σ²_{x_s}) ] + (β/T) Σ_{c₂={s,t}} (1 − 2δ(x_s, x_t))

2 Our way to look for the minimization of Ψ(x, y) is to look for the minimization of Ψ(μ), μ = (μ_1, ..., μ_K), where μ_i is the mean of the gray values of class i
3 The main idea is to focus on adjusting the means instead of adjusting the pixels
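As an illustration, the energy above can be evaluated directly on a labeled image. A minimal Python sketch follows; the function name `hmrf_energy` and the choice of horizontal/vertical pixel pairs as the cliques c₂ are our own assumptions, not specified by the slides:

```python
import math

def hmrf_energy(x, y, mu, sigma, beta=1.0, T=1.0):
    """Psi(x, y): Gaussian data term plus a Potts smoothness term
    over horizontal and vertical pixel pairs (cliques c2 = {s, t})."""
    h, w = len(x), len(x[0])
    # Data term: sum over pixels s of ln(sigma_{x_s}) + (y_s - mu_{x_s})^2 / (2 sigma_{x_s}^2)
    data = sum(
        math.log(sigma[x[i][j]])
        + (y[i][j] - mu[x[i][j]]) ** 2 / (2.0 * sigma[x[i][j]] ** 2)
        for i in range(h) for j in range(w)
    )
    # Potts term: (1 - 2*delta(x_s, x_t)) is -1 for equal labels, +1 otherwise
    potts = 0
    for i in range(h):
        for j in range(w):
            if j + 1 < w:
                potts += 1 - 2 * (x[i][j] == x[i][j + 1])
            if i + 1 < h:
                potts += 1 - 2 * (x[i][j] == x[i + 1][j])
    return data + (beta / T) * potts
```

A labeling whose class means match the gray values and whose labels are spatially coherent gets a low energy, which is exactly what the minimization exploits.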
Hidden Markov Random Field
1 Now, we seek μ*:

  μ* = argmin_{μ ∈ [0...255]^K} Ψ(μ)
  Ψ(μ) = Σ_{j=1}^{K} f(μ_j)
  f(μ_j) = Σ_{s∈S_j} [ ln(σ_j) + (y_s − μ_j)² / (2σ²_j) ] + (β/T) Σ_{c₂={s,t}} (1 − 2δ(x_s, x_t))

2 To apply optimization techniques, we redefine the function Ψ(μ) for μ ∈ R^K instead of μ ∈ [0...255]^K. For that, we distinguish two forms, Ψ₁(μ) and Ψ₂(μ).
Hidden Markov Random Field
Form 1

  Ψ₁(μ) = { Ψ(μ)  if μ ∈ [0...255]^K
          { +∞    otherwise

Ψ₁ treats all points outside [0...255] in the same way

Form 2

  Ψ₂(μ) = Σ_{j=1}^{K} F(μ_j), where μ_j ∈ R

  F(μ_j) = { f(0) − μ_j            if μ_j < 0
           { f(μ_j)                if μ_j ∈ [0...255]
           { f(255) + (μ_j − 255)  if μ_j > 255
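Form 2 extends each per-class term f linearly outside [0...255], so the optimizer is always pushed back toward the valid interval. A small Python sketch of this extension; the helper name `extend_form2` is ours:

```python
def extend_form2(f, lo=0.0, hi=255.0):
    """Return F, the Form-2 extension of f to all of R: F agrees with f
    on [lo, hi] and grows linearly as mu_j moves away from the interval."""
    def F(m):
        if m < lo:
            return f(lo) + (lo - m)   # f(0) - mu_j        for mu_j < 0
        if m > hi:
            return f(hi) + (m - hi)   # f(255) + (mu_j - 255) for mu_j > 255
        return f(m)
    return F
```

Unlike Form 1's +∞ penalty, this extension stays finite and has a well-defined slope outside the interval, which is friendlier to derivative-based methods such as BFGS.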
BFGS (Broyden, Fletcher, Goldfarb and Shanno) algorithm
1 BFGS is one of the most powerful methods for solving unconstrained optimization problems
2 BFGS is the most popular quasi-Newton method
3 BFGS is based on gradient descent to reach a local minimum
4 The main idea of gradient descent is:
1 We start from an initial point μ_0
2 At iteration k+1, the point μ_{k+1} is computed from the point μ_k according to the formula: μ_{k+1} = μ_k + α_k d_k
- α_k is the step size at iteration k
- d_k is the search direction at iteration k
BFGS (Broyden, Fletcher, Goldfarb and Shanno) algorithm
Summary of the BFGS algorithm
1 Initialization: set k := 0,
choose μ_0 close to the solution,
set H_0 := I, set α_0 := 1,
choose the required accuracy ε ∈ R, ε > 0
2 At iteration k:
compute the Hessian matrix approximation H_k,
compute the inverse of the Hessian matrix,
compute the search direction d_k,
compute the step size α_k,
compute the point μ_{k+1}
3 Stopping criterion: if ||Ψ′(μ_k)|| < ε then μ̂ := μ_k
4 k := k + 1
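The paper uses a library implementation of BFGS; purely as an illustration of the loop summarized above, here is a minimal self-contained Python sketch. The backtracking line search and the curvature safeguard are our own simplifications of the standard method, not details taken from the slides:

```python
def bfgs(f, grad, x0, tol=1e-6, max_iter=100):
    """Minimal BFGS: maintain H, an approximation of the inverse Hessian,
    and iterate x_{k+1} = x_k + alpha_k * d_k with d_k = -H_k * grad."""
    n = len(x0)
    H = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # H_0 = I
    x = list(x0)
    g = grad(x)
    for _ in range(max_iter):
        if max(abs(gi) for gi in g) < tol:   # stopping criterion on the gradient
            break
        d = [-sum(H[i][j] * g[j] for j in range(n)) for i in range(n)]
        # Backtracking line search for the step size alpha_k
        alpha, fx = 1.0, f(x)
        while f([x[i] + alpha * d[i] for i in range(n)]) > fx and alpha > 1e-12:
            alpha *= 0.5
        x_new = [x[i] + alpha * d[i] for i in range(n)]
        g_new = grad(x_new)
        s = [x_new[i] - x[i] for i in range(n)]           # step
        yv = [g_new[i] - g[i] for i in range(n)]          # gradient change
        sy = sum(si * yi for si, yi in zip(s, yv))
        if sy > 1e-12:  # curvature condition; skip the update otherwise
            Hy = [sum(H[i][j] * yv[j] for j in range(n)) for i in range(n)]
            yHy = sum(yv[i] * Hy[i] for i in range(n))
            # BFGS inverse-Hessian update
            for i in range(n):
                for j in range(n):
                    H[i][j] += ((sy + yHy) * s[i] * s[j]) / (sy * sy) \
                               - (Hy[i] * s[j] + s[i] * Hy[j]) / sy
        x, g = x_new, g_new
    return x
```

On a convex quadratic this sketch converges in a couple of iterations; in the paper's setting, f would be Ψ and grad its finite-difference approximation.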
DC - The Dice Coefficient
The Dice coefficient measures how close the segmentation result A is to the ground truth B:

  DC = 2|A ∩ B| / (|A| + |B|)

1 DC equals 1 in the best case (perfect segmentation)
2 DC equals 0 in the worst case (every pixel is misclassified)
FIGURE – The Dice Coefficient
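With the segmentation result A and the ground truth B viewed as sets of pixels assigned to a given class, the usual definition 2|A ∩ B| / (|A| + |B|) can be computed directly. A minimal sketch; the name `dice` and the set-of-pixels representation are our own:

```python
def dice(a, b):
    """Dice coefficient between two pixel sets: 2|A ∩ B| / (|A| + |B|)."""
    if not a and not b:
        return 1.0  # two empty masks agree perfectly
    return 2.0 * len(a & b) / (len(a) + len(b))
```

In the brain-segmentation tables that follow, the coefficient is computed per tissue class (GM, WM, CSF) and then averaged.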
BFGS in practice
1 We used the GNU Scientific Library (GSL) implementation of BFGS
2 To apply BFGS, we need at least the first derivative
3 In our case, computing the first derivative analytically is not obvious
4 We have used the central difference form to compute an approximation of the first derivative

Central difference form of the first derivative

  Ψ′(μ) = (∂Ψ/∂μ_1, ..., ∂Ψ/∂μ_K)
  ∂Ψ/∂μ_i = [Ψ(μ_1, ..., μ_i + ε, ..., μ_K) − Ψ(μ_1, ..., μ_i − ε, ..., μ_K)] / (2ε)

5 A good approximation of the first derivative relies on the choice of the value of the parameter ε
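The central-difference approximation above perturbs one coordinate at a time. A short Python sketch; the function name `central_diff_grad` is ours:

```python
def central_diff_grad(psi, mu, eps=0.01):
    """Approximate the gradient of psi at mu, one coordinate at a time:
    dPsi/dmu_i ~= (psi(mu + eps*e_i) - psi(mu - eps*e_i)) / (2*eps)."""
    grad = []
    for i in range(len(mu)):
        up = list(mu); up[i] += eps
        dn = list(mu); dn[i] -= eps
        grad.append((psi(up) - psi(dn)) / (2.0 * eps))
    return grad
```

Each gradient evaluation therefore costs 2K evaluations of Ψ, which is why the choice of ε matters: too large and the approximation is biased, too small and it is dominated by round-off.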
BFGS in practice - Results
1 Through the numerous tests conducted, we have selected ε = 0.01 as the best value for a good approximation of the first derivative
2 We have tested the two functions Ψ₁ and Ψ₂; Ψ₁ treats all points outside [0...255] in the same way
3 In practice, Ψ₁ and Ψ₂ give nearly the same results
HMRF-BFGS vs Classical MRF, MRF-ACO-Gossiping & MRF-ACO

Methods            | Dice coefficient: GM | WM    | CSF   | Mean
Classical-MRF      | 0.763                | 0.723 | 0.780 | 0.756
MRF-ACO            | 0.770                | 0.729 | 0.785 | 0.762
MRF-ACO-Gossiping  | 0.770                | 0.729 | 0.786 | 0.762
HMRF-BFGS          | 0.974                | 0.991 | 0.960 | 0.975
Results with (N = Noise, I = Intensity non-uniformity)

(N, I)       | Initial point | Dice coefficient: GM | WM    | CSF   | Mean  | Time (s)
(0 %, 0 %)   | μ_{0,1}       | 0.974                | 0.991 | 0.960 | 0.975 | 27.544
(2 %, 20 %)  | μ_{0,2}       | 0.942                | 0.969 | 0.939 | 0.950 | 15.630
(5 %, 20 %)  | μ_{0,3}       | 0.919                | 0.952 | 0.920 | 0.930 | 84.967
Example of segmentation using HMRF-BFGS
FIGURE – Segmentations at (N, I) = (0 %, 0 %), (3 %, 20 %) and (5 %, 20 %)
Conclusion & Perspective
1 We have presented a combined HMRF-BFGS method
2 Through the tests conducted:
1 We have figured out good parameter settings
2 The HMRF-BFGS method shows good results
3 We conclude that the HMRF-BFGS method is very promising
4 Nevertheless, the opinion of specialists must be considered in the evaluation
Thank you
for your attention
19 / 19

More Related Content

What's hot

Gate 2013 complete solutions of ec electronics and communication engineering
Gate 2013 complete solutions of ec  electronics and communication engineeringGate 2013 complete solutions of ec  electronics and communication engineering
Gate 2013 complete solutions of ec electronics and communication engineering
manish katara
 
Multicasting in Linear Deterministic Relay Network by Matrix Completion
Multicasting in Linear Deterministic Relay Network by Matrix CompletionMulticasting in Linear Deterministic Relay Network by Matrix Completion
Multicasting in Linear Deterministic Relay Network by Matrix Completion
Tasuku Soma
 
Foreground Detection : Combining Background Subspace Learning with Object Smo...
Foreground Detection : Combining Background Subspace Learning with Object Smo...Foreground Detection : Combining Background Subspace Learning with Object Smo...
Foreground Detection : Combining Background Subspace Learning with Object Smo...
Shanghai Jiao Tong University(上海交通大学)
 
ARIC Team Seminar
ARIC Team SeminarARIC Team Seminar
ARIC Team Seminar
Benoit Lopez
 
Grds international conference on pure and applied science (5)
Grds international conference on pure and applied science (5)Grds international conference on pure and applied science (5)
Grds international conference on pure and applied science (5)
Global R & D Services
 
FPGA based BCH Decoder
FPGA based BCH DecoderFPGA based BCH Decoder
FPGA based BCH Decoder
ijsrd.com
 
Lt2419681970
Lt2419681970Lt2419681970
Lt2419681970
IJERA Editor
 
Unit 6.3
Unit 6.3Unit 6.3
Unit 6.3
Mark Ryder
 
CVGIP_Chia-Pin Tseng
CVGIP_Chia-Pin TsengCVGIP_Chia-Pin Tseng
CVGIP_Chia-Pin Tseng
Chia-Pin Tseng
 
Matrix part 3.2 (1)
Matrix part 3.2 (1)Matrix part 3.2 (1)
Matrix part 3.2 (1)
Ghanshyam Tewani
 
Linear Cryptanalysis Lecture 線形解読法
Linear Cryptanalysis Lecture 線形解読法Linear Cryptanalysis Lecture 線形解読法
Linear Cryptanalysis Lecture 線形解読法
Kai Katsumata
 
Spsp fw
Spsp fwSpsp fw
Spsp fw
sanzeeb123
 
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
SSA KPI
 
Mathketball
MathketballMathketball
Show ant-colony-optimization-for-solving-the-traveling-salesman-problem
Show ant-colony-optimization-for-solving-the-traveling-salesman-problemShow ant-colony-optimization-for-solving-the-traveling-salesman-problem
Show ant-colony-optimization-for-solving-the-traveling-salesman-problem
jayatra
 
Goldberg-Coxeter construction for 3- or 4-valent plane maps
Goldberg-Coxeter construction for 3- or 4-valent plane mapsGoldberg-Coxeter construction for 3- or 4-valent plane maps
Goldberg-Coxeter construction for 3- or 4-valent plane maps
Mathieu Dutour Sikiric
 
Lecture 2: Stochastic Hydrology
Lecture 2: Stochastic Hydrology Lecture 2: Stochastic Hydrology
Lecture 2: Stochastic Hydrology
Amro Elfeki
 

What's hot (17)

Gate 2013 complete solutions of ec electronics and communication engineering
Gate 2013 complete solutions of ec  electronics and communication engineeringGate 2013 complete solutions of ec  electronics and communication engineering
Gate 2013 complete solutions of ec electronics and communication engineering
 
Multicasting in Linear Deterministic Relay Network by Matrix Completion
Multicasting in Linear Deterministic Relay Network by Matrix CompletionMulticasting in Linear Deterministic Relay Network by Matrix Completion
Multicasting in Linear Deterministic Relay Network by Matrix Completion
 
Foreground Detection : Combining Background Subspace Learning with Object Smo...
Foreground Detection : Combining Background Subspace Learning with Object Smo...Foreground Detection : Combining Background Subspace Learning with Object Smo...
Foreground Detection : Combining Background Subspace Learning with Object Smo...
 
ARIC Team Seminar
ARIC Team SeminarARIC Team Seminar
ARIC Team Seminar
 
Grds international conference on pure and applied science (5)
Grds international conference on pure and applied science (5)Grds international conference on pure and applied science (5)
Grds international conference on pure and applied science (5)
 
FPGA based BCH Decoder
FPGA based BCH DecoderFPGA based BCH Decoder
FPGA based BCH Decoder
 
Lt2419681970
Lt2419681970Lt2419681970
Lt2419681970
 
Unit 6.3
Unit 6.3Unit 6.3
Unit 6.3
 
CVGIP_Chia-Pin Tseng
CVGIP_Chia-Pin TsengCVGIP_Chia-Pin Tseng
CVGIP_Chia-Pin Tseng
 
Matrix part 3.2 (1)
Matrix part 3.2 (1)Matrix part 3.2 (1)
Matrix part 3.2 (1)
 
Linear Cryptanalysis Lecture 線形解読法
Linear Cryptanalysis Lecture 線形解読法Linear Cryptanalysis Lecture 線形解読法
Linear Cryptanalysis Lecture 線形解読法
 
Spsp fw
Spsp fwSpsp fw
Spsp fw
 
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
 
Mathketball
MathketballMathketball
Mathketball
 
Show ant-colony-optimization-for-solving-the-traveling-salesman-problem
Show ant-colony-optimization-for-solving-the-traveling-salesman-problemShow ant-colony-optimization-for-solving-the-traveling-salesman-problem
Show ant-colony-optimization-for-solving-the-traveling-salesman-problem
 
Goldberg-Coxeter construction for 3- or 4-valent plane maps
Goldberg-Coxeter construction for 3- or 4-valent plane mapsGoldberg-Coxeter construction for 3- or 4-valent plane maps
Goldberg-Coxeter construction for 3- or 4-valent plane maps
 
Lecture 2: Stochastic Hydrology
Lecture 2: Stochastic Hydrology Lecture 2: Stochastic Hydrology
Lecture 2: Stochastic Hydrology
 

Similar to Hidden Markov Random Field model and BFGS algorithm for Brain Image Segmentation

A Fast Hadamard Transform for Signals with Sub-linear Sparsity
A Fast Hadamard Transform for Signals with Sub-linear SparsityA Fast Hadamard Transform for Signals with Sub-linear Sparsity
A Fast Hadamard Transform for Signals with Sub-linear Sparsity
Robin Scheibler
 
Data sparse approximation of Karhunen-Loeve Expansion
Data sparse approximation of Karhunen-Loeve ExpansionData sparse approximation of Karhunen-Loeve Expansion
Data sparse approximation of Karhunen-Loeve Expansion
Alexander Litvinenko
 
Slides
SlidesSlides
Data sparse approximation of the Karhunen-Loeve expansion
Data sparse approximation of the Karhunen-Loeve expansionData sparse approximation of the Karhunen-Loeve expansion
Data sparse approximation of the Karhunen-Loeve expansion
Alexander Litvinenko
 
Delayed acceptance for Metropolis-Hastings algorithms
Delayed acceptance for Metropolis-Hastings algorithmsDelayed acceptance for Metropolis-Hastings algorithms
Delayed acceptance for Metropolis-Hastings algorithms
Christian Robert
 
計算材料学
計算材料学計算材料学
Conjugate Gradient method for Brain Magnetic Resonance Images Segmentation
Conjugate Gradient method for Brain Magnetic Resonance Images SegmentationConjugate Gradient method for Brain Magnetic Resonance Images Segmentation
Conjugate Gradient method for Brain Magnetic Resonance Images Segmentation
EL-Hachemi Guerrout
 
k-MLE: A fast algorithm for learning statistical mixture models
k-MLE: A fast algorithm for learning statistical mixture modelsk-MLE: A fast algorithm for learning statistical mixture models
k-MLE: A fast algorithm for learning statistical mixture models
Frank Nielsen
 
Strong convergence of an algorithm about strongly quasi nonexpansive mappings
Strong convergence of an algorithm about strongly quasi nonexpansive mappingsStrong convergence of an algorithm about strongly quasi nonexpansive mappings
Strong convergence of an algorithm about strongly quasi nonexpansive mappings
Alexander Decker
 
An investigation of inference of the generalized extreme value distribution b...
An investigation of inference of the generalized extreme value distribution b...An investigation of inference of the generalized extreme value distribution b...
An investigation of inference of the generalized extreme value distribution b...
Alexander Decker
 
Recursive Compressed Sensing
Recursive Compressed SensingRecursive Compressed Sensing
Recursive Compressed Sensing
Pantelis Sopasakis
 
Применение машинного обучения для навигации и управления роботами
Применение машинного обучения для навигации и управления роботамиПрименение машинного обучения для навигации и управления роботами
Применение машинного обучения для навигации и управления роботами
Skolkovo Robotics Center
 
Tensor Train data format for uncertainty quantification
Tensor Train data format for uncertainty quantificationTensor Train data format for uncertainty quantification
Tensor Train data format for uncertainty quantification
Alexander Litvinenko
 
Knowledge extraction from support vector machines
Knowledge extraction from support vector machinesKnowledge extraction from support vector machines
Knowledge extraction from support vector machines
Eyad Alshami
 
Paris Lecture 4: Practical issues in Bayesian modeling
Paris Lecture 4: Practical issues in Bayesian modelingParis Lecture 4: Practical issues in Bayesian modeling
Paris Lecture 4: Practical issues in Bayesian modeling
Shravan Vasishth
 
Current limitations of sequential inference in general hidden Markov models
Current limitations of sequential inference in general hidden Markov modelsCurrent limitations of sequential inference in general hidden Markov models
Current limitations of sequential inference in general hidden Markov models
Pierre Jacob
 
Presentation_Tan
Presentation_TanPresentation_Tan
Presentation_Tan
Zhangyun Tan
 
Vladimir Milov and Andrey Savchenko - Classification of Dangerous Situations...
Vladimir Milov and  Andrey Savchenko - Classification of Dangerous Situations...Vladimir Milov and  Andrey Savchenko - Classification of Dangerous Situations...
Vladimir Milov and Andrey Savchenko - Classification of Dangerous Situations...
AIST
 
AOT2 Single Variable Optimization Algorithms.pdf
AOT2 Single Variable Optimization Algorithms.pdfAOT2 Single Variable Optimization Algorithms.pdf
AOT2 Single Variable Optimization Algorithms.pdf
SandipBarik8
 
Bayesian inference on mixtures
Bayesian inference on mixturesBayesian inference on mixtures
Bayesian inference on mixtures
Christian Robert
 

Similar to Hidden Markov Random Field model and BFGS algorithm for Brain Image Segmentation (20)

A Fast Hadamard Transform for Signals with Sub-linear Sparsity
A Fast Hadamard Transform for Signals with Sub-linear SparsityA Fast Hadamard Transform for Signals with Sub-linear Sparsity
A Fast Hadamard Transform for Signals with Sub-linear Sparsity
 
Data sparse approximation of Karhunen-Loeve Expansion
Data sparse approximation of Karhunen-Loeve ExpansionData sparse approximation of Karhunen-Loeve Expansion
Data sparse approximation of Karhunen-Loeve Expansion
 
Slides
SlidesSlides
Slides
 
Data sparse approximation of the Karhunen-Loeve expansion
Data sparse approximation of the Karhunen-Loeve expansionData sparse approximation of the Karhunen-Loeve expansion
Data sparse approximation of the Karhunen-Loeve expansion
 
Delayed acceptance for Metropolis-Hastings algorithms
Delayed acceptance for Metropolis-Hastings algorithmsDelayed acceptance for Metropolis-Hastings algorithms
Delayed acceptance for Metropolis-Hastings algorithms
 
計算材料学
計算材料学計算材料学
計算材料学
 
Conjugate Gradient method for Brain Magnetic Resonance Images Segmentation
Conjugate Gradient method for Brain Magnetic Resonance Images SegmentationConjugate Gradient method for Brain Magnetic Resonance Images Segmentation
Conjugate Gradient method for Brain Magnetic Resonance Images Segmentation
 
k-MLE: A fast algorithm for learning statistical mixture models
k-MLE: A fast algorithm for learning statistical mixture modelsk-MLE: A fast algorithm for learning statistical mixture models
k-MLE: A fast algorithm for learning statistical mixture models
 
Strong convergence of an algorithm about strongly quasi nonexpansive mappings
Strong convergence of an algorithm about strongly quasi nonexpansive mappingsStrong convergence of an algorithm about strongly quasi nonexpansive mappings
Strong convergence of an algorithm about strongly quasi nonexpansive mappings
 
An investigation of inference of the generalized extreme value distribution b...
An investigation of inference of the generalized extreme value distribution b...An investigation of inference of the generalized extreme value distribution b...
An investigation of inference of the generalized extreme value distribution b...
 
Recursive Compressed Sensing
Recursive Compressed SensingRecursive Compressed Sensing
Recursive Compressed Sensing
 
Применение машинного обучения для навигации и управления роботами
Применение машинного обучения для навигации и управления роботамиПрименение машинного обучения для навигации и управления роботами
Применение машинного обучения для навигации и управления роботами
 
Tensor Train data format for uncertainty quantification
Tensor Train data format for uncertainty quantificationTensor Train data format for uncertainty quantification
Tensor Train data format for uncertainty quantification
 
Knowledge extraction from support vector machines
Knowledge extraction from support vector machinesKnowledge extraction from support vector machines
Knowledge extraction from support vector machines
 
Paris Lecture 4: Practical issues in Bayesian modeling
Paris Lecture 4: Practical issues in Bayesian modelingParis Lecture 4: Practical issues in Bayesian modeling
Paris Lecture 4: Practical issues in Bayesian modeling
 
Current limitations of sequential inference in general hidden Markov models
Current limitations of sequential inference in general hidden Markov modelsCurrent limitations of sequential inference in general hidden Markov models
Current limitations of sequential inference in general hidden Markov models
 
Presentation_Tan
Presentation_TanPresentation_Tan
Presentation_Tan
 
Vladimir Milov and Andrey Savchenko - Classification of Dangerous Situations...
Vladimir Milov and  Andrey Savchenko - Classification of Dangerous Situations...Vladimir Milov and  Andrey Savchenko - Classification of Dangerous Situations...
Vladimir Milov and Andrey Savchenko - Classification of Dangerous Situations...
 
AOT2 Single Variable Optimization Algorithms.pdf
AOT2 Single Variable Optimization Algorithms.pdfAOT2 Single Variable Optimization Algorithms.pdf
AOT2 Single Variable Optimization Algorithms.pdf
 
Bayesian inference on mixtures
Bayesian inference on mixturesBayesian inference on mixtures
Bayesian inference on mixtures
 

Recently uploaded

Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdfTopic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
TinyAnderson
 
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốtmô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
HongcNguyn6
 
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
Abdul Wali Khan University Mardan,kP,Pakistan
 
bordetella pertussis.................................ppt
bordetella pertussis.................................pptbordetella pertussis.................................ppt
bordetella pertussis.................................ppt
kejapriya1
 
3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
3D Hybrid PIC simulation of the plasma expansion (ISSS-14)3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
David Osipyan
 
Eukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptxEukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptx
RitabrataSarkar3
 
Bob Reedy - Nitrate in Texas Groundwater.pdf
Bob Reedy - Nitrate in Texas Groundwater.pdfBob Reedy - Nitrate in Texas Groundwater.pdf
Bob Reedy - Nitrate in Texas Groundwater.pdf
Texas Alliance of Groundwater Districts
 
NuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyerNuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyer
pablovgd
 
The binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defectsThe binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defects
Sérgio Sacani
 
Orion Air Quality Monitoring Systems - CWS
Orion Air Quality Monitoring Systems - CWSOrion Air Quality Monitoring Systems - CWS
Orion Air Quality Monitoring Systems - CWS
Columbia Weather Systems
 
20240520 Planning a Circuit Simulator in JavaScript.pptx
20240520 Planning a Circuit Simulator in JavaScript.pptx20240520 Planning a Circuit Simulator in JavaScript.pptx
20240520 Planning a Circuit Simulator in JavaScript.pptx
Sharon Liu
 
Deep Software Variability and Frictionless Reproducibility
Deep Software Variability and Frictionless ReproducibilityDeep Software Variability and Frictionless Reproducibility
Deep Software Variability and Frictionless Reproducibility
University of Rennes, INSA Rennes, Inria/IRISA, CNRS
 
The debris of the ‘last major merger’ is dynamically young
The debris of the ‘last major merger’ is dynamically youngThe debris of the ‘last major merger’ is dynamically young
The debris of the ‘last major merger’ is dynamically young
Sérgio Sacani
 
Shallowest Oil Discovery of Turkiye.pptx
Shallowest Oil Discovery of Turkiye.pptxShallowest Oil Discovery of Turkiye.pptx
Shallowest Oil Discovery of Turkiye.pptx
Gokturk Mehmet Dilci
 
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptxANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
ANAMOLOUS SECONDARY GROWTH IN DICOT ROOTS.pptx
RASHMI M G
 
Medical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptxMedical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptx
terusbelajar5
 
Nucleic Acid-its structural and functional complexity.
Nucleic Acid-its structural and functional complexity.Nucleic Acid-its structural and functional complexity.
Nucleic Acid-its structural and functional complexity.
Nistarini College, Purulia (W.B) India
 
aziz sancar nobel prize winner: from mardin to nobel
aziz sancar nobel prize winner: from mardin to nobelaziz sancar nobel prize winner: from mardin to nobel
aziz sancar nobel prize winner: from mardin to nobel
İsa Badur
 
What is greenhouse gasses and how many gasses are there to affect the Earth.
What is greenhouse gasses and how many gasses are there to affect the Earth.What is greenhouse gasses and how many gasses are there to affect the Earth.
What is greenhouse gasses and how many gasses are there to affect the Earth.
moosaasad1975
 
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...
Wasswaderrick3
 

Recently uploaded (20)

Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdfTopic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
 
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốtmô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
 
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
 
bordetella pertussis.................................ppt
bordetella pertussis.................................pptbordetella pertussis.................................ppt
bordetella pertussis.................................ppt
 
3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
3D Hybrid PIC simulation of the plasma expansion (ISSS-14)3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
3D Hybrid PIC simulation of the plasma expansion (ISSS-14)
 
Eukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptxEukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptx
 
Bob Reedy - Nitrate in Texas Groundwater.pdf
Bob Reedy - Nitrate in Texas Groundwater.pdfBob Reedy - Nitrate in Texas Groundwater.pdf
Bob Reedy - Nitrate in Texas Groundwater.pdf
 
NuGOweek 2024 Ghent programme overview flyer

Hidden Markov Random Field model and BFGS algorithm for Brain Image Segmentation

Introduction Hidden Markov Random Field BFGS (Broyden, Fletcher, Goldfarb and Shanno) algorithm Experimental Results Conclusion & Pers
1st Mediterranean Conference on Pattern Recognition and Artificial Intelligence
EL-Hachemi Guerrout, Samy Ait-Aoudia, Dominique Michelucci, Ramdane Mahiou
Hidden Markov Random Field model and BFGS algorithm for Brain Image Segmentation
LMCS Laboratory, ESI, Algeria & LE2I Laboratory, UB, France
1 / 19
1 Introduction
2 Hidden Markov Random Field
3 BFGS (Broyden, Fletcher, Goldfarb and Shanno) algorithm
4 Experimental Results
5 Conclusion & Perspective
2 / 19
Problem & Solution
1 Nowadays, we face a huge number of medical images
2 Manual analysis and interpretation has become a tedious task
3 Automatic image analysis and interpretation is a necessity
4 To simplify the representation of an image into meaningful items that are easier to analyze, we need a segmentation method
3 / 19
Segmentation methods
Segmentation methods can be classified into four main categories:
1 Threshold-based methods
2 Region-based methods
3 Model-based methods
4 Classification methods
  1 HMRF - Hidden Markov Random Field
  2 etc.
We have chosen HMRF as the model to perform segmentation
4 / 19
Hidden Markov Random Field
1 HMRF provides an elegant way to model the segmentation problem
2 HMRF is a generalization of the Hidden Markov Model
3 Each pixel is seen as a realization of a Markov random variable
4 Each image is seen as a realization of a family of Markov random variables
5 / 19
Hidden Markov Random Field
The image to segment into K classes, y = {y_s}_{s∈S}, is a realization of Y
1 Y = {Y_s}_{s∈S} is a family of random variables
2 y_s ∈ [0...255]
The image segmented into K classes, x = {x_s}_{s∈S}, is a realization of X
1 X = {X_s}_{s∈S} is a family of random variables
2 x_s ∈ {1,...,K}
An example of segmentation into K = 4 classes
The goal of HMRF is to look for x*:
    x* = arg max_{x∈Ω} P[X = x | Y = y]
6 / 19
Hidden Markov Random Field
1 This elegant model leads to the optimization of an energy function:
    Ψ(x, y) = ∑_{s∈S} [ln(σ_{x_s}) + (y_s − µ_{x_s})² / (2σ²_{x_s})] + (β/T) ∑_{c₂={s,t}} (1 − 2δ(x_s, x_t))
2 Our way to minimize Ψ(x, y) is to minimize Ψ(µ), where µ = (µ_1,...,µ_K) and µ_i is the mean of the gray values of class i
3 The main idea is to focus on adjusting the means instead of adjusting the pixels
7 / 19
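The energy above can be sketched for a 2-D image with 4-neighbour pairwise cliques. This is a minimal illustration, not the authors' code; the function name `hmrf_energy` and the default β = T = 1 are assumptions made only for the sketch.

```python
import numpy as np

def hmrf_energy(x, y, mu, sigma, beta=1.0, T=1.0):
    """Energy Psi(x, y) for a 2-D label image x and grey-level image y.

    x     : int array of labels in {0, ..., K-1}
    y     : float array of grey values, same shape as x
    mu    : per-class means, length K
    sigma : per-class standard deviations, length K
    """
    mu_x = mu[x]            # mean of the class assigned to each pixel
    sig_x = sigma[x]        # std of the class assigned to each pixel
    # Data term: sum_s [ ln(sigma_{x_s}) + (y_s - mu_{x_s})^2 / (2 sigma_{x_s}^2) ]
    data = np.sum(np.log(sig_x) + (y - mu_x) ** 2 / (2.0 * sig_x ** 2))
    # Prior term over horizontal and vertical pixel pairs: (beta/T) * sum (1 - 2*delta(x_s, x_t))
    pairs_h = 1.0 - 2.0 * (x[:, :-1] == x[:, 1:])
    pairs_v = 1.0 - 2.0 * (x[:-1, :] == x[1:, :])
    prior = (beta / T) * (pairs_h.sum() + pairs_v.sum())
    return data + prior
```

A labeling that matches the grey values and is spatially smooth gets a lower energy than a mismatched one, which is what the minimization exploits.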
Hidden Markov Random Field
1 Now, we seek µ*:
    µ* = arg min_{µ∈[0...255]^K} Ψ(µ)
    Ψ(µ) = ∑_{j=1}^{K} f(µ_j)
    f(µ_j) = ∑_{s∈S_j} [ln(σ_j) + (y_s − µ_j)² / (2σ_j²)] + (β/T) ∑_{c₂={s,t}} (1 − 2δ(x_s, x_t))
2 To apply optimization techniques, we redefine the function Ψ(µ) for µ ∈ R^K instead of µ ∈ [0...255]^K. For that, we distinguish two forms, Ψ₁(µ) and Ψ₂(µ).
8 / 19
Hidden Markov Random Field
Form 1
    Ψ₁(µ) = Ψ(µ)  if µ ∈ [0...255]^K
          = +∞    otherwise
Ψ₁ treats all points outside [0...255]^K in the same way
Form 2
    Ψ₂(µ) = ∑_{j=1}^{K} F(µ_j), where µ_j ∈ R
    F(µ_j) = f(0) − µ_j             if µ_j < 0
           = f(µ_j)                 if µ_j ∈ [0...255]
           = f(255) + (µ_j − 255)   if µ_j > 255
9 / 19
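Form 2's linear extension of f outside [0...255] can be sketched as follows. The quadratic `f` below is a hypothetical stand-in for the per-class energy term, chosen only so the sketch runs; it is not the paper's f.

```python
def f(mu_j):
    # Hypothetical per-class energy, for illustration only.
    return (mu_j - 128.0) ** 2

def F(mu_j):
    """Linear extension of f outside [0, 255], as in Form 2."""
    if mu_j < 0:
        return f(0.0) - mu_j              # grows linearly as mu_j moves below 0
    if mu_j > 255:
        return f(255.0) + (mu_j - 255.0)  # grows linearly as mu_j moves above 255
    return f(mu_j)

def Psi2(mu):
    # Psi_2(mu) = sum_j F(mu_j)
    return sum(F(m) for m in mu)
```

Unlike Ψ₁'s flat +∞ outside the box, F still slopes back toward [0...255], which gives the optimizer a usable descent direction from any starting point.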
BFGS (Broyden, Fletcher, Goldfarb and Shanno) algorithm
1 BFGS is one of the most powerful methods for solving unconstrained optimization problems
2 BFGS is the most popular quasi-Newton method
3 BFGS is based on gradient descent to reach a local minimum
4 The main idea of gradient descent is:
  1 We start from an initial point µ_0
  2 At iteration k+1, the point µ_{k+1} is calculated from the point µ_k according to the following formula:
        µ_{k+1} = µ_k + α_k d_k
    - α_k is the step size at iteration k
    - d_k is the search direction at iteration k
10 / 19
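The update rule above can be sketched with steepest descent and a fixed step size. This is a toy illustration only; the fixed α and iteration count are assumptions, not the authors' settings.

```python
import numpy as np

def gradient_descent(grad, mu0, alpha=0.1, n_iter=100):
    """Iterate mu_{k+1} = mu_k + alpha_k * d_k with d_k = -grad(mu_k)
    and a constant step size alpha_k = alpha."""
    mu = np.asarray(mu0, dtype=float)
    for _ in range(n_iter):
        d = -grad(mu)        # steepest-descent search direction
        mu = mu + alpha * d  # the update formula from the slide
    return mu
```

BFGS keeps the same update but replaces −grad(µ_k) with a direction shaped by an inverse-Hessian approximation, and picks α_k by line search.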
BFGS (Broyden, Fletcher, Goldfarb and Shanno) algorithm
Summary of the BFGS algorithm
1 Initialization:
    Set k := 0, choose µ_0 close to the solution
    Set H_0 := I, set α_0 = 1
    Choose the required accuracy ε ∈ R, ε > 0
2 At iteration k:
    Compute the Hessian matrix approximation H_k
    Compute the inverse of the Hessian matrix
    Compute the search direction d_k
    Compute the step size α_k
    Compute the point µ_{k+1}
3 The stopping criterion: if ‖Ψ′(µ_k)‖ < ε then µ̂ := µ_k
4 k := k + 1
11 / 19
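The slides use the GNU Scientific Library's C implementation; purely as an illustration, the summarized steps (direction from an inverse-Hessian approximation, step size by line search, gradient-norm stopping test) can be sketched as follows. This is a minimal sketch with a naive backtracking line search, not GSL's algorithm.

```python
import numpy as np

def bfgs(psi, grad, mu0, eps=1e-6, max_iter=100):
    """Minimal BFGS sketch: B approximates the INVERSE Hessian (B_0 := I)."""
    mu = np.asarray(mu0, dtype=float)
    n = mu.size
    B = np.eye(n)
    g = grad(mu)
    for _ in range(max_iter):
        if np.linalg.norm(g) < eps:      # stopping criterion ||Psi'(mu_k)|| < eps
            break
        d = -B @ g                       # search direction d_k
        alpha = 1.0                      # crude backtracking for the step size alpha_k
        while psi(mu + alpha * d) > psi(mu) and alpha > 1e-10:
            alpha *= 0.5
        s = alpha * d                    # s_k = mu_{k+1} - mu_k
        mu_new = mu + s
        g_new = grad(mu_new)
        yv = g_new - g                   # y_k = gradient difference
        sy = yv @ s
        if sy > 1e-12:                   # curvature condition; skip the update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            # BFGS inverse-Hessian update:
            # B <- (I - rho s y^T) B (I - rho y s^T) + rho s s^T
            B = (I - rho * np.outer(s, yv)) @ B @ (I - rho * np.outer(yv, s)) \
                + rho * np.outer(s, s)
        mu, g = mu_new, g_new
    return mu
```

On a quadratic, the inverse-Hessian approximation becomes exact along the explored directions, which is why BFGS converges much faster than plain gradient descent.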
DC - The Dice Coefficient
The Dice coefficient measures how close the segmentation result is to the ground truth:
    DC = 2|A ∩ B| / (|A| + |B|)
1 DC equals 1 in the best case (perfect segmentation)
2 DC equals 0 in the worst case (every pixel is misclassified)
FIGURE – The Dice Coefficient
12 / 19
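A direct sketch of the Dice coefficient on binary masks (returning 1.0 when both masks are empty is a convention assumed here, not stated in the slides):

```python
import numpy as np

def dice(seg, truth):
    """Dice coefficient between two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    seg = np.asarray(seg, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    denom = seg.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(seg, truth).sum() / denom
```

For a multi-class segmentation, the coefficient is computed per class (GM, WM, CSF) by binarizing each class against the rest, as in the result tables below.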
BFGS in practice
1 We used the GNU Scientific Library implementation of BFGS
2 To apply BFGS, we need at least the first derivative
3 In our case, computing the first derivative analytically is not obvious
4 We have used the central-difference form to compute an approximation of the first derivative
Central-difference form of the first derivative:
    Ψ′(µ) = (∂Ψ/∂µ_1, ..., ∂Ψ/∂µ_K)
    ∂Ψ/∂µ_i ≈ [Ψ(µ_1,...,µ_i + ε,...,µ_K) − Ψ(µ_1,...,µ_i − ε,...,µ_K)] / (2ε)
5 A good approximation of the first derivative relies on the choice of the value of the parameter ε
13 / 19
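The central-difference formula can be sketched directly; note it costs one pair of Ψ evaluations per component, i.e. 2K evaluations per gradient.

```python
import numpy as np

def central_gradient(psi, mu, eps=0.01):
    """Central-difference approximation of the gradient of psi at mu,
    component by component: (psi(mu + eps*e_i) - psi(mu - eps*e_i)) / (2*eps)."""
    mu = np.asarray(mu, dtype=float)
    g = np.empty_like(mu)
    for i in range(mu.size):
        e = np.zeros_like(mu)
        e[i] = eps
        g[i] = (psi(mu + e) - psi(mu - e)) / (2.0 * eps)
    return g
```

The central form has O(ε²) truncation error versus O(ε) for the one-sided form, which is why the choice of ε (0.01 in the slides' experiments) matters less here than it would for forward differences.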
BFGS in practice - Results
1 Through the numerous tests conducted, we have selected ε = 0.01 as the best value for a good approximation of the first derivative
2 We have tested the two functions Ψ₁ and Ψ₂; Ψ₁ treats all points outside [0...255]^K in the same way
3 In practice, Ψ₁ and Ψ₂ give nearly the same results
14 / 19
HMRF-BFGS VS Classical MRF, MRF-ACO-Gossiping & MRF-ACO

Methods             | Dice coefficient
                    | GM    | WM    | CSF   | Mean
Classical-MRF       | 0.763 | 0.723 | 0.780 | 0.756
MRF-ACO             | 0.770 | 0.729 | 0.785 | 0.762
MRF-ACO-Gossiping   | 0.770 | 0.729 | 0.786 | 0.762
HMRF-BFGS           | 0.974 | 0.991 | 0.960 | 0.975
15 / 19
Results with (N-Noise, I-Intensity non-uniformity)

(N, I)       | Initial | Dice coefficient              | Time(s)
             | point   | GM    | WM    | CSF   | Mean  |
(0 %, 0 %)   | µ_{0,1} | 0.974 | 0.991 | 0.960 | 0.975 | 27.544
(2 %, 20 %)  | µ_{0,2} | 0.942 | 0.969 | 0.939 | 0.950 | 15.630
(5 %, 20 %)  | µ_{0,3} | 0.919 | 0.952 | 0.920 | 0.930 | 84.967
16 / 19
Example of segmentation using HMRF-BFGS
[Figure: segmentation results for (0%, 0%), (3%, 20%) and (5%, 20%) noise/intensity non-uniformity]
17 / 19
Conclusion & Perspective
1 We have presented the combined HMRF-BFGS method
2 Through the tests conducted:
  1 We have figured out good parameter settings
  2 The HMRF-BFGS method shows good results
3 We conclude that the HMRF-BFGS method is very promising
4 Nevertheless, the opinion of specialists must be considered in the evaluation
18 / 19
Thank you for your attention
19 / 19