Blind Source Separation based on Improved Particle Swarm Optimization
Ashish Kumar Meshram
INDIAN INSTITUTE OF TECHNOLOGY, INDORE
Content
Problem Definition - Blind Source Separation (BSS)
Independent Component Analysis (ICA)
Particle Swarm Optimization (PSO)
Improved PSO with Dynamic Inertia Weight
Fitness Function
Algorithm for BSS based on PSO
Results & Conclusion
Problem Definition - Blind Source Separation
Signal flow: Source $s(t)$ → Mixing $A \in \mathbb{R}^{M \times M}$ → Observation $x(t) = A\,s(t)$ → Estimation $y(t) = W\,x(t)$
Independent Component Analysis
A common model for Blind Source Separation (BSS) can be written as
$$x(t) = A\,s(t)$$
where $A$ is a nonsingular $N \times N$ mixing matrix,
$s(t) = [s_1(t), s_2(t), \ldots, s_N(t)]^T$ is the source vector, and
$x(t) = [x_1(t), x_2(t), \ldots, x_N(t)]^T$ is the observation vector.
The basic problem of Independent Component Analysis (ICA) is to find an $N \times N$ separation matrix
$$W = [w_1, w_2, \ldots, w_N]^T,$$
without any prior knowledge of $s(t)$ or $A$, such that
$$y(t) = W\,x(t)$$
is an estimate of $s(t)$.
Solution Procedure:
1. Remove Mean
2. Whitening
3. Find an orthogonal W, optimizing an objective function
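As a toy illustration of this mixing model (not taken from the slides; the sources, mixing matrix, and sizes are arbitrary assumptions), the following Python snippet builds observations $x(t) = A\,s(t)$ that the PSO-based search described next would try to unmix:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)

# Two synthetic, statistically independent sources s(t)
s = np.vstack([np.sin(2 * np.pi * 5 * t),            # sinusoid
               np.sign(np.sin(2 * np.pi * 3 * t))])  # square wave

# Random (almost surely nonsingular) 2x2 mixing matrix A and observations x(t) = A s(t)
A = rng.uniform(-1, 1, size=(2, 2))
x = A @ s

# BSS/ICA seeks W such that y(t) = W x(t) recovers s(t) up to scale and permutation
```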
Particle Swarm Optimization
The velocity and position of each particle at the next iteration are calculated according to the following equations:
$$v_{i,j}(t+1) = w\,v_{i,j}(t) + c_1 r_{1,j}(t)\big(pbest_{i,j}(t) - x_{i,j}(t)\big) + c_2 r_{2,j}(t)\big(gbest_j(t) - x_{i,j}(t)\big) \quad (1)$$
$$x_i(t+1) = x_i(t) + v_i(t+1) \quad (2)$$
where $c_1$ and $c_2$ are the acceleration coefficients (the cognitive and social parameters, respectively) and $r_{1,j}(t)$, $r_{2,j}(t)$ are random numbers drawn uniformly from $[0, 1]$, which give the algorithm its stochastic behaviour.
During the search process, each particle successively adjusts its position toward the global optimum according to two factors:
1. The best position encountered by the particle itself ($pbest$): $p_i = [p_{i,1}, p_{i,2}, \ldots, p_{i,D}]$
2. The best position encountered by the whole swarm ($gbest$): $p_g = [p_{g,1}, p_{g,2}, \ldots, p_{g,D}]$
Algorithm PSO
For each particle
    Initialize position and velocity
End
Do
    For each particle
        Calculate fitness value
        If the fitness value is better than the particle's best fitness value (pbest) so far
            Set the current position as the new pbest
    End
    Choose the particle with the best fitness value among all particles as the gbest
    For each particle
        Calculate particle velocity according to Eq. (1)
        Update particle position according to Eq. (2)
    End
While maximum iterations or minimum error criterion is not met
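To make the pseudocode concrete, here is a minimal, self-contained Python sketch of standard PSO for maximizing a generic fitness function. It is an illustration only; the function names, bounds, and parameter values are assumptions, not taken from the slides.

```python
import numpy as np

def pso(fitness, dim, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Maximize `fitness` over R^dim with standard PSO, using Eqs. (1) and (2)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-0.5, 0.5, (n_particles, dim))    # positions
    v = rng.uniform(-0.5, 0.5, (n_particles, dim))    # velocities

    pbest = x.copy()
    pbest_val = np.array([fitness(p) for p in x])
    gbest = pbest[np.argmax(pbest_val)].copy()
    gbest_val = pbest_val.max()

    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # Eq. (1)
        x = x + v                                                   # Eq. (2)

        val = np.array([fitness(p) for p in x])
        improved = val > pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = val[improved]
        if pbest_val.max() > gbest_val:
            gbest_val = pbest_val.max()
            gbest = pbest[np.argmax(pbest_val)].copy()

    return gbest, gbest_val

# Example: maximize a simple concave function with its peak at 0.2 in every dimension
best_x, best_f = pso(lambda p: -np.sum((p - 0.2) ** 2), dim=4)
```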
Improved PSO
† Clerc M. and Kennedy J., "The particle swarm - explosion, stability, and convergence in a multidimensional complex space," IEEE Trans. on Evolutionary Computation, vol. 6, pp. 58-73, 2002.
Typical parameters describing the search course of PSO are:†
1. The evolution speed factor $h$
2. The aggregation degree factor of the swarm $s$
These are used to adapt the inertia weight of each particle dynamically:
$$w_i(t) = g(h_i(t), s) = w_{ini} - \alpha\,\big(1 - h_i(t)\big) + \beta s$$
$$v_{i,j}(t+1) = w_i(t)\,v_{i,j}(t) + c_1 r_{1,j}(t)\big(pbest_{i,j}(t) - x_{i,j}(t)\big) + c_2 r_{2,j}(t)\big(gbest_j(t) - x_{i,j}(t)\big)$$
with $0 \le h \le 1$ and $0 \le s \le 1$, so that $1 - \alpha \le w \le 1 + \beta$ (for $w_{ini} = 1$).
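A minimal sketch of how the dynamic inertia weight could be computed per particle. The slides only state that 0 ≤ h ≤ 1 and 0 ≤ s ≤ 1, so the concrete definitions of the evolution speed factor h and the aggregation degree factor s used here (ratios of fitness values, assuming positive fitness) are assumptions, as are the parameter values:

```python
import numpy as np

def dynamic_inertia(w_ini, alpha, beta, f_pbest_prev, f_pbest_curr, f_swarm):
    """Per-particle inertia weight w_i(t) = w_ini - alpha * (1 - h_i) + beta * s."""
    # Evolution speed factor: how much each particle's personal best improved
    # (assumes positive fitness values so that 0 <= h <= 1)
    h = np.minimum(f_pbest_prev, f_pbest_curr) / np.maximum(f_pbest_prev, f_pbest_curr)
    # Aggregation degree factor: how tightly the swarm's fitness values are clustered
    s = min(f_swarm.mean(), f_swarm.max()) / max(f_swarm.mean(), f_swarm.max())
    return w_ini - alpha * (1.0 - h) + beta * s

# Example with hypothetical fitness values for a 4-particle swarm
f_prev = np.array([1.0, 2.0, 1.5, 0.8])
f_curr = np.array([1.2, 2.0, 1.9, 1.1])
w_t = dynamic_inertia(w_ini=1.0, alpha=0.4, beta=0.2,
                      f_pbest_prev=f_prev, f_pbest_curr=f_curr, f_swarm=f_curr)
```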
Fitness Function
$$\text{fitness}(y) = \frac{1}{I(y)}$$
$$I(y) = -\log\lvert W \rvert - \sum_{i=1}^{n} E\!\left[\sum_{k=1}^{p} (2k-1)\,x_i^{2k-2}\right] + \sum_{i=1}^{n} H(y_i)$$
$$H(y_i) = \frac{\log(2\pi e)}{2} - \frac{1}{2\cdot 3!}\big(k_3^{(i)}\big)^2 - \frac{1}{2\cdot 4!}\big(k_4^{(i)}\big)^2 + \frac{3}{8}\big(k_3^{(i)}\big)^2 k_4^{(i)} + \frac{1}{16}\big(k_4^{(i)}\big)^3$$
where $k_3^{(i)}$ and $k_4^{(i)}$ are the third- and fourth-order cumulants (skewness and kurtosis) of $y_i$.
† P. Comon, "Independent Component Analysis, a new concept?," Signal Processing, vol. 36, pp. 287-314, 1994.
Using this measure, the task is to find the linear combination of the observation vector, i.e., the separation matrix $W$, that maximizes the fitness function (equivalently, minimizes $I(y)$).
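As a rough sketch of how the entropy approximation and the fitness above might be implemented from sample cumulants (an illustrative assumption, not code from the slides; the middle expectation term of I(y) is omitted for brevity):

```python
import numpy as np

def entropy_approx(y):
    """Approximate H(y_i) for each row of y via the cumulant-based formula above."""
    y = (y - y.mean(axis=1, keepdims=True)) / y.std(axis=1, keepdims=True)
    k3 = (y ** 3).mean(axis=1)          # third-order cumulant (skewness)
    k4 = (y ** 4).mean(axis=1) - 3.0    # fourth-order cumulant (excess kurtosis)
    return (np.log(2 * np.pi * np.e) / 2
            - k3 ** 2 / 12 - k4 ** 2 / 48             # 1/(2*3!) = 1/12, 1/(2*4!) = 1/48
            + (3 / 8) * k3 ** 2 * k4 + k4 ** 3 / 16)

def fitness(W, H_white):
    """fitness(y) = 1 / I(y) for y = W @ H_white, with H_white the whitened data."""
    y = W @ H_white
    _, logdet = np.linalg.slogdet(W)
    I_y = -logdet + entropy_approx(y).sum()
    return 1.0 / I_y
```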
Algorithm
STEP 1: Center the observed signal, then whiten it:
$X \leftarrow X - E[X]$  (centering)
$H = D^{-1/2} F^{T} X$  (whitening)
where $D$ and $F$ are the eigenvalue and eigenvector matrices of the covariance matrix of $X$, $\mathrm{Cov}(X) = E[XX^{T}]$.
STEP 2: Initialize each particle (swarm size $K$) with a random position and velocity,
$W_i = [W_{i1}, W_{i2}, \ldots, W_{iJ}]$ and $V_i = [V_{i1}, V_{i2}, \ldots, V_{iJ}]$, where $1 \le i \le K$ and $J = m \times n$.
To keep particles from straying out of range, constrain $W_{ij}$ and $V_{ij}$ to $-0.5 < W_{ij} < 0.5$ and $-0.5 < V_{ij} < 0.5$.
STEP 3: Evaluate the fitness of each particle according to the fitness function, where
$y_i = W_i H = [y_{i1}, y_{i2}, \ldots, y_{iK}]$, $1 \le i \le K$.
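A small sketch of STEP 1 (centering followed by eigenvalue-based whitening), assuming X holds one observed signal per row; the function and variable names are illustrative:

```python
import numpy as np

def center_and_whiten(X):
    """STEP 1: return whitened data H = D^{-1/2} F^T (X - E[X]) and the whitening matrix."""
    X = X - X.mean(axis=1, keepdims=True)     # centering: X <- X - E[X]
    cov = X @ X.T / X.shape[1]                # Cov(X) = E[X X^T]
    d, F = np.linalg.eigh(cov)                # D = diag(d), F = eigenvector matrix
    V = np.diag(1.0 / np.sqrt(d)) @ F.T       # whitening matrix D^{-1/2} F^T
    return V @ X, V
```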
Algorithm
STEP 4: Define the local optimum position $P_i$ and the global optimum position $G$:
$P_i = [P_{i1}, P_{i2}, \ldots, P_{iJ}] = W_i$, $1 \le i \le K$
$G = [G_1, G_2, \ldots, G_J] = \arg\max_{P_i} \{\text{Fitness}(P_i \cdot H)\}$
Update each velocity and position:
$V_{ij}^{new} = w_i(t)\,V_{ij} + c_1 r_{1,j}(t)\,(P_{ij} - W_{ij}) + c_2 r_{2,j}(t)\,(G_j - W_{ij})$
$W_{ij}^{new} = W_{ij} + V_{ij}^{new}$, $1 \le j \le J$, $J = m \times n$
Then set $V_{ij} = V_{ij}^{new}$ and $W_{ij} = W_{ij}^{new}$.
STEP 5: Normalize each particle: $W_i^{new} = W_i^{new} / \lVert W_i^{new} \rVert$, $1 \le i \le K$.
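A sketch of the STEP 4 and STEP 5 updates applied to the whole swarm at once, assuming W, V, and P are K x J arrays, g is the J-dimensional global best, and w_t holds the per-particle dynamic inertia weights (all names and defaults are illustrative assumptions):

```python
import numpy as np

def dpso_update(W, V, P, g, w_t, c1=1.5, c2=1.5, rng=None):
    """One STEP 4 velocity/position update followed by the STEP 5 row-wise normalization."""
    rng = np.random.default_rng() if rng is None else rng
    K, J = W.shape
    r1, r2 = rng.random((K, J)), rng.random((K, J))
    V_new = w_t[:, None] * V + c1 * r1 * (P - W) + c2 * r2 * (g - W)   # STEP 4
    W_new = W + V_new
    W_new /= np.linalg.norm(W_new, axis=1, keepdims=True)              # STEP 5
    return W_new, V_new
```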
Algorithm
STEP 6: If $\text{Fitness}(P_i \cdot H) < \text{Fitness}(W_i^{new} \cdot H)$, update $P_i = W_i^{new}$; then set
$G = \arg\max_{P_i} \{\text{Fitness}(P_i \cdot H)\}$
STEP 7: Compute the estimated source signals from the global optimum position $G$:
$Y = G \cdot H = [Y_1, Y_2, \ldots, Y_K]$
STEP 8: If the termination condition is not satisfied, return to STEP 3; otherwise go to STEP 9.
STEP 9: Output the optimum solution $W^{*} = [W_1^{*}, W_2^{*}, \ldots, W_J^{*}]$.
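Putting the steps together, a hedged end-to-end sketch of the DPSO-ICA loop, reusing the illustrative helpers sketched earlier (center_and_whiten, fitness, dynamic_inertia, dpso_update); the swarm size, iteration count, and inertia parameters are assumptions:

```python
import numpy as np

def dpso_ica(X, n_particles=20, n_iter=100, seed=0):
    """Recover source estimates Y from mixed signals X (m x T) following STEPs 1-9."""
    rng = np.random.default_rng(seed)
    H, _ = center_and_whiten(X)                        # STEP 1
    m = X.shape[0]
    J = m * m
    W = rng.uniform(-0.5, 0.5, (n_particles, J))       # STEP 2: positions
    V = rng.uniform(-0.5, 0.5, (n_particles, J))       #          and velocities

    def fit(w_flat):                                   # STEP 3: fitness of one particle
        return fitness(w_flat.reshape(m, m), H)

    P = W.copy()                                       # STEP 4: local optima P_i
    f_P = np.array([fit(w) for w in W])
    g = P[np.argmax(f_P)].copy()                       # global optimum G
    f_prev = f_P.copy()

    for _ in range(n_iter):                            # STEP 8: loop until done
        w_t = dynamic_inertia(1.0, 0.4, 0.2, f_prev, f_P, f_P)
        W, V = dpso_update(W, V, P, g, w_t, rng=rng)   # STEPs 4 and 5
        f_new = np.array([fit(w) for w in W])
        better = f_new > f_P                           # STEP 6: update pbest and gbest
        P[better] = W[better]
        f_prev, f_P = f_P, np.where(better, f_new, f_P)
        g = P[np.argmax(f_P)].copy()

    G = g.reshape(m, m)                                # STEP 9: optimum solution W*
    return G @ H                                       # STEP 7: Y = G . H
```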
Results
Waveform plots: original voice signals vs. mixed voice signals.
Results
Waveform plots: original voice signals vs. DPSO-ICA output voice signals.
Results
Waveform plots: original voice signals vs. FastICA output voice signals.
Results/Conclusion
The waveform of the DPSO-ICA output is close to the original voice signal and is superior to the FastICA output, although the signal amplitude changes slightly; this scale ambiguity is a common problem in BSS.
The kurtosis of the DPSO-ICA output is much closer to that of the original voice signal, indicating that the signal recovered by DPSO-ICA has better independence than the FastICA output.
DPSO-ICA also needs fewer convergence steps than FastICA, i.e., its convergence speed is superior.

| Voice Signal   | Kurtosis of Original Signal | Kurtosis of Mixed Signal | Kurtosis of FastICA Output | Kurtosis of DPSO-ICA Output |
| Voice Signal 1 | 2.1438                      | 2.1176                   | 2.4826                     | 2.1451                      |
| Voice Signal 2 | 4.6856                      | 3.6053                   | 4.626                      | 4.688                       |
| Voice Signal 3 | 12.467                      | 5.3764                   | 6.8372                     | 12.477                      |

Convergence steps: FastICA 78, DPSO-ICA 56.
THANK YOU