1. Blind Source Separation Based on Improved Particle Swarm Optimization
Ashish Kumar Meshram
INDIAN INSTITUTE OF TECHNOLOGY, INDORE
2. Content
Problem Definition - Blind Source Separation (BSS)
Independent Component Analysis (ICA)
Particle Swarm Optimization (PSO)
Improved PSO with Dynamic Inertia Weight
Fitness Function
Algorithm for BSS based on PSO
Results & Conclusion
INDIAN INSTITUTE OF TECHNOLOGY, INDORE OPTIMIZATION THEORY 1
3. Problem Definition - Blind Source Separation
Source → Mixing → Observation → Estimation
s(t) → A ∈ ℝ^(M×M) → x(t) = A s(t) → y(t) = W x(t)
4. Independent Component Analysis
The common model for Blind Source Separation (BSS) can be described as
x(t) = A s(t),
where A is a nonsingular N × N mixing matrix,
s(t) = [s_1(t), s_2(t), …, s_N(t)]^T is the source vector, and
x(t) = [x_1(t), x_2(t), …, x_N(t)]^T is the observation vector.
The basic problem of Independent Component Analysis (ICA) is to find an N × N separation matrix
W = [w_1, w_2, …, w_N]^T,
without any prior knowledge of s(t) or A, such that
y(t) = W x(t)
is an estimate of s(t).
Solution Procedure:
1. Remove Mean
2. Whitening
3. Find an orthogonal W, optimizing an objective function
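As a concrete sketch, the centering and whitening steps can be written in a few lines of NumPy (an illustration only; the function and variable names are mine, not from the slides):

```python
import numpy as np

def center_and_whiten(X):
    """Center the observed mixtures and whiten them.

    X : (N, T) array with one observed signal per row.
    Returns H with zero mean and identity covariance.
    """
    X = X - X.mean(axis=1, keepdims=True)      # step 1: remove mean
    cov = (X @ X.T) / X.shape[1]               # sample covariance E[X X^T]
    d, F = np.linalg.eigh(cov)                 # eigenvalues D, eigenvectors F
    H = np.diag(d ** -0.5) @ F.T @ X           # step 2: whitening, D^{-1/2} F^T X
    return H
```

After whitening, the sources can be separated by an orthogonal W, which shrinks the search space for the optimizer.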
5. Particle Swarm Optimization
The velocity and position of each particle at the next iteration are calculated according to the following equations:

v_{i,j}(t+1) = w v_{i,j}(t) + c_1 r_{1,j}(t) [pbest_{i,j}(t) − x_{i,j}(t)] + c_2 r_{2,j}(t) [gbest_j(t) − x_{i,j}(t)]    (1)
x_i(t+1) = x_i(t) + v_i(t+1)    (2)

where c_1 and c_2 are the acceleration coefficients, referred to as the cognitive and social parameters, and r_{1,j}, r_{2,j} are uniform random numbers in [0, 1], which are responsible for the stochastic behavior of the algorithm.
During the search process, each particle successively adjusts its position toward the global optimum according to two factors:
1. The best position encountered by the particle itself (pbest): p_i = [p_{i,1}, p_{i,2}, …, p_{i,D}]
2. The best position encountered by the whole swarm (gbest): p_g = [p_{g,1}, p_{g,2}, …, p_{g,D}]
6. Algorithm PSO
For each particle
    Initialize particle
End
Do
    For each particle
        Calculate fitness value
        If the fitness value is better than the best fitness value (pbest) in history
            Set the current value as the new pbest
    End
    Choose the particle with the best fitness value of all particles as the gbest
    For each particle
        Calculate particle velocity according to eq. (1)
        Update particle position according to eq. (2)
    End
While maximum iterations or minimum error criterion is not attained
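The pseudocode above maps directly onto a compact loop. The following is a minimal NumPy sketch (parameter values are illustrative defaults, not the ones used in the presentation); it minimizes a generic objective, whereas the BSS application maximizes the fitness:

```python
import numpy as np

def pso(fitness, dim, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Basic PSO loop (minimization); eq. (1)/(2) are the velocity
    and position updates from the previous slide."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))   # initialize positions
    v = np.zeros((n_particles, dim))                 # initialize velocities
    pbest = x.copy()                                 # personal best positions
    pbest_val = np.apply_along_axis(fitness, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()         # swarm best position
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # eq. (1)
        x = x + v                                                  # eq. (2)
        val = np.apply_along_axis(fitness, 1, x)
        better = val < pbest_val                     # update pbest where improved
        pbest[better], pbest_val[better] = x[better], val[better]
        gbest = pbest[pbest_val.argmin()].copy()     # update gbest
    return gbest, pbest_val.min()

# Example: minimize the sphere function f(x) = sum(x^2)
best, best_val = pso(lambda p: float(np.sum(p * p)), dim=2)
```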
7. Improved PSO
Typical parameters in the search course of PSO are:†
1. Evolution speed factor h
2. Aggregation degree factor of the swarm s

The inertia weight is adjusted dynamically with these two factors:

w_i^t = g(h_i^t, s) = w_ini − α (1 − h_i^t) + β s

and the velocity update of eq. (1) becomes

v_{i,j}(t+1) = w_i^t v_{i,j}(t) + c_1 r_{1,j}(t) [pbest_{i,j}(t) − x_{i,j}(t)] + c_2 r_{2,j}(t) [gbest_j(t) − x_{i,j}(t)]

with 0 ≤ h ≤ 1 and 0 ≤ s ≤ 1, so that (for w_ini = 1) 1 − α ≤ w ≤ 1 + β.

† Clerc M., Kennedy J., "The particle swarm - explosion, stability, and convergence in a multidimensional complex space," IEEE Trans. on Evolutionary Computation, Vol. 6, pp. 58-73, 2002.
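A sketch of the dynamic inertia weight; the values of w_ini, alpha and beta below are illustrative placeholders, not taken from the slides:

```python
def dynamic_inertia(h, s, w_ini=1.0, alpha=0.5, beta=0.4):
    """w_i^t = w_ini - alpha * (1 - h_i^t) + beta * s.

    h : evolution speed factor, 0 <= h <= 1
    s : aggregation degree factor, 0 <= s <= 1
    With w_ini = 1 the weight stays within [1 - alpha, 1 + beta].
    """
    return w_ini - alpha * (1.0 - h) + beta * s
```

Intuitively, slow evolution (small h) lowers the weight to favor exploitation, while a dispersed swarm (large s) raises it to favor exploration.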
8. Fitness Function
fitness(y) = 1 / I(y)

where the mutual-information contrast I(y) is

I(y) = −log|W| − Σ_{i=1}^{n} E[ Σ_{k=1}^{p} (2k − 1) x_i^{2k−2} ] + Σ_{i=1}^{n} H(y_i)

and the marginal entropies are approximated as†

H(y_j) = (1/2) log(2πe) − (1/(2·3!)) (k_3^j)^2 − (1/(2·4!)) (k_4^j)^2 + (3/8) (k_3^j)^2 k_4^j + (1/16) (k_4^j)^3

with k_3^j and k_4^j the third- and fourth-order cumulants of y_j.
† P. Comon, "Independent Component Analysis, a new concept?" Signal Processing, Vol. 36, pp. 287-314, 1994.
Thus we need to find the linear combination of the observation vector that maximizes the fitness function, i.e., minimizes the mutual information I(y).
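The marginal entropy term H(y_i) can be estimated from sample cumulants. A sketch (the helper is my own, applying the cumulant-based expansion above with k3 the third cumulant and k4 the fourth cumulant, i.e. excess kurtosis):

```python
import numpy as np

def entropy_approx(y):
    """Approximate the entropy H(y) of a standardized 1-D signal."""
    y = (y - y.mean()) / y.std()            # standardize to unit variance
    k3 = np.mean(y ** 3)                    # third cumulant (skewness)
    k4 = np.mean(y ** 4) - 3.0              # fourth cumulant (excess kurtosis)
    return (0.5 * np.log(2.0 * np.pi * np.e)
            - k3 ** 2 / 12.0                # 1/(2*3!) = 1/12
            - k4 ** 2 / 48.0                # 1/(2*4!) = 1/48
            + (3.0 / 8.0) * k3 ** 2 * k4
            + k4 ** 3 / 16.0)
```

For a Gaussian signal the cumulant corrections vanish and the estimate approaches (1/2) log(2πe), the entropy of a unit-variance Gaussian.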
9. Algorithm
STEP 1: Center the observed signal and thereafter whiten it:
X = X − E[X]    (centering)
H = D^{−1/2} F^T X    (whitening)
where D and F hold the eigenvalues and eigenvectors of the covariance matrix of X:
Cov(X) = E[X X^T]
STEP 2: Initialize each particle (swarm size K) with a random position and velocity:
W_i = [W_{i1}, W_{i2}, …, W_{iJ}] and V_i = [V_{i1}, V_{i2}, …, V_{iJ}], where 1 ≤ i ≤ K and J = m × n.
To keep the search from overshooting, constrain the ranges of W_{ij} and V_{ij} as
−0.5 < W_{ij} < 0.5,  −0.5 < V_{ij} < 0.5.
Evaluate the fitness of each particle according to the fitness function, where
y_i = W_i H = [y_{i1}, y_{i2}, …, y_{iK}], 1 ≤ i ≤ K.
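STEP 2 can be sketched as follows (the function name is mine; the range constraint matches the slide):

```python
import numpy as np

def init_particles(K, m, n, bound=0.5, seed=0):
    """Initialize K particles, each a flattened m x n separation matrix,
    with positions W and velocities V constrained to (-0.5, 0.5)."""
    rng = np.random.default_rng(seed)
    J = m * n                                # dimension of each particle
    W = rng.uniform(-bound, bound, (K, J))   # random positions
    V = rng.uniform(-bound, bound, (K, J))   # random velocities
    return W, V
```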
STEP 3:
11. Algorithm
STEP 6: If Fitness(P_i ∙ H) < Fitness(W_i^new ∙ H), update P_i = W_i^new; then take the global best
G = arg max { Fitness(P_i ∙ H) }.
STEP 7: Compute the source signals according to the global optimum position G:
Y = G ∙ H = [Y_1, Y_2, …, Y_K].
STEP 8: Obtain the optimum solution W* = [W_1*, W_2*, …, W_J*].
If the termination condition is not satisfied, loop to STEP 3; otherwise go to STEP 9.
STEP 9:
12. Results
[Figure: original voice signals vs. mixed voice signals]
13. Results
[Figure: original voice signals vs. DPSO-ICA output voice signals]
14. Results
[Figure: original voice signals vs. FastICA output voice signals]
15. Results/Conclusion
The waveform of the DPSO-ICA output signal is similar to the original voice signal and is also superior to the FastICA output, though there is a small change in the amplitude of the signal; this is a common problem in BSS.
The kurtosis of the DPSO-ICA output is much closer to that of the original voice signal, which indicates that the voice signal recovered by DPSO-ICA has better independence than that of FastICA.
DPSO-ICA also needs fewer convergence steps than FastICA; that is, the convergence speed of DPSO-ICA is superior to that of FastICA.

Voice Signal     Kurtosis of        Kurtosis of     Kurtosis of ICA Output
                 Original Signal    Mixed Signal    FastICA     DPSO-ICA
Voice Signal 1   2.1438             2.1176          2.4826      2.1451
Voice Signal 2   4.6856             3.6053          4.626       4.688
Voice Signal 3   12.467             5.3764          6.8372      12.477

Convergence steps: FastICA 78, DPSO-ICA 56.