Sadiqua Khan et al., Int. Journal of Engineering Research and Applications
ISSN : 2248-9622, Vol. 3, Issue 6, Nov-Dec 2013, pp.1367-1371

RESEARCH ARTICLE

www.ijera.com

OPEN ACCESS

Fast Independent Component Analysis Algorithm for Blind
Separation of Non-Stationary Audio Signal
Sadiqua Khan*, Y. Rama Krishna **, Dr. Panakala Rajesh Kumar***
*(Department of Electronics and Communications, PVPSIT, Vijayawada)
**(Department of Electronics and Communications, PVPSIT, Vijayawada)
***(Department of Electronics and Communications, PVPSIT, Vijayawada)

ABSTRACT
FastICA is a statistical method for transforming an observed multidimensional random vector into components
that are statistically as independent from each other as possible. Acoustic signals recorded simultaneously in
a reverberant environment can be described as a sum of differently convolved sources. The task of source
separation is to identify the multiple channels and possibly to invert them in order to obtain estimates of the
underlying sources. We tackle the problem by explicitly exploiting the non-stationary components of the
acoustic sources. Using maximum entropy approximations of differential entropy, we introduce a family of
new contrast (objective) functions for ICA. Here we propose an algorithm for blind source separation in
which frequency-domain ICA and time-domain ICA are used for successful separation of signals.
Keywords - Blind source separation, Entropy, Independent Component Analysis, Non-gaussianity.

I. INTRODUCTION
Blind source separation (BSS) has been
proposed for various fields in recent years [1]. It is
used to extract individual signals from observed mixed
signals. It can be potentially used in communication
systems, biomedical signal processing, image
restoration and the classical cocktail party problem. In
the communication field, it is a promising tool for the
design of multi-input multi-output (MIMO) equalizers
for suppression of intersymbol interference, co-channel and adjacent-channel interference, and multi-access interference. In biomedical signal processing, BSS can be used to process electrocardiography (ECG), electroencephalography (EEG), electromyography (EMG) and magnetoencephalography (MEG) signals. In the image signal
processing field, it can be used for image restoration
and understanding. The cocktail party problem is our
focus, where the target is to mimic in a machine the
ability of a human to separate one speaker from a
mixture of sounds. We focus on audio signal
processing in a room environment, which can for
example be used for teleconferencing.
During the past decades, there has been
considerable research performed in the field of
convolutive blind source separation (CBSS). Initially,
research was aimed at solutions based in the time
domain. In real room recordings, however, where the impulse response is on the order of thousands of samples in length, time-domain algorithms are computationally very expensive for separating the sources. To overcome this problem, a solution in the
frequency domain was proposed. As convolution in the
time domain corresponds to multiplication in the
frequency domain, the transformation into the

frequency domain converts the convolutive mixing
problem to that of independent complex instantaneous
mixing operations at each frequency bin provided the
block length is not too large.
In realization,
moreover, care is necessary to overcome circular
convolution effects.
In this paper, we present the implementation
of blind source separation using FastICA (independent
component analysis). The aim of this paper is to recover two independent source signals from unknown linear combinations of them [2]. Through BSS, we have successfully separated the two signals both with and without background noise.

II. Entropy

A central problem in BSS, such as the cocktail party problem, as well as in statistics and signal processing generally, is finding a suitable representation or transformation of the data. For computational and conceptual simplicity, the
representation is often sought as a linear
transformation of the original data. Let us denote by x
= (x1,x2, ..., xm)T a zero-mean m-dimensional random
variable that can be observed, and by s = (s1, s2, ...,
sn)T its n-dimensional transform. Then the problem is
to determine a constant (weight) matrix W so that the
linear transformation of the observed variables has
some suitable properties.
s = Wx (1)
Several principles and methods have been
developed to find such a linear representation,
including principal component analysis, factor
analysis, projection pursuit, independent component
analysis etc. The transformation may be defined using
such criteria as optimal dimension reduction, statistical 'interestingness' of the resulting components s, simplicity of the transformation, or other criteria, including application-oriented ones.
We treat in this paper the problem of
estimating the transformation given by (linear)
independent component analysis (ICA). Thus this
method is a special case of redundancy reduction. One
popular way of formulating the ICA problem is to
consider the estimation of the following generative
model for the data.
x = As
(2)
where x is an observed m-dimensional vector, s is an
n-dimensional (latent) random vector whose
components are assumed mutually independent, and A
is a constant m × n matrix to be estimated. It is
usually further assumed that the dimensions of x and s
are equal, i.e., m = n; we make this assumption in the
rest of the paper.
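For concreteness, the generative model x = As and the unmixing step s = Wx of (1) can be sketched in numpy; the sources, the 2 × 2 mixing matrix, and the sample count below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent, zero-mean sources (m = n = 2), T samples each.
T = 1000
s = np.vstack([
    rng.laplace(size=T),           # super-Gaussian source
    rng.uniform(-1, 1, size=T),    # sub-Gaussian source
])
s -= s.mean(axis=1, keepdims=True)

# Unknown constant mixing matrix A and observed mixtures x = A s.
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
x = A @ s

# With A known (which BSS does not assume), the unmixing matrix W of (1)
# is the (pseudo)inverse of A, and recovery is exact.
W = np.linalg.pinv(A)
s_hat = W @ x
print(np.allclose(s_hat, s))  # True
```

In ICA proper, A is unknown and W must be estimated from the mixtures x alone, which is exactly what the contrast functions and the fixed-point algorithm of the following sections accomplish.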
A noise vector may also be present. The
matrix W defining the transformation as in (1) is then
obtained as the (pseudo)inverse of the estimate of the
matrix A. Non-Gaussianity of the independent components is necessary for the identifiability of the model. A more general formulation of ICA does not
need to assume an underlying data model. This
definition is based on the concept of mutual
information. First, we define the differential entropy
H of a random vector y = (y1, ..., yn)T with density f(·):
H(y) = −∫ f(y) log f(y) dy (3)
Differential entropy can be normalized to give rise to the definition of negentropy, which
has the appealing property of being invariant for
linear transformations. The definition of negentropy
J is given by:
J(y) = H(ygauss) − H(y) (4)
where ygauss is a Gaussian random vector of the
same covariance matrix as y. Negentropy can also be
interpreted as a measure of nongaussianity. Using the
concept of differential entropy, one can define the
mutual information I between the n (scalar) random
variables yi, i = 1...n [8, 7]. Mutual information is a
natural measure of the dependence between random
variables. It is particularly interesting to express
mutual information using negentropy, constraining the
variables to be uncorrelated. In this case, we have [7]
I(y1, y2, ..., yn) = J(y) − ∑ J(yi).
(5)
Since mutual information is the information-theoretic
measure of the independence of random variables, it
is natural to use it as the criterion for finding the ICA
transform. The ICA of a random vector x is defined as an invertible transformation s = Wx, where the matrix W is determined so that the mutual information of the transformed components si is minimized.
Two promising applications of ICA are blind
source separation and feature extraction. In blind
source separation, the observed values of x correspond
to a realization of an m-dimensional discrete-time
signal x(t), t = 1, 2, .... Then the components s(t) are
called source signals, which are usually original,
uncorrupted signals or noise sources. Often such
sources are statistically independent from each other,
and thus the signals can be recovered from linear
mixtures x by finding a transformation in which the
transformed signals are as independent as possible as
in ICA.
III. Functions for ICA
3.1 ICA data model, minimization of mutual information
One popular way of formulating the ICA
problem is to consider the estimation of the
following generative model for the data [1, 3, 5, 6]
From (2), x is an observed m-dimensional vector, s is
an n-dimensional (latent) random vector whose
components are assumed mutually independent, and
A is a constant m × n matrix to be estimated. It is
usually further assumed that the dimensions of x and
s are equal, i.e., m = n; we make this assumption in
the rest of the paper. A noise vector may also be
present. The matrix W defining the transformation as
in (1) is then obtained as the (pseudo)inverse of the
estimate of the matrix A. Non-Gaussianity of the
independent components is necessary for the
identifiability of the model (2), see [7].
Comon [7] showed how to obtain a more
general formulation for ICA that does not need to
assume an underlying data model. This definition is
based on the concept of mutual information. First,
we define the differential entropy H of a random
vector y = (y1, ..., yn)T with density f(·) as follows:
H(y) = −∫ f(y) log f(y) dy
(6)
Differential entropy can be normalized to give rise to the definition of negentropy, which has the appealing property of being invariant for linear transformations. The definition of negentropy J is given by
J(y) = H(ygauss) − H(y)
(7)
where ygauss is a Gaussian random vector of the
same covariance matrix as y. Negentropy can also
be interpreted as a measure of nongaussianity [7].
Using the concept of differential entropy, one can
define the mutual information I between the n
(scalar) random variables yi, i = 1...n [8, 7].
Mutual information is a natural measure of the
dependence between random variables. It is particularly interesting to express mutual information using
negentropy, constraining the variables to be
uncorrelated. In this case, we have [7]
I(y1, y2, ..., yn) = J(y) − ∑ J(yi).
(8)
Since mutual information is the information-theoretic
measure of the independence of random variables, it
is natural to use it as the criterion for finding the
ICA transform. Thus we define in this paper,
following [7], the ICA of a random vector x as an
invertible transformation as in (1) where the matrix
W is determined so that the mutual information of
the transformed components si is minimized. This constraint is not strictly necessary, but simplifies the computations considerably. Because negentropy is invariant under invertible linear transformations, it is now obvious from (8) that finding an invertible transformation W that minimizes the mutual information is roughly equivalent to finding directions in which the negentropy is maximized.
3.2 Approximations of Negentropy
To use the definition of ICA given above, a
simple estimate of the negentropy (or of differential
entropy) is needed. We use here the new
approximations developed based on the maximum
entropy principle. In the simplest case, these new
approximations are of the form:
J(yi) ≈ c [E{G(yi)} − E{G(v)}]² (9)
where G is practically any non-quadratic function, c is an irrelevant constant, and v is a Gaussian variable of zero mean and unit variance (i.e., standardized). The random variable yi is assumed to be of zero mean and unit variance. For symmetric variables, this is a generalization of the cumulant-based approximation in [7], which is obtained by taking G(yi) = yi^4.
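To see how approximation (9) behaves, the following sketch evaluates it with the assumed choices G(u) = log cosh u and c = 1 (the sample sizes are arbitrary): the approximated negentropy is near zero for a Gaussian variable and clearly positive for a super-Gaussian (Laplacian) one.

```python
import numpy as np

rng = np.random.default_rng(1)

def negentropy_approx(y, n_gauss=100_000):
    """J(y) ~ c [E{G(y)} - E{G(v)}]^2 as in (9), with G(u) = log cosh u,
    c = 1, and v approximated by a standard Gaussian sample."""
    y = (y - y.mean()) / y.std()          # enforce zero mean, unit variance
    G = lambda u: np.log(np.cosh(u))
    v = rng.standard_normal(n_gauss)
    return (G(y).mean() - G(v).mean()) ** 2

gauss = rng.standard_normal(50_000)       # Gaussian: negentropy ~ 0
laplace = rng.laplace(size=50_000)        # super-Gaussian: negentropy > 0

print(negentropy_approx(gauss))           # close to zero
print(negentropy_approx(laplace))         # clearly larger
```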
The approximation of negentropy given
above in (9) gives readily a new objective function
for estimating the ICA transform in our framework.
First, to find one independent component, or
projection pursuit direction as yi=wTx, we maximize
the function JG given by
JG(w) = [E{G(wTx)} − E{G(v)}]²
(10)
where w is an m-dimensional (weight) vector
constrained so that E{(wTx)2} = 1 (we can fix the
scale arbitrarily). Several independent components
can then be estimated one-by-one; see Section IV. Second, using the approach of minimizing mutual information, the above one-unit contrast function can be simply extended to compute the whole matrix W in (1). To do this, recall from (8) that mutual information is minimized (under the constraint of decorrelation) when the sum of the negentropies of the components is maximized.
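The one-unit contrast (10) can be visualized by brute force on sphered two-dimensional data: after sphering, the constraint E{(wTz)²} = 1 reduces to ||w|| = 1, so w can be parameterized by an angle and JG maximized over a grid. The grid search below is only an illustrative stand-in for the fixed-point algorithm of Section IV, and the sources and mixing matrix are assumed for the demo:

```python
import numpy as np

rng = np.random.default_rng(2)

# Mixtures of a uniform and a Laplacian source (both zero mean, unit variance).
T = 20_000
s = np.vstack([rng.uniform(-np.sqrt(3), np.sqrt(3), T),
               rng.laplace(scale=1 / np.sqrt(2), size=T)])
x = np.array([[0.8, 0.3], [0.2, 0.9]]) @ s
x -= x.mean(axis=1, keepdims=True)

# Sphere (whiten) the mixtures so that cov(z) = I.
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x

G = lambda u: np.log(np.cosh(u))
EGv = G(rng.standard_normal(200_000)).mean()   # E{G(v)}, v standard Gaussian

def JG(w):                                     # contrast function (10), c = 1
    return (G(w @ z).mean() - EGv) ** 2

thetas = np.linspace(0, np.pi, 721)            # w = (cos t, sin t), ||w|| = 1
best = max(thetas, key=lambda t: JG(np.array([np.cos(t), np.sin(t)])))
y = np.array([np.cos(best), np.sin(best)]) @ z

# The maximizing direction should recover one source up to sign and scale.
corr = max(abs(np.corrcoef(y, s[i])[0, 1]) for i in (0, 1))
print(round(corr, 2))
```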

IV. FIXED POINT ALGORITHM
To begin with, we shall derive the fixed-point algorithm for sphered data. First note that the maxima of JG(w) are obtained at certain optima of E{G(wTx)}. According to the Kuhn-Tucker conditions [18], the optima of E{G(wTx)} under the constraint E{(wTx)²} = ||w||² = 1 are obtained at points where
E{xg(wT x)} − βw = 0
(11)
where β is a constant that can be easily evaluated to give β = E{w0T x g(w0T x)}, where w0 is the value of w at the optimum. Let us try to solve this equation by Newton's method. Denoting the function on the left-hand side of (11) by F, we obtain its Jacobian matrix JF(w) as
JF (w) = E{xxT g'(wT x)} − βI
(12)
To simplify the inversion of this matrix, we decide to approximate the first term in (12). Since the data is sphered, a reasonable approximation is
E{xxT g'(wT x)} ≈ E{xxT} E{g'(wT x)} = E{g'(wT x)} I.
Thus the Jacobian matrix becomes diagonal and can easily be inverted. We also approximate β using the current value of w instead of w0. Thus we obtain the following approximate Newton iteration:
w+ = w − [E{x g(wT x)} − βw] / [E{g'(wT x)} − β]
w* = w+ / ||w+|| (13)
where w* denotes the new value of w, β = E{wT x g(wT x)}, and the normalization has been added to improve the stability. This algorithm can be further simplified by multiplying both sides of the first equation in (13) by β − E{g'(wT x)}. This gives the following fixed-point algorithm:
w+ = E{x g(wT x)} − E{g'(wT x)} w
w* = w+ / ||w+|| (14)
which was introduced in [17] using a more heuristic
derivation. It is well-known that the convergence of
the Newton method may be rather uncertain. To
ameliorate this, one may add a step size to (13), obtaining the stabilized fixed-point algorithm
w+ = w − µ [E{x g(wT x)} − βw] / [E{g'(wT x)} − β]
w* = w+ / ||w+|| (15)
where β = E{wT xg(wT x)} as above, and µ is a step
size parameter that may change with the iteration
count. Taking a µ that is much smaller than unity
(say, 0.1 or 0.01), the algorithm (15) converges with
much more certainty. In particular, it is often a good strategy to start with µ = 1, in which case the algorithm is equivalent to the original fixed-point algorithm in (14). If convergence seems problematic, µ may then be decreased gradually until convergence is satisfactory.
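The fixed-point iteration (14) takes only a few lines in numpy. The sketch below assumes sphered data and the common choice g(u) = tanh u (so g'(u) = 1 − tanh² u), corresponding to G(u) = log cosh u; the sources and mixing matrix are again illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Sphered mixtures of two independent sources (illustrative data).
T = 20_000
s = np.vstack([np.sign(rng.standard_normal(T)),             # binary, sub-Gaussian
               rng.laplace(scale=1 / np.sqrt(2), size=T)])  # Laplacian, super-Gaussian
x = np.array([[0.7, 0.5], [0.3, 0.8]]) @ s
x -= x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x                        # cov(z) = I

g = np.tanh                                                 # g = G' for G = log cosh
dg = lambda u: 1.0 - np.tanh(u) ** 2                        # g'

w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(100):
    # w+ = E{z g(w^T z)} - E{g'(w^T z)} w, then normalize: iteration (14).
    w_new = (z * g(w @ z)).mean(axis=1) - dg(w @ z).mean() * w
    w_new /= np.linalg.norm(w_new)
    converged = 1.0 - abs(w_new @ w) < 1e-10                # w is defined up to sign
    w = w_new
    if converged:
        break

y = w @ z                                                   # one estimated component
corr = max(abs(np.corrcoef(y, s[i])[0, 1]) for i in (0, 1))
print(round(corr, 2))
```

A second component could then be obtained by repeating the iteration with w kept orthogonal to the first solution, since the components are uncorrelated in the sphered space.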
The fixed-point algorithms may also be used for the original, that is, non-sphered data. Transforming the data back to the non-sphered variables, one sees easily that the following modification of the algorithm (14) works for non-sphered data:
w+ = C−1 E{x g(wT x)} − E{g'(wT x)} w
w* = w+ / √((w+)T C w+) (16)
where C = E{xxT} is the covariance matrix of the data. The stabilized version, algorithm (15), can also be modified as follows to work with non-sphered data:
w+ = w − µ [C−1 E{x g(wTx)} − βw] / [E{g'(wTx)} − β]
w* = w+ / √((w+)T C w+) (17)
Using this algorithm, one obtains directly an
independent component as the linear combination
wTx, where x need not be sphered (pre-whitened).
These modifications presuppose, of course, that the
covariance matrix is not singular. If it is singular or
near-singular, the dimension of the data must be
reduced, for example with PCA [7].
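The sphering and PCA dimension-reduction step mentioned above can be sketched as follows; the rank-deficient three-channel setup and the eigenvalue threshold are assumptions for the demo:

```python
import numpy as np

rng = np.random.default_rng(4)

# Three observed channels mixing only two sources: C = E{x x^T} is singular.
T = 5_000
s = rng.standard_normal((2, T))
A = np.array([[1.0, 0.2],
              [0.3, 1.0],
              [1.3, 1.2]])          # third row = row 1 + row 2, so rank(A) = 2
x = A @ s
x -= x.mean(axis=1, keepdims=True)

C = np.cov(x)
d, E = np.linalg.eigh(C)            # eigenvalues in ascending order
keep = d > 1e-10 * d.max()          # drop (near-)zero eigenvalues
print(keep.sum())                   # 2 significant directions remain

# Project onto the retained eigenvectors and whiten in one step.
V = np.diag(d[keep] ** -0.5) @ E[:, keep].T
z = V @ x
print(np.allclose(np.cov(z), np.eye(keep.sum()), atol=1e-8))  # True
```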

V. Experimental results

Fig.1: Mixed audio signals of a bird and horn. [Waveform plot: amplitude vs. time, 0 to 10000 samples.]

Fig.2: Calculation of entropy. [Plot: entropy function value vs. iteration, 0 to 100 iterations.]

Fig.5: Separated source audio of bird. [Waveform plot: amplitude vs. time.]

VI. Conclusion

The problem of linear independent
component analysis (ICA), which is a form of
redundancy reduction, was addressed. The main
advantage of the fixed-point algorithms is that their
convergence can be shown to be very fast (cubic or
at least quadratic). Combining the good statistical
properties (e.g. robustness) of the new contrast
functions, and the good algorithmic properties of the
fixed-point algorithm, a very appealing method for
ICA was obtained. Simulations as well as
applications on real-life data have validated the novel
contrast functions and algorithms introduced. Extensions of the methods introduced in this paper address the problem of noisy data and the situation where there are more independent components than observed variables.

Fig.3: Calculation of gradient entropy. [Plot: magnitude of entropy gradient h(Y) vs. iteration, 0 to 100 iterations.]

Fig.4: Separated source audio of horn. [Waveform plot: amplitude vs. time.]

References
[1] S.-I. Amari, A. Cichocki, and H.H. Yang. A new learning algorithm for blind source separation. In Advances in Neural Information Processing Systems 8, pages 757-763. MIT Press, 1996.
[2] H.B. Barlow. Possible principles underlying the transformations of sensory messages. In W.A. Rosenblith, editor, Sensory Communication, pages 217-234. MIT Press, 1961.
[3] A.J. Bell and T.J. Sejnowski. An information-maximization approach to blind separation and blind deconvolution. Neural Computation, 7:1129-1159, 1995.
[4] A.J. Bell and T.J. Sejnowski. The 'independent components' of natural scenes are edge filters. Vision Research, 37:3327-3338, 1997.
[5] J.-F. Cardoso and B. Hvam Laheld. Equivariant adaptive source separation. IEEE Trans. on Signal Processing, 44(12):3017-3030, 1996.
[6] A. Cichocki and R. Unbehauen. Neural Networks for Signal Processing and Optimization. Wiley, 1994.
[7] P. Comon. Independent component analysis, a new concept? Signal Processing, 36:287-314, 1994.
[8] T.M. Cover and J.A. Thomas. Elements of Information Theory. Wiley, 1991.
[9] N. Delfosse and P. Loubaton. Adaptive blind separation of independent sources: a deflation approach. Signal Processing, 45:59-83, 1995.
[10] The FastICA MATLAB package. Available at http://www.cis.hut.fi/projects/ica/fastica/.
[11] J.H. Friedman and J.W. Tukey. A projection pursuit algorithm for exploratory data analysis. IEEE Trans. on Computers, C-23(9):881-890, 1974.
[12] J.H. Friedman. Exploratory projection pursuit. J. of the American Statistical Association, 82(397):249-266, 1987.
[13] X. Giannakopoulos, J. Karhunen, and E. Oja. Experimental comparison of neural ICA algorithms. In Proc. Int. Conf. on Artificial Neural Networks (ICANN'98), pages 651-656, Skövde, Sweden, 1998.
[14] F.R. Hampel, E.M. Ronchetti, P.J. Rousseuw, and W.A. Stahel. Robust Statistics. Wiley, 1986.
[15] H.H. Harman. Modern Factor Analysis. University of Chicago Press, 2nd edition, 1967.
[16] P.J. Huber. Projection pursuit. The Annals of Statistics, 13(2):435-475, 1985.
[17] A. Hyvärinen. A family of fixed-point algorithms for independent component analysis. In Proc. IEEE Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP'97), pages 3917-3920, Munich, Germany, 1997.
[18] D. Luenberger. Optimization by Vector Space Methods. Wiley, 1969.

 
A linear-Discriminant-Analysis-Based Approach to Enhance the Performance of F...
A linear-Discriminant-Analysis-Based Approach to Enhance the Performance of F...A linear-Discriminant-Analysis-Based Approach to Enhance the Performance of F...
A linear-Discriminant-Analysis-Based Approach to Enhance the Performance of F...
 
icarsn
icarsnicarsn
icarsn
 
Mimo radar detection in compound gaussian clutter using orthogonal discrete f...
Mimo radar detection in compound gaussian clutter using orthogonal discrete f...Mimo radar detection in compound gaussian clutter using orthogonal discrete f...
Mimo radar detection in compound gaussian clutter using orthogonal discrete f...
 
40120130406002
4012013040600240120130406002
40120130406002
 
GraphSignalProcessingFinalPaper
GraphSignalProcessingFinalPaperGraphSignalProcessingFinalPaper
GraphSignalProcessingFinalPaper
 
overviewPCA
overviewPCAoverviewPCA
overviewPCA
 
Nc2421532161
Nc2421532161Nc2421532161
Nc2421532161
 
50120140501009
5012014050100950120140501009
50120140501009
 
DIGITAL TOPOLOGY OPERATING IN MEDICAL IMAGING WITH MRI TECHNOLOGY.pptx
DIGITAL TOPOLOGY OPERATING IN MEDICAL IMAGING WITH MRI TECHNOLOGY.pptxDIGITAL TOPOLOGY OPERATING IN MEDICAL IMAGING WITH MRI TECHNOLOGY.pptx
DIGITAL TOPOLOGY OPERATING IN MEDICAL IMAGING WITH MRI TECHNOLOGY.pptx
 
Srilakshmi alla blindsourceseperation
Srilakshmi alla blindsourceseperationSrilakshmi alla blindsourceseperation
Srilakshmi alla blindsourceseperation
 
Paper id 24201464
Paper id 24201464Paper id 24201464
Paper id 24201464
 

Recently uploaded

Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationSafe Software
 
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Alan Dix
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxnull - The Open Security Community
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 3652toLead Limited
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machinePadma Pradeep
 
Benefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksBenefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksSoftradix Technologies
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Allon Mureinik
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions
 
Pigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsMemoori
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...HostedbyConfluent
 
Key Features Of Token Development (1).pptx
Key  Features Of Token  Development (1).pptxKey  Features Of Token  Development (1).pptx
Key Features Of Token Development (1).pptxLBM Solutions
 
Artificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraArtificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraDeakin University
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...shyamraj55
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsEnterprise Knowledge
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhisoniya singh
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitecturePixlogix Infotech
 

Recently uploaded (20)

Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
 
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machine
 
Benefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksBenefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other Frameworks
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food Manufacturing
 
Pigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping Elbows
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial Buildings
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
 
Key Features Of Token Development (1).pptx
Key  Features Of Token  Development (1).pptxKey  Features Of Token  Development (1).pptx
Key Features Of Token Development (1).pptx
 
Artificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraArtificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning era
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC Architecture
 

Ht3613671371

BSS can be used to process electrocardiography (ECG), electroencephalography (EEG), electromyography (EMG), and magnetoencephalography (MEG) signals. In image processing, it can be used for image restoration and understanding. Our focus is the cocktail party problem, where the target is to mimic in a machine the ability of a human to separate one speaker from a mixture of sounds. We concentrate on audio signal processing in a room environment, which can, for example, be used for teleconferencing.

During the past decades, considerable research has been performed in the field of convolutive blind source separation (CBSS). Initially, research was aimed at solutions in the time domain. In a real room recording, however, where the impulse response is on the order of thousands of samples in length, a time-domain algorithm is computationally very expensive. To overcome this problem, solutions in the frequency domain were proposed. Since convolution in the time domain corresponds to multiplication in the frequency domain, the transformation converts the convolutive mixing problem into independent complex instantaneous mixing operations at each frequency bin, provided the block length is not too large. In practice, moreover, care is necessary to avoid circular convolution effects.

In this paper, we present an implementation of blind source separation using FastICA (fast independent component analysis). The aim is to recover two independent source signals from unknown linear combinations [2]. Through BSS, we have successfully separated the two signals with and without background noise.

II. ENTROPY
A central problem in BSS, as in statistics and signal processing generally, is finding a suitable representation or transformation of the data. For computational and conceptual simplicity, the representation is often sought as a linear transformation of the original data. Let us denote by x = (x1, x2, ..., xm)T a zero-mean m-dimensional random vector that can be observed, and by s = (s1, s2, ..., sn)T its n-dimensional transform. The problem is then to determine a constant (weight) matrix W so that the linear transformation of the observed variables has some suitable properties:

s = Wx (1)

Several principles and methods have been developed to find such a linear representation, including principal component analysis, factor analysis, projection pursuit, and independent component analysis. The transformation may be defined using criteria such as optimal dimension reduction, statistical 'interestingness' of the resulting components s,
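The linear model (1) can be illustrated with a toy mixture: when the separating matrix W is the exact inverse of the matrix that mixed the sources, the transform recovers them perfectly. The sketch below uses NumPy with an arbitrarily chosen 2 × 2 mixing matrix and synthetic sources, not the paper's audio data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two zero-mean source signals: a sinusoid and a uniform noise signal.
t = np.linspace(0, 1, 1000)
s = np.vstack([np.sin(2 * np.pi * 5 * t),
               rng.uniform(-1, 1, t.size)])
s -= s.mean(axis=1, keepdims=True)

A = np.array([[1.0, 0.6],
              [0.4, 1.0]])   # hypothetical mixing matrix
x = A @ s                    # observed mixtures

W = np.linalg.inv(A)         # ideal separating matrix
s_hat = W @ x                # eq. (1): s = Wx

print(np.allclose(s_hat, s))  # exact recovery when W is A's inverse
```

In blind separation, of course, neither A nor s is known; the rest of the paper is about estimating W from x alone.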
simplicity of the transformation, or other criteria, including application-oriented ones. In this paper we treat the problem of estimating the transformation given by (linear) independent component analysis (ICA); this method is thus a special case of redundancy reduction.

One popular way of formulating the ICA problem is to consider the estimation of the following generative model for the data:

x = As (2)

where x is an observed m-dimensional vector, s is an n-dimensional (latent) random vector whose components are assumed mutually independent, and A is a constant m × n matrix to be estimated. It is usually further assumed that the dimensions of x and s are equal, i.e., m = n; we make this assumption in the rest of the paper. A noise vector may also be present. The matrix W defining the transformation in (1) is then obtained as the (pseudo)inverse of the estimate of A. Non-Gaussianity of the independent components is necessary for the identifiability of the model.

A more general formulation of ICA does not need to assume an underlying data model. This definition is based on the concept of mutual information. First, we define the differential entropy H of a random vector y = (y1, ..., yn)T with density f(.):

H(y) = -∫ f(y) log f(y) dy (3)

Differential entropy can be normalized to give rise to the definition of negentropy, which has the appealing property of being invariant under linear transformations. The negentropy J is defined as

J(y) = H(ygauss) - H(y) (4)

where ygauss is a Gaussian random vector with the same covariance matrix as y. Negentropy can also be interpreted as a measure of nongaussianity. Using the concept of differential entropy, one can define the mutual information I between the n (scalar) random variables yi, i = 1, ..., n [8, 7].
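The fixed-point derivation in Section IV assumes sphered (whitened) data, i.e., data linearly transformed so that its covariance is the identity. A minimal sketch of PCA whitening, with an arbitrary illustrative mixing of Gaussian noise standing in for the observed x:

```python
import numpy as np

rng = np.random.default_rng(1)

# Correlated zero-mean data standing in for the observed mixtures x.
x = np.array([[2.0, 0.0], [1.0, 0.5]]) @ rng.standard_normal((2, 5000))
x -= x.mean(axis=1, keepdims=True)

# PCA whitening: z = E D^{-1/2} E^T x, so that the covariance of z is I.
C = np.cov(x)                      # sample covariance matrix E{x x^T}
d, E = np.linalg.eigh(C)           # eigendecomposition C = E diag(d) E^T
V = E @ np.diag(d ** -0.5) @ E.T   # whitening (sphering) matrix
z = V @ x

print(np.allclose(np.cov(z), np.eye(2), atol=1e-8))  # sphered: cov = I
```

After whitening, the remaining task of ICA reduces to finding an orthogonal rotation of z.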
Mutual information is a natural measure of the dependence between random variables. It is particularly interesting to express mutual information using negentropy, constraining the variables to be uncorrelated. In this case we have [7]

I(y1, y2, ..., yn) = J(y) - ∑i J(yi). (5)

Since mutual information is the information-theoretic measure of the independence of random variables, it is natural to use it as the criterion for finding the ICA transform: the ICA of a random vector x is an invertible transformation s = Wx where the matrix W is determined so that the mutual information of the transformed components si is minimized.

Two promising applications of ICA are blind source separation and feature extraction. In blind source separation, the observed values of x correspond to a realization of an m-dimensional discrete-time signal x(t), t = 1, 2, .... The components s(t) are then called source signals, which are usually original, uncorrupted signals or noise sources. Often such sources are statistically independent of each other, and thus the signals can be recovered from the linear mixtures x by finding a transformation in which the transformed signals are as independent as possible, as in ICA.

III. FUNCTIONS FOR ICA
3.1 ICA data model, minimization of mutual information
One popular way of formulating the ICA problem is to consider the estimation of the generative model (2) for the data [1, 3, 5, 6]: x is an observed m-dimensional vector, s is an n-dimensional (latent) random vector whose components are assumed mutually independent, and A is a constant m × n matrix to be estimated. As before, we assume m = n, and the matrix W defining the transformation in (1) is obtained as the (pseudo)inverse of the estimate of A.
Non-Gaussianity of the independent components is necessary for the identifiability of the model (2); see [7]. Comon [7] showed how to obtain a more general formulation for ICA that does not need to assume an underlying data model. This definition is based on the concept of mutual information. First, we define the differential entropy H of a random vector y = (y1, ..., yn)T with density f(.) as follows:

H(y) = -∫ f(y) log f(y) dy (6)

Differential entropy can be normalized to give rise to the definition of negentropy, which has the appealing property of being invariant under linear transformations. The negentropy J is given by

J(y) = H(ygauss) - H(y) (7)

where ygauss is a Gaussian random vector with the same covariance matrix as y. Negentropy can also be interpreted as a measure of nongaussianity [7]. Using the concept of differential entropy, one can define the mutual information I between the n (scalar) random variables yi, i = 1, ..., n [8, 7]. Mutual information is a natural measure of the dependence between random variables. Expressing mutual information using negentropy, and constraining the variables to be uncorrelated, we have [7]

I(y1, y2, ..., yn) = J(y) - ∑i J(yi). (8)

Since mutual information is the information-theoretic measure of the independence of random variables, it is natural to use it as the criterion for finding the ICA transform. Thus we define in this paper, following [7], the ICA of a random vector x as an invertible transformation as in (1), where the matrix W is determined so that the mutual information of the transformed components si is minimized. The decorrelation constraint is not strictly necessary, but it simplifies the computations considerably. Because negentropy is invariant under invertible linear transformations, it is
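The nonnegativity of negentropy in (7), i.e., that the Gaussian maximizes differential entropy among distributions of the same covariance, can be checked numerically. A rough sketch using a crude histogram estimate of differential entropy (the estimator, sample size, and the Laplacian test distribution are all illustrative choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

def entropy_estimate(y, bins=200):
    """Crude histogram estimate of the differential entropy H(y) in (6)."""
    p, edges = np.histogram(y, bins=bins, density=True)
    w = np.diff(edges)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * w[mask])

gauss = rng.standard_normal(n)                       # unit variance
laplace = rng.laplace(scale=1 / np.sqrt(2), size=n)  # also unit variance

# Negentropy J(y) = H(y_gauss) - H(y) should be positive for the
# nongaussian (Laplacian) variable, eq. (7).
J = entropy_estimate(gauss) - entropy_estimate(laplace)
print(J > 0)
```

The same quantity serves as the nongaussianity measure that the contrast functions of Section 3.2 approximate.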
obvious from (8) that finding an invertible transformation W that minimizes the mutual information is roughly equivalent to finding directions in which the negentropy is maximized.

3.2 Approximations of negentropy
To use the definition of ICA given above, a simple estimate of the negentropy (or of differential entropy) is needed. We use here new approximations developed from the maximum entropy principle. In the simplest case, these approximations are of the form

J(yi) ≈ c [E{G(yi)} − E{G(ν)}]² (9)

where G is practically any non-quadratic function, c is an irrelevant constant, and ν is a Gaussian variable of zero mean and unit variance (i.e., standardized). The random variable yi is also assumed to be of zero mean and unit variance. For symmetric variables, this is a generalization of the cumulant-based approximation in [7], which is obtained by taking G(yi) = yi⁴.

The approximation of negentropy in (9) readily gives a new objective function for estimating the ICA transform in our framework. First, to find one independent component, or projection pursuit direction, yi = wTx, we maximize the function JG given by

JG(w) = [E{G(wTx)} − E{G(ν)}]² (10)

where w is an m-dimensional (weight) vector constrained so that E{(wTx)²} = 1 (the scale can be fixed arbitrarily). Several independent components can then be estimated one by one using a deflationary scheme (see Section IV). Second, using the approach of minimizing mutual information, the above one-unit contrast function can be extended to compute the whole matrix W in (1): recall from (8) that mutual information is minimized (under the constraint of decorrelation) when the sum of the negentropies of the components is maximized.

IV. FIXED-POINT ALGORITHM
To begin with, we derive the fixed-point algorithm for sphered data.
First note that the maxima of JG(w) are obtained at certain optima of E{G(wTx)}. According to the Kuhn-Tucker conditions [18], the optima of E{G(wTx)} under the constraint E{(wTx)²} = ||w||² = 1 are obtained at points where

E{x g(wTx)} − βw = 0 (11)

where g denotes the derivative of the function G in (10), and β is a constant that can be evaluated as β = E{w0T x g(w0T x)}, with w0 the value of w at the optimum. Let us solve this equation by Newton's method. Denoting the function on the left-hand side of (11) by F, we obtain its Jacobian matrix JF(w) as

JF(w) = E{x xT g'(wTx)} − βI (12)

To simplify the inversion of this matrix, we approximate the first term in (12). Since the data is sphered, a reasonable approximation is E{x xT g'(wTx)} ≈ E{x xT} E{g'(wTx)} = E{g'(wTx)} I. The Jacobian matrix thus becomes diagonal and can easily be inverted. We also approximate β using the current value of w instead of w0. We thus obtain the following approximate Newton iteration:

w+ = w − [E{x g(wTx)} − βw] / [E{g'(wTx)} − β]
w* = w+ / ||w+|| (13)

where w+ denotes the new value of w, β = E{wT x g(wTx)}, and the normalization has been added to improve stability. This algorithm can be further simplified by multiplying both sides of the first equation in (13) by β − E{g'(wTx)}. This gives the following fixed-point algorithm:

w+ = E{x g(wTx)} − E{g'(wTx)} w
w* = w+ / ||w+|| (14)

which was introduced in [17] using a more heuristic derivation. It is well known that the convergence of Newton's method may be rather uncertain. To ameliorate this, one may add a step size in (13), obtaining the stabilized fixed-point algorithm

w+ = w − µ [E{x g(wTx)} − βw] / [E{g'(wTx)} − β]
w* = w+ / ||w+|| (15)

where β = E{wT x g(wTx)} as above, and µ is a step-size parameter that may change with the iteration count. Taking a µ much smaller than unity (say, 0.1 or 0.01), the algorithm (15) converges with much more certainty.
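The one-unit iteration (14) can be sketched in NumPy on synthetic whitened mixtures. This is a toy illustration: the sources, mixing matrix, and the nonlinearity g = tanh (with g' = 1 − tanh²) are assumptions, and the paper's own experiments used audio recordings with the FastICA MATLAB package:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two independent nongaussian sources, mixed and then sphered.
n = 20_000
s = np.vstack([rng.laplace(size=n),          # super-Gaussian source
               rng.uniform(-1, 1, n)])       # sub-Gaussian source
s -= s.mean(axis=1, keepdims=True)
x = np.array([[1.0, 0.5], [0.3, 1.0]]) @ s

d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(d ** -0.5) @ E.T @ x         # sphered (whitened) data

g = np.tanh                                  # g = G'
dg = lambda y: 1 - np.tanh(y) ** 2           # g'

w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(100):
    # Fixed-point update (14): w+ = E{z g(w^T z)} - E{g'(w^T z)} w
    wz = w @ z
    w_new = (z * g(wz)).mean(axis=1) - dg(wz).mean() * w
    w_new /= np.linalg.norm(w_new)           # w* = w+ / ||w+||
    converged = np.abs(np.abs(w_new @ w) - 1) < 1e-9  # fixed up to sign
    w = w_new
    if converged:
        break

y = w @ z
# The estimate should align (up to sign and scale) with one source.
corr = np.abs(np.corrcoef(np.vstack([y, s]))[0, 1:])
print(corr.max() > 0.95)
```

Convergence is typically reached in a handful of iterations, which is the practical attraction of the fixed-point form.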
In particular, it is often a good strategy to start with µ = 1, in which case the algorithm is equivalent to the original fixed-point algorithm (14). If convergence seems problematic, µ may then be decreased gradually until convergence is satisfactory.

The fixed-point algorithms may also be used directly on the original, i.e., non-sphered, data. Transforming the data back to the non-sphered variables, one sees easily that the following modification of algorithm (14) works for non-sphered data:

w+ = C⁻¹ E{x g(wTx)} − E{g'(wTx)} w
w* = w+ / √((w+)T C w+) (16)

where C = E{x xT} is the covariance matrix of the data. The stabilized version, algorithm (15), can be modified similarly to work with non-sphered data:

w+ = w − µ [C⁻¹ E{x g(wTx)} − βw] / [E{g'(wTx)} − β]
w* = w+ / √((w+)T C w+) (17)

Using this algorithm, one obtains an independent component directly as the linear combination wTx, where x need not be sphered (pre-whitened). These modifications presuppose, of course, that the covariance matrix is not singular. If it is singular or near-singular, the dimension of the data must be reduced, for example with PCA [7].
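The non-sphered variant (16) can be sketched the same way, running the update directly on the raw mixtures with the covariance C in the loop (again a toy illustration with assumed synthetic sources and mixing matrix):

```python
import numpy as np

rng = np.random.default_rng(5)

n = 20_000
s = np.vstack([rng.laplace(size=n), rng.uniform(-1, 1, n)])
s -= s.mean(axis=1, keepdims=True)
x = np.array([[1.0, 0.7], [0.2, 1.0]]) @ s     # NOT pre-whitened

C = np.cov(x)                                  # covariance E{x x^T}
C_inv = np.linalg.inv(C)
g = np.tanh
dg = lambda y: 1 - np.tanh(y) ** 2

w = rng.standard_normal(2)
w /= np.sqrt(w @ C @ w)                        # so that E{(w^T x)^2} = 1
for _ in range(200):
    wx = w @ x
    # Update (16): w+ = C^{-1} E{x g(w^T x)} - E{g'(w^T x)} w
    w_new = C_inv @ (x * g(wx)).mean(axis=1) - dg(wx).mean() * w
    w_new /= np.sqrt(w_new @ C @ w_new)        # w* = w+ / sqrt(w+^T C w+)
    w = w_new

y = w @ x                                      # one independent component
corr = np.abs(np.corrcoef(np.vstack([y, s]))[0, 1:])
print(corr.max() > 0.95)
```

The estimated component again matches one of the sources up to sign and scale, without any separate whitening step.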
V. EXPERIMENTAL RESULTS
Fig. 1: Mixed audio signals of a bird and a horn.
Fig. 2: Calculation of entropy (magnitude of the entropy gradient over 100 iterations).
Fig. 3: Calculation of the gradient entropy (entropy function values h(Y) over 100 iterations).
Fig. 4: Separated source audio of the horn.
Fig. 5: Separated source audio of the bird.

VI. CONCLUSION
The problem of linear independent component analysis (ICA), which is a form of redundancy reduction, was addressed. The main advantage of the fixed-point algorithms is that their convergence can be shown to be very fast (cubic, or at least quadratic). Combining the good statistical properties (e.g., robustness) of the new contrast functions with the good algorithmic properties of the fixed-point algorithm, a very appealing method for ICA was obtained. Simulations as well as applications on real-life data have validated the contrast functions and algorithms introduced. Extensions of the methods introduced in this paper address the problem of noisy data and the situation where there are more independent components than observed variables.

REFERENCES
[1] S.-I. Amari, A. Cichocki, and H. H. Yang. A new learning algorithm for blind source separation. In Advances in Neural Information Processing Systems 8, pages 757–763. MIT Press, 1996.
[2] H. B. Barlow. Possible principles underlying the transformations of sensory messages. In W. A. Rosenblith, editor, Sensory Communication, pages 217–234. MIT Press, 1961.
[3] A. J. Bell and T. J. Sejnowski. An information-maximization approach to blind separation and blind deconvolution. Neural Computation, 7:1129–1159, 1995.
[4] A. J. Bell and T. J. Sejnowski. The 'independent components' of natural scenes are edge filters. Vision Research, 37:3327–3338, 1997.
[5] J.-F. Cardoso and B. Hvam Laheld. Equivariant adaptive source separation. IEEE Trans. on Signal Processing, 44(12):3017–3030, 1996.
[6] A. Cichocki and R. Unbehauen. Neural Networks for Signal Processing and Optimization. Wiley, 1994.
[7] P. Comon. Independent component analysis, a new concept? Signal Processing, 36:287–314, 1994.
[8] T. M. Cover and J. A. Thomas. Elements of Information Theory. Wiley, 1991.
[9] N. Delfosse and P. Loubaton. Adaptive blind separation of independent sources: a deflation approach. Signal Processing, 45:59–83, 1995.
[10] The FastICA MATLAB package. Available at http://www.cis.hut.fi/projects/ica/fastica/.
[11] J. H. Friedman and J. W. Tukey. A projection pursuit algorithm for exploratory data analysis. IEEE Trans. on Computers, C-23(9):881–890, 1974.
[12] J. H. Friedman. Exploratory projection pursuit. J. of the American Statistical Association, 82(397):249–266, 1987.
[13] X. Giannakopoulos, J. Karhunen, and E. Oja. Experimental comparison of neural ICA algorithms. In Proc. Int. Conf. on Artificial Neural Networks (ICANN'98), pages 651–656, Skövde, Sweden, 1998.
[14] F. R. Hampel, E. M. Ronchetti, P. J. Rousseeuw, and W. A. Stahel. Robust Statistics. Wiley, 1986.
[15] H. H. Harman. Modern Factor Analysis. University of Chicago Press, 2nd edition, 1967.
[16] P. J. Huber. Projection pursuit. The Annals of Statistics, 13(2):435–475, 1985.
[17] A. Hyvärinen. A family of fixed-point algorithms for independent component analysis. In Proc. IEEE Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP'97), pages 3917–3920, Munich, Germany, 1997.
[18] D. Luenberger. Optimization by Vector Space Methods. Wiley, 1969.