Alternative Approach for Computing the Activation Factor of the PNLMS Algorithm

Authors:
Francisco das C. de Souza, Orlando José Tobias, and Rui Seara
LINSE - Circuits and Signal Processing Laboratory
Federal University of Santa Catarina
{fsouza, orlando, seara}@linse.ufsc.br

Dennis R. Morgan
Bell Laboratories, Alcatel-Lucent
drrm@bell-labs.com
INTRODUCTION

Sparse impulse responses are encountered in many real-world applications:
communications, acoustics, and seismic and chemical processes.
The NLMS algorithm uses the same step-size value for all filter coefficients
⇒ slow convergence for such responses.

Algorithms that exploit the sparse nature of the impulse response:
• PNLMS (proportionate normalized least-mean-square)
• PNLMS++ (both the NLMS and PNLMS updates are used in the coefficient vector update)
• IPNLMS (improved PNLMS)
• SC-PNLMS (sparseness-controlled PNLMS)
INTRODUCTION
The standard PNLMS algorithm: formulation

Coefficient update (N × 1):

    w(n+1) = w(n) + μ G(n) e(n) x(n) / [x^T(n) G(n) x(n) + ε]

Gain distribution matrix (N × N):

    G(n) = diag[g_1(n), g_2(n), …, g_N(n)]

Individual gain:

    g_i(n) = φ_i(n) / Σ_{j=1}^{N} φ_j(n)

Proportionality function:

    φ_i(n) = max[f(n), |w_i(n)|]

Activation factor:

    f(n) = ρ max[δ, ||w(n)||_∞]
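As a concrete sketch (not the authors' code), one PNLMS iteration can be written in NumPy; `mu`, `rho`, `delta`, and `eps` correspond to μ, ρ, δ, and ε above, with the default values used later in the simulations, and ε assumed to be a small regularization constant:

```python
import numpy as np

def pnlms_update(w, x, e, mu=0.5, rho=0.05, delta=0.01, eps=1e-4):
    """One PNLMS iteration: returns the updated coefficient vector w(n+1).

    w : current coefficient estimate w(n), shape (N,)
    x : input regressor x(n), shape (N,)
    e : scalar a priori error e(n)
    """
    # Activation factor f(n) = rho * max(delta, ||w(n)||_inf)
    f = rho * max(delta, np.max(np.abs(w)))
    # Proportionality function phi_i(n) = max(f(n), |w_i(n)|)
    phi = np.maximum(f, np.abs(w))
    # Individual gains g_i(n) = phi_i(n) / sum_j phi_j(n)
    g = phi / phi.sum()
    # Proportionate normalized update
    return w + mu * g * e * x / (x @ (g * x) + eps)
```

Note that with w(n) = 0 all φ_i are equal, the gains become uniform (1/N), and the update reduces to a regularized NLMS step.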
INTRODUCTION
The standard PNLMS algorithm performance depends on predefined parameters:

- δ (initialization)
- ρ (proportionality or activation)

• These parameters are related to the algorithm variable termed the
  ACTIVATION FACTOR f(n).
• The initialization parameter permits starting the adaptation process at
  n = 0, when all filter coefficients are initialized to zero:

      f(0) = ρ max[δ, ||w(0)||_∞] = ρδ

• The proportionality parameter prevents an individual coefficient from
  freezing when its magnitude is much smaller than the largest coefficient
  magnitude.

A central point: how to set suitable values for these parameters,
since they impact the algorithm convergence speed?
INTRODUCTION
Activation factor in the standard PNLMS algorithm

- Common to all coefficients, computed sample by sample.
- Depends on ||w(n)||_∞.
- Leads to a gain distribution among the adaptive filter coefficients that is
  not entirely in line with the concept of proportionality.

Proposed approach: individual activation factor PNLMS (IAF-PNLMS)

- An individual activation factor is used for each adaptive filter coefficient.
- Each individual activation factor is computed in terms of the
  corresponding coefficient magnitude.

Consequence: for impulse responses having high sparseness, numerical
simulations show that the proposed approach converges faster, and responds
faster to perturbations of the system plant, than both the PNLMS and
IPNLMS algorithms.
STANDARD PNLMS ALGORITHM DISCUSSION

Gain for inactive coefficients:

    g_i^inactive(n) = f(n) / Σ_{j=1}^{N} φ_j(n)

Gain for active coefficients:

    g_i^active(n) = |w_i(n)| / Σ_{j=1}^{N} φ_j(n)

Total gain distributed over the filter coefficients at each iteration:

    tr[G(n)] = (N − N_active) f(n) / Σ_{j=1}^{N} φ_j(n) + Σ_{i∈A} g_i^active(n) = 1

The activation factor affects the gains assigned to both
active and inactive coefficients.
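The trace decomposition above can be checked numerically; the coefficient values below are hypothetical, chosen only to give two active and eight inactive taps:

```python
import numpy as np

# Hypothetical sparse estimate: two "active" taps among N = 10
w = np.zeros(10)
w[[2, 7]] = [1.0, 0.4]

rho, delta = 0.05, 0.01
f = rho * max(delta, np.max(np.abs(w)))        # common activation factor f(n)
phi = np.maximum(f, np.abs(w))                 # phi_i(n)
denom = phi.sum()

active = np.abs(w) > f                         # taps with |w_i(n)| > f(n)
gain_inactive = (~active).sum() * f / denom    # (N - N_active) f(n) / sum phi
gain_active = np.abs(w)[active].sum() / denom  # sum over i in A of |w_i(n)| / sum phi
total = gain_inactive + gain_active            # tr[G(n)]
```

Here `total` equals 1, illustrating that the unit total gain is split between the two groups through the same activation factor.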
Standard PNLMS algorithm performance with respect to f(n)

Scenario for all numerical simulations:
• Sparse impulse response p with N = 100 coefficients.
  Active coefficient values {0.1, 1.0, 0.5, 0.1} located at positions
  {1, 30, 35, 85}, respectively; sparseness S(p) = 0.9435.
• Input signal: correlated unity-variance AR(2) process with χ = 74;
  μ = 0.5, δ = 0.01.

Normalized misalignment measure:

    κ(n) = 10 log10( ||p − w(n)||_2^2 / ||p||_2^2 )
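The misalignment measure is straightforward to compute; a minimal helper (the function name is ours, not the authors'):

```python
import numpy as np

def misalignment_db(p, w):
    """Normalized misalignment kappa(n) = 10 log10(||p - w||_2^2 / ||p||_2^2)."""
    return 10.0 * np.log10(np.sum((p - w) ** 2) / np.sum(p ** 2))
```

With w(n) = 0 the measure is 0 dB, and it decreases (becomes more negative) as w(n) approaches p.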
Total gain distribution over L iterations:

    θ_i = Σ_{n=0}^{L−1} g_i(n)

Average of θ_i over the inactive coefficients:

    θ_mean^inactive = [1 / (N − N_active)] Σ_{i∉A} θ_i ,   A = {1, 30, 35, 85}
At the beginning of the learning phase (0 ≤ n < 30):

    |w_30(n)| < |w_1(n)|  ⇒  g_30(n) < g_1(n)

Desired condition (since p_1 = 0.1 and p_30 = 1.0):

    g_30(n) > g_1(n)
MODIFIED PNLMS ALGORITHM

Features of the standard PNLMS algorithm:
1) When w_i(n) is an active coefficient, its gain is always proportional to |w_i(n)|.
2) When w_i(n) is inactive, its gain is not proportional to |w_i(n)|.

Objective: to overcome drawback (2) by making the gain g_i(n) tend towards
being proportional to |w_i(n)| even when w_i(n) is inactive.

To this end, f(n) is replaced by an individual factor f_i(n), so that

    φ_i(n) = max[f_i(n), |w_i(n)|]

    g_i^inactive(n) = f_i(n) / Σ_{j=1}^{N} φ_j(n)
MODIFIED PNLMS ALGORITHM

Conditions required for the new activation factor f_i(n):

C1) f_i(n) must converge to the corresponding coefficient magnitude |w_i(n)|:

    lim_{n→∞} [f_i(n) − |w_i(n)|] = 0 ,   i = 1, 2, …, N

C2) f_i(n) must always be greater than zero, i.e.,

    f_i(n) > 0 ,   i = 1, 2, …, N

If C1 is fulfilled, g_i^inactive(n) tends to be proportional to |w_i(n)| as n → ∞.
C2 ensures that g_i(n) > 0 when w_i(n) = 0, avoiding the freezing of w_i(n).
MODIFIED PNLMS ALGORITHM

Proposed approach for computing f_i(n):

    f_i(n) = γ |w_i(n)| + (1 − γ) φ_i(n − 1),   0 < γ < 1

The first term pursues the intended aim (C1); the second term keeps f_i(n)
positive (C2).

Since no knowledge of the system plant is available a priori, it is
reasonable to choose γ = 1/2.

The activation factors are initialized with a small positive constant
(typically, f_i(0) = 10^-2 / N).
MODIFIED PNLMS ALGORITHM

For proper algorithm operation, the instantaneous magnitude of the estimated
coefficients should be proportional to the magnitude of the corresponding
plant coefficients; however, |w_i(n)| may not be proportional to |p_i| at the
beginning of the adaptation process. The IAF-PNLMS algorithm therefore
updates the individual activation factors only once every N iterations:

    f_i(n) = (1/2) |w_i(n)| + (1/2) φ_i(n − 1),   n = mN
    f_i(n) = f_i(n − 1),                          otherwise

where N is the adaptive filter length and m = 1, 2, 3, …
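The recursion above can be sketched as follows; `iaf_pnlms_gains` is our name for a hypothetical helper that also forms the resulting per-coefficient gains:

```python
import numpy as np

def iaf_pnlms_gains(w, f_prev, phi_prev, n, N):
    """One step of the IAF-PNLMS gain computation (sketch).

    w                : current coefficient estimate w(n), shape (N,)
    f_prev, phi_prev : f_i(n-1) and phi_i(n-1) from the previous iteration
    Returns (g, f, phi) for iteration n.
    """
    if n > 0 and n % N == 0:                # update instants n = mN, m = 1, 2, 3, ...
        f = 0.5 * np.abs(w) + 0.5 * phi_prev
    else:
        f = f_prev                          # f_i is held between update instants
    phi = np.maximum(f, np.abs(w))          # phi_i(n) = max[f_i(n), |w_i(n)|]
    g = phi / phi.sum()                     # g_i(n) = phi_i(n) / sum_j phi_j(n)
    return g, f, phi
```

Between update instants the activation factors are frozen, so only the φ_i and gains track the current coefficient magnitudes; every N samples the factors themselves are pulled toward |w_i(n)|, in line with condition C1.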
NUMERICAL SIMULATIONS

Example 1: a perturbation in the plant takes place at n = 2500, whereby the
plant vector p is changed to −p.

Parameter values: μ = 0.5, ρ = 0.05, δ = 0.01, α = 0, f_i(0) = 10^-4.
NUMERICAL SIMULATIONS

Example 2: a perturbation in the plant takes place at n = 2500, whereby the
plant vector p is shifted to the right by 12 samples, changing the positions
of all active coefficients:

    {1, 30, 35, 85}  ⇒  {13, 42, 47, 97}

Parameter values: μ = 0.5, ρ = 0.05, δ = 0.01, α = 0, f_i(0) = 10^-4.
CONCLUSIONS

• The IAF-PNLMS algorithm uses an individual activation factor for each
  adaptive filter coefficient.

• The IAF-PNLMS algorithm presents a better gain distribution than the
  PNLMS and IPNLMS algorithms.

• The IAF-PNLMS algorithm provides an improvement in convergence speed
  for plant impulse responses having high sparseness.
