Alternative Approach for Computing the Activation Factor of the PNLMS Algorithm

Authors
Francisco das C. de Souza, Orlando José Tobias, and Rui Seara
LINSE - Circuits and Signal Processing Laboratory
Federal University of Santa Catarina
{fsouza, orlando, seara}@linse.ufsc.br

Dennis R. Morgan
Bell Laboratories, Alcatel-Lucent
drrm@bell-labs.com
INTRODUCTION

Sparse impulse responses are encountered in many real-world applications:
   communications, acoustics, and seismic and chemical processes.
The NLMS algorithm uses the same step-size value for all filter coefficients
   ⇒ slow convergence for sparse responses.

Algorithms that exploit the sparse nature of the impulse response:
 PNLMS (proportionate normalized least-mean-square)
 PNLMS++ (both the NLMS and PNLMS updates are used in the coefficient vector update)
 IPNLMS (improved PNLMS)
 SC-PNLMS (sparseness-controlled PNLMS)
INTRODUCTION
Formulation of the standard PNLMS algorithm

Coefficient update (N × 1):

    w(n + 1) = w(n) + μ G(n) e(n) x(n) / [xᵀ(n) G(n) x(n) + ε]

Gain distribution matrix (N × N):

    G(n) = diag[g₁(n), g₂(n), …, g_N(n)]

Individual gain:

    gᵢ(n) = φᵢ(n) / Σ_{i=1}^{N} φᵢ(n)

Proportionality function:

    φᵢ(n) = max[f(n), |wᵢ(n)|]

Activation factor:

    f(n) = ρ max[δ, ‖w(n)‖_∞]
INTRODUCTION
The performance of the standard PNLMS algorithm depends on predefined parameters:
       δ (initialization)
       ρ (proportionality or activation)

   •   These parameters are related to the algorithm variable termed the
       ACTIVATION FACTOR f(n).
   •   The initialization parameter δ permits starting the adaptation process at
       n = 0, when all filter coefficients are initialized to zero:
           ⇒ f(0) = ρ max[δ, ‖w(0)‖_∞] = ρδ
   •   The proportionality parameter ρ prevents an individual coefficient from
       freezing when its magnitude is much smaller than the largest coefficient
       magnitude.
A central point: How to set suitable values for these parameters,
                 since they impact the algorithm convergence speed?
INTRODUCTION
Activation factor in the standard PNLMS algorithm

Common to all coefficients, computed sample by sample.

Depends on ‖w(n)‖_∞.

Leads to a gain distribution among the adaptive filter coefficients that is not
entirely in line with the concept of proportionality.

 Proposed approach: Individual activation factor PNLMS (IAF-PNLMS)
An individual activation factor is used for each adaptive filter coefficient.
Each individual activation factor is computed in terms of the
corresponding coefficient magnitude.
 Consequence
 For impulse responses having high sparseness, numerical simulations show that
 the proposed approach has faster convergence as well as faster response to
 perturbations of the system plant than both the PNLMS and IPNLMS
 algorithms.
STANDARD PNLMS ALGORITHM DISCUSSION

Gain for inactive coefficients:

    gᵢ^inactive(n) = f(n) / Σ_{i=1}^{N} φᵢ(n)

Gain for active coefficients:

    gᵢ^active(n) = |wᵢ(n)| / Σ_{i=1}^{N} φᵢ(n)

Total gain distributed over the filter coefficients at each iteration:

    tr[G(n)] = (N − N_active) f(n) / Σ_{i=1}^{N} φᵢ(n) + Σ_{i∈A} gᵢ^active(n) = 1
   The activation factor affects the gains assigned to both
    active and inactive coefficients.
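A quick numerical check of this unit-trace property; the coefficient snapshot and parameter values are illustrative:

```python
import numpy as np

w = np.array([0.1, 0.0, 1.0, 0.0, 0.5])      # hypothetical coefficient snapshot
rho, delta = 0.05, 0.01
f = rho * max(delta, np.max(np.abs(w)))       # f(n) = rho * max(delta, ||w||_inf)
phi = np.maximum(f, np.abs(w))                # phi_i(n)
g = phi / phi.sum()                           # g_i(n)

inactive = np.abs(w) < f                      # coefficients governed by f(n)
# All inactive coefficients receive the same gain f(n)/sum(phi), while each
# active one receives |w_i(n)|/sum(phi); the gains always sum to tr[G(n)] = 1.
print(g[inactive], g.sum())
```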
Standard PNLMS algorithm performance with respect to f(n)
Scenario for all numerical simulations
 • Sparse impulse response p with N = 100 coefficients;
   active coefficient values {0.1, 1.0, 0.5, 0.1} located at
   positions {1, 30, 35, 85}, respectively; sparseness S(p) = 0.9435
 • Input signal: correlated unity-variance AR(2) process with χ = 74
 • μ = 0.5, δ = 0.01


Normalized misalignment measure:

    κ(n) = 10 log₁₀ ( ‖p − w(n)‖₂² / ‖p‖₂² )
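This measure is straightforward to compute; a small helper (the function name is illustrative):

```python
import numpy as np

def misalignment_db(p, w):
    """kappa(n) = 10*log10(||p - w||_2^2 / ||p||_2^2), in dB."""
    return 10.0 * np.log10(np.sum((p - w) ** 2) / np.sum(p ** 2))

p = np.array([0.1, 1.0, 0.5, 0.1])
print(misalignment_db(p, np.zeros_like(p)))  # 0 dB for w = 0
print(misalignment_db(p, 0.9 * p))           # ~ -20 dB at 10% residual error
```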
Total gain distribution over L iterations:

    θᵢ = Σ_{n=0}^{L−1} gᵢ(n)

Average of θᵢ over the inactive coefficients:

    θ_mean^inactive = [1 / (N − N_active)] Σ_{i∉A} θᵢ ,   A = {1, 30, 35, 85}
At the beginning of the learning phase (0 ≤ n < 30):

    |w₃₀(n)| < |w₁(n)|  ⇒  g₃₀(n) < g₁(n)

Desired condition (since p₁ = 0.1 and p₃₀ = 1.0):

    g₃₀(n) > g₁(n)
MODIFIED PNLMS ALGORITHM

Features of the standard PNLMS algorithm:
1) When wᵢ(n) is an active coefficient, its gain is always proportional to |wᵢ(n)|.
2) When wᵢ(n) is inactive, its gain is not proportional to |wᵢ(n)|.

Objective:
   To overcome drawback (2) by making the gain gᵢ(n) tend toward
   being proportional to |wᵢ(n)| even when wᵢ(n) is inactive.
f(n) is replaced by fᵢ(n):

    ⇒ φᵢ(n) = max[fᵢ(n), |wᵢ(n)|]

    ⇒ gᵢ^inactive(n) = fᵢ(n) / Σ_{i=1}^{N} φᵢ(n)
MODIFIED PNLMS ALGORITHM
Conditions Required for the New Activation Factor fi (n)

C1) fᵢ(n) must converge to the corresponding coefficient magnitude |wᵢ(n)|:

    lim_{n→∞} [fᵢ(n) − |wᵢ(n)|] = 0 ,   i = 1, 2, …, N

C2) fᵢ(n) must always be greater than zero, i.e.,

    fᵢ(n) > 0 ,   i = 1, 2, …, N

If C1 is fulfilled, gᵢ^inactive(n) tends to be proportional to |wᵢ(n)| as n → ∞.
C2 ensures that gᵢ(n) > 0 when wᵢ(n) = 0, avoiding the freezing of wᵢ(n).
MODIFIED PNLMS ALGORITHM
Proposed Approach for Computing fi (n)


    fᵢ(n) = γ |wᵢ(n)| + (1 − γ) φᵢ(n − 1) ,   0 < γ < 1

The first term drives fᵢ(n) toward |wᵢ(n)| (intended aim, C1); the second
term keeps fᵢ(n) strictly positive (C2).

Since no knowledge of the system plant is available a priori, it is
reasonable to choose γ = 1/2.
The activation factors are initialized with a small positive constant
(typically, fᵢ(0) = 10⁻²/N).
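A tiny check of how this recursion behaves for a fixed coefficient magnitude (values illustrative): an active coefficient pulls fᵢ(n) up to |wᵢ|, while a zero coefficient makes fᵢ(n) decay geometrically yet stay strictly positive.

```python
gamma = 0.5

# Active coefficient, |w_i| = 0.8 held fixed: f_i reaches |w_i| (condition C1).
f = 1e-4                                   # f_i(0): small positive constant
for _ in range(5):
    phi = max(f, 0.8)                      # phi_i(n-1) = max(f_i(n-1), |w_i(n-1)|)
    f = gamma * 0.8 + (1 - gamma) * phi
assert abs(f - 0.8) < 1e-12

# Coefficient at zero: f_i halves each step but never reaches zero (condition C2).
f = 1e-4
for _ in range(50):
    phi = max(f, 0.0)
    f = gamma * 0.0 + (1 - gamma) * phi
assert 0.0 < f < 1e-4
```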
MODIFIED PNLMS ALGORITHM

For proper algorithm operation, it is required that the instantaneous
magnitude of the estimated coefficients be proportional to the magnitude
of the corresponding plant coefficients.

|wᵢ(n)| may not be proportional to |pᵢ|
at the beginning of the adaptation process.

                          IAF-PNLMS
    fᵢ(n) = (1/2)|wᵢ(n)| + (1/2)φᵢ(n − 1) ,   n = mN
    fᵢ(n) = fᵢ(n − 1) ,                        otherwise

where N is the adaptive filter length and m = 1, 2, 3, …
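Putting the pieces together, a minimal IAF-PNLMS sketch under stated assumptions (function name, toy plant, and test signal are illustrative; γ = 1/2 as above):

```python
import numpy as np

def iaf_pnlms(x_sig, d_sig, N, mu=0.5, eps=1e-4, f0=1e-4):
    """IAF-PNLMS: one individual activation factor f_i per coefficient,
    refreshed every N samples (n = mN) and held constant in between."""
    w = np.zeros(N)
    f = np.full(N, f0)                       # f_i(0): small positive constant
    phi = np.maximum(f, np.abs(w))           # phi_i(0)
    xbuf = np.zeros(N)
    for n in range(len(x_sig)):
        xbuf = np.roll(xbuf, 1); xbuf[0] = x_sig[n]
        if n > 0 and n % N == 0:             # n = mN, m = 1, 2, 3, ...
            f = 0.5 * np.abs(w) + 0.5 * phi  # f_i(n) from phi_i(n-1)
        phi = np.maximum(f, np.abs(w))       # phi_i(n) = max(f_i(n), |w_i(n)|)
        g = phi / phi.sum()                  # individual gains, diag of G(n)
        e = d_sig[n] - w @ xbuf
        w = w + mu * g * e * xbuf / (xbuf @ (g * xbuf) + eps)
    return w

# Toy run: sparse 16-tap plant driven by white noise.
rng = np.random.default_rng(1)
p = np.zeros(16); p[3] = 1.0; p[10] = -0.5
x = rng.standard_normal(4000)
d = np.convolve(x, p)[:len(x)]               # noiseless plant output
w = iaf_pnlms(x, d, N=16)
```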
NUMERICAL SIMULATIONS
Example 1
A perturbation in the plant takes place at n = 2500, whereby the plant
vector p is changed to −p.

Parameter values:
μ = 0.5
ρ = 0.05
δ = 0.01
α = 0
fᵢ(0) = 10⁻⁴
NUMERICAL SIMULATIONS
Example 2
A perturbation in the plant takes place at n = 2500, whereby the plant vector
p is shifted to the right by 12 samples, changing the positions of all
active coefficients:
   {1, 30, 35, 85} ⇒ {13, 42, 47, 97}

Parameter values:
μ = 0.5
ρ = 0.05
δ = 0.01
α = 0
fᵢ(0) = 10⁻⁴
CONCLUSIONS


The IAF-PNLMS algorithm uses an individual activation factor for each
adaptive filter coefficient.


The IAF-PNLMS algorithm achieves a better gain distribution than the
PNLMS and IPNLMS algorithms.



The IAF-PNLMS algorithm provides an improvement in convergence
speed for plant impulse responses having high sparseness.
