
Conditions for Convergence of Dynamic Regressor Extension and Mixing Parameter Estimator Using LTI Filters


Abstract: In this note we study the conditions for convergence of the recently introduced dynamic regressor extension and mixing (DREM) parameter estimator when the extended regressor is generated using LTI filters. In particular, we are interested in relating these conditions with the ones required for convergence of the classical gradient (or least squares) estimator, namely the well-known persistent excitation (PE) requirement on the original regressor vector, $\phi(t) \in \mathbb{R}^q$, with $q \in \mathbb{N}$ the number of unknown parameters. Moreover, we study the case when only interval excitation (IE) is available, under which DREM, concurrent and composite learning schemes ensure global convergence, with DREM achieving convergence in finite time. Regarding PE we prove that if $\phi(t)$ is PE then the scalar regressor of DREM, $\Delta(t) \in \mathbb{R}$, is also PE, ensuring exponential convergence. Concerning IE we prove that if $\phi(t)$ is IE then $\Delta(t)$ is also IE. All these results are established in the almost sure sense, namely proving that the set of filter parameters for which the claims do not hold is of zero measure. The main technical tool used in our proof is inspired by a study of Luenberger observers for nonautonomous nonlinear systems recently reported in the literature.
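For context, the following is a brief sketch of the standard DREM construction the abstract refers to; the regression output $y$, the filters $H_i(p)$, and the gains $\gamma_i$ are notation assumed here for illustration, not taken from the paper. Given the linear regression $y(t) = \phi^\top(t)\,\theta$, one applies $q-1$ stable LTI filters to generate the extended regressor and then mixes with the adjugate matrix:

$$ y_i(t) = H_i(p)[y](t), \qquad \phi_i(t) = H_i(p)[\phi](t), \qquad i = 1,\dots,q-1, $$
$$ Y_e(t) := \operatorname{col}\big(y(t), y_1(t), \dots, y_{q-1}(t)\big) = \Phi_e(t)\,\theta, \qquad \Phi_e(t) := \operatorname{col}\big(\phi^\top(t), \phi_1^\top(t), \dots, \phi_{q-1}^\top(t)\big) \in \mathbb{R}^{q \times q}, $$
$$ \mathcal{Y}(t) := \operatorname{adj}\{\Phi_e(t)\}\, Y_e(t) = \Delta(t)\,\theta, \qquad \Delta(t) := \det\{\Phi_e(t)\}, $$

so each unknown parameter satisfies the scalar regression $\mathcal{Y}_i(t) = \Delta(t)\,\theta_i$ and can be estimated with a scalar gradient update $\dot{\hat{\theta}}_i = \gamma_i\,\Delta(t)\big(\mathcal{Y}_i(t) - \Delta(t)\,\hat{\theta}_i\big)$. In this setting, PE of the scalar regressor means $\int_t^{t+T} \Delta^2(s)\,ds \ge \delta > 0$ for some $T, \delta > 0$ and all $t \ge 0$, which is the condition the note relates to PE of $\phi(t)$.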
