Abstract
This paper addresses the problem of tracking time-varying optimal trajectories for convex optimization problems where the objective function is time-varying and its explicit form is unknown but measurable. We propose a discrete-time extremum seeking algorithm that leverages sampled-data measurements to approximate derivatives and iteratively update the system state. The algorithm employs five sampling points within each interval to estimate the gradient, Hessian, and mixed time derivatives of the objective function via finite difference approximations. Under the assumptions of uniform strong convexity and smoothness of the objective function with bounded derivatives, we establish the practical stability of the algorithm and derive an ultimate bound for the tracking error. The key innovation lies in transforming the continuous-time dynamic optimization problem into a discrete iterative framework while rigorously quantifying approximation errors. The efficiency of the new approach is demonstrated by an example.
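The abstract does not give the update law explicitly, but the described measurement scheme can be sketched as follows: per sampling interval, five measurements of the unknown objective f(x, t) are taken and combined by finite differences into estimates of the gradient f_x, the Hessian f_xx, and the mixed derivative f_xt, which then drive a Newton-type tracking step. The step-size choices, the gain `alpha`, and the prediction-correction form of the update are illustrative assumptions, not the paper's exact algorithm.

```python
import math

def sampled_tracker(f, x0, t0, T, steps, delta=1e-4, alpha=5.0):
    """Track the minimizer of a time-varying objective f(x, t) using only
    sampled measurements: five samples per interval yield finite-difference
    estimates of f_x, f_xx and f_xt (illustrative sketch, not the paper's
    exact update law)."""
    x, t = x0, t0
    for _ in range(steps):
        # Five sampled measurements within one interval:
        fm  = f(x - delta, t)      # 1: left spatial sample
        f0  = f(x,         t)      # 2: center sample
        fp  = f(x + delta, t)      # 3: right spatial sample
        fmh = f(x - delta, t + T)  # 4: left sample, advanced time
        fph = f(x + delta, t + T)  # 5: right sample, advanced time
        # Finite-difference approximations of the derivatives:
        grad  = (fp - fm) / (2 * delta)                    # f_x
        hess  = (fp - 2 * f0 + fm) / delta ** 2            # f_xx
        mixed = ((fph - fmh) - (fp - fm)) / (2 * delta * T)  # f_xt
        # Newton-type prediction-correction step (assumed form):
        # the alpha*grad term corrects toward the current optimum,
        # the mixed term predicts the optimum's drift in time.
        x = x - T * (alpha * grad + mixed) / hess
        t += T
    return x, t
```

For example, with f(x, t) = (x - sin t)^2 the minimizer is x*(t) = sin t, and the sketch above tracks it with a small ultimate error, consistent with the practical-stability result claimed in the abstract.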
| Original language | English |
|---|---|
| Pages (from-to) | 2347-2352 |
| Number of pages | 6 |
| Journal | Youth Academic Annual Conference of Chinese Association of Automation, YAC |
| Issue number | 2025 |
| DOIs | |
| State | Published - 2025 |
| Event | 40th Youth Academic Annual Conference of Chinese Association of Automation, YAC 2025, Zhengzhou, China, 17–19 May 2025 |
Keywords
- Extremum seeking
- Sampled-data
- Time-varying optimization