
Nearly Optimal Algorithms with Sublinear Computational Complexity for Online Kernel Regression

Junfan Li, Shizhong Liao*
*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

The trade-off between regret and computational cost is a fundamental problem for online kernel regression, and previous algorithms addressing this trade-off cannot maintain optimal regret bounds at a sublinear computational complexity. In this paper, we propose two new algorithms, AOGD-ALD and NONS-ALD, which keep nearly optimal regret bounds at a sublinear computational complexity, and we give sufficient conditions under which our algorithms work. Both algorithms dynamically maintain a set of nearly orthogonal basis functions used to approximate the kernel mapping, and keep nearly optimal regret bounds by controlling the approximation error. The number of basis functions depends on the approximation error and on the decay rate of the eigenvalues of the kernel matrix. If the eigenvalues decay exponentially, then AOGD-ALD and NONS-ALD achieve regret bounds of O(√L(f)) and O(d_eff(µ) ln T), respectively, at a computational complexity in O(ln² T). If the eigenvalues decay polynomially with degree p ≥ 1, then our algorithms keep the same regret bounds at a computational complexity in o(T) in the cases of p > 4 and p ≥ 10, respectively. Here L(f) is the cumulative loss of f, and d_eff(µ) is the effective dimension of the problem. The two regret bounds are nearly optimal and are not comparable.
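The abstract's core mechanism — dynamically maintaining a small dictionary of nearly orthogonal basis elements and adding a new point only when its approximation error exceeds a threshold — follows the classical approximate-linear-dependence (ALD) idea. The sketch below is an illustrative ALD dictionary update in the style of kernel recursive least squares, not the paper's exact AOGD-ALD/NONS-ALD procedure; the RBF kernel, the threshold name `mu`, and all function names are assumptions for illustration.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Gaussian (RBF) kernel; a stand-in choice, not specified by the paper.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def ald_update(dictionary, K_inv, x_t, mu=0.1, gamma=1.0):
    """One ALD test: add x_t to the dictionary only if its squared
    projection error onto span{k(d_i, .)} exceeds the threshold mu.
    Returns (dictionary, K_inv, added_flag)."""
    if not dictionary:
        # First point always enters; K_inv is the 1x1 inverse kernel matrix.
        k_tt = rbf(x_t, x_t, gamma)
        return [x_t], np.array([[1.0 / k_tt]]), True
    k_t = np.array([rbf(d, x_t, gamma) for d in dictionary])
    a = K_inv @ k_t                          # best coefficients in the current basis
    delta = rbf(x_t, x_t, gamma) - k_t @ a   # squared residual (approximation error)
    if delta > mu:
        # Extend the dictionary and grow K_inv via the block-inverse formula,
        # using delta as the Schur complement.
        n = len(dictionary)
        K_inv_new = np.zeros((n + 1, n + 1))
        K_inv_new[:n, :n] = K_inv + np.outer(a, a) / delta
        K_inv_new[:n, n] = -a / delta
        K_inv_new[n, :n] = -a / delta
        K_inv_new[n, n] = 1.0 / delta
        return dictionary + [x_t], K_inv_new, True
    return dictionary, K_inv, False

# Streaming usage: the dictionary size tracks the effective dimension of the
# data rather than the number of rounds T.
rng = np.random.default_rng(0)
D, K_inv = [], None
for _ in range(50):
    D, K_inv, _ = ald_update(D, K_inv, rng.standard_normal(2), mu=0.3)
```

Because per-round cost scales with the dictionary size rather than with T, a slowly growing dictionary is what makes the sublinear computational complexity in the abstract plausible.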

Original language: English
Pages (from-to): 20384-20411
Number of pages: 28
Journal: Proceedings of Machine Learning Research
Volume: 202
State: Published - 2023
Externally published: Yes
Event: 40th International Conference on Machine Learning, ICML 2023 - Honolulu, United States
Duration: 23 Jul 2023 - 29 Jul 2023
