Tilted least squares robust estimators

Biqiang Mu, Er Wei Bai, Wei Xing Zheng

Research output: Contribution to journal › Article › peer-review

Abstract

In practical scenarios, data collected for identification may be contaminated by unexpected disturbances with large amplitudes. In such cases the ordinary least squares estimator, commonly used for identification, may fail to deliver satisfactory performance, and robust estimators that can withstand the influence of contaminated data become essential. This paper introduces the tilted least squares (TLS) robust estimator for handling outliers and heavy-tailed noises, which minimizes a weighted quadratic loss function with the weights constrained by the Kullback–Leibler (KL) divergence. The resulting TLS estimator assigns each data point a weight that is an exponential function of its negative squared residual, effectively mitigating the influence of unexpected disturbances with large amplitudes. Additionally, a tuning criterion is derived for automatically estimating the size of the KL divergence. Furthermore, a specific variant of the TLS estimator is shown to be equivalent to the relaxed least trimmed squares (RTLS) estimator, and almost sure convergence of the RTLS estimator is established in the presence of heavy-tailed noises with infinite variance.
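The weighting idea summarized in the abstract can be sketched as an iteratively reweighted least squares loop in which each data point receives a weight proportional to the exponential of its negative squared residual. This is only an illustrative sketch: the function name, the tilt parameter `tau`, and the fixed iteration count are assumptions for demonstration, not the paper's actual algorithm or its KL-divergence tuning criterion.

```python
import numpy as np

def tilted_least_squares(X, y, tau=1.0, n_iter=50):
    """Robust regression via exponentially down-weighted residuals.

    Hypothetical sketch: `tau` (the tilt parameter) and the fixed
    iteration count are illustrative choices, not the paper's tuning rule.
    """
    # Ordinary least squares initialization.
    theta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ theta                 # residuals at current estimate
        w = np.exp(-r**2 / tau)           # weight = exp(-squared residual / tau)
        w /= w.sum()                      # normalize the weights to sum to one
        sw = np.sqrt(w)
        # Weighted least squares step: scale rows by sqrt of the weights.
        theta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return theta
```

Because points with large residuals receive weights that decay exponentially, gross outliers contribute essentially nothing to the weighted fit, which is the robustness mechanism the abstract attributes to the TLS estimator.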

Original language: English
Article number: 112699
Journal: Automatica
Volume: 184
DOIs
Publication status: Published - Feb 2026

Bibliographical note

Publisher Copyright:
© 2025 Elsevier Ltd

Keywords

  • Heavy-tailed noises
  • Outliers
  • Relaxed least trimmed squares estimators
  • Tilted least squares estimators
