Fast rates of Gaussian empirical gain maximization with heavy-tailed noise

S Huang, Y Feng, Q Wu
IEEE Transactions on Neural Networks and Learning Systems, 2022 (ieeexplore.ieee.org)
In a regression setup, we study in this brief the performance of Gaussian empirical gain maximization (EGM), which encompasses a broad variety of well-established robust estimation approaches. In particular, we conduct a refined learning theory analysis of Gaussian EGM, investigate its regression calibration properties, and develop improved convergence rates in the presence of heavy-tailed noise. To these ends, we first introduce a new weak moment condition that accommodates cases where the noise distribution may be heavy-tailed. Building on this moment condition, we then develop a novel comparison theorem that characterizes the regression calibration properties of Gaussian EGM and plays an essential role in deriving the improved convergence rates. The present study thus broadens our theoretical understanding of Gaussian EGM.
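As a rough illustration of the idea behind Gaussian EGM: instead of minimizing a squared loss, one maximizes the empirical average of a Gaussian gain applied to the residuals, which automatically downweights observations with large (e.g. heavy-tailed) noise. The sketch below is not the paper's estimator (which is analyzed over general hypothesis spaces); it assumes a linear model and plain gradient ascent, and the bandwidth `sigma`, learning rate, and iteration count are hypothetical choices for illustration only.

```python
import numpy as np

def gaussian_egm_fit(X, y, sigma=1.0, lr=0.1, n_iter=2000):
    """Fit linear coefficients w by maximizing the empirical Gaussian gain
        (1/n) * sum_i exp(-(y_i - X_i @ w)^2 / (2 * sigma^2)),
    a robust alternative to least squares (gradient-ascent sketch)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        r = y - X @ w                                 # residuals
        gains = np.exp(-r**2 / (2.0 * sigma**2))      # per-sample gain weights
        # gradient of the empirical gain with respect to w:
        # large residuals get exponentially small weight, hence robustness
        grad = (X.T @ (gains * r)) / (n * sigma**2)
        w += lr * grad
    return w

# Demo: linear data with a handful of gross outliers (heavy-tailed noise proxy)
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 1))
y = 2.0 * X[:, 0] + rng.normal(0.0, 0.05, size=200)
y[:10] += 10.0                                        # corrupt 5% of the samples
w_hat = gaussian_egm_fit(X, y)
```

Because the Gaussian gain of a residual near 10 is essentially zero, the corrupted points contribute almost nothing to the gradient, and the recovered slope stays close to the true value of 2; an ordinary least-squares fit on the same data would be pulled noticeably upward by the outliers.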