Adaptive minimax optimality in statistical inverse problems via SOLIT — Sharp Optimal Lepskiĭ-Inspired Tuning

Authors

Li H, Werner F

Journal

Inverse Problems

Citation

Inverse Problems 40 (2024) 025005 (29pp).

Abstract

We consider statistical linear inverse problems in separable Hilbert spaces and filter-based reconstruction methods of the form f̂_α = q_α(T*T)T*Y, where Y is the available data, T the forward operator, (q_α)_{α∈A} an ordered filter, and α > 0 a regularization parameter. Whenever such a method is used in practice, α has to be chosen appropriately. Typically, the aim is to find, or at least approximate, the best possible α in the sense that the mean squared error (MSE) E[‖f̂_α − f†‖²] with respect to the true solution f† is minimized. In this paper, we introduce the Sharp Optimal Lepskiĭ-Inspired Tuning (SOLIT) method, which yields an a posteriori parameter choice rule ensuring adaptive minimax rates of convergence.
The rule depends only on the data Y, the noise level σ, the operator T, and the filter (q_α)_{α∈A}, and it requires no problem-dependent tuning of further parameters. We prove an oracle inequality for the corresponding MSE in a general setting and derive rates of convergence in different scenarios. Through a careful analysis we show that no other a posteriori parameter choice rule can achieve a better order of convergence of the MSE. In particular, our results reveal that the common belief that Lepskiĭ-type methods in inverse problems necessarily lose a logarithmic factor is incorrect. In addition, the empirical performance of SOLIT is examined in simulations.
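

Code sketch

To illustrate the setting described in the abstract (not the exact SOLIT rule), the following minimal NumPy sketch implements a filter-based estimator f̂_α = q_α(T*T)T*Y with a spectral-cutoff filter and chooses α by a generic Lepskiĭ-type balancing principle. The simulated operator, the cutoff filter, and the balancing constant kappa are illustrative assumptions only; SOLIT's sharp, tuning-free thresholds are defined in the paper itself.

import numpy as np

rng = np.random.default_rng(0)

# Simulated forward operator T with polynomially decaying singular values
# (a mildly ill-posed problem); U, V are random orthonormal bases.
n = 200
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.arange(1, n + 1) ** -1.0             # singular values s_k = k^(-1)
T = U @ np.diag(s) @ V.T

f_true = V @ (np.arange(1, n + 1) ** -1.5)  # smooth truth in the right singular basis
sigma = 1e-3                                # noise level
Y = T @ f_true + sigma * rng.standard_normal(n)

lam = s ** 2                                # eigenvalues of T*T

def f_hat(alpha):
    # Spectral-cutoff filter: q_alpha(lambda) = 1/lambda if lambda >= alpha, else 0,
    # so f_hat(alpha) = q_alpha(T*T) T* Y computed in the SVD basis.
    q = np.where(lam >= alpha, 1.0 / np.maximum(lam, alpha), 0.0)
    return V @ (q * s * (U.T @ Y))

def noise_std(alpha):
    # Root mean squared stochastic error of f_hat(alpha):
    # sigma * sqrt(sum_k q(lambda_k)^2 * lambda_k) for the cutoff filter.
    return sigma * np.sqrt(np.where(lam >= alpha, 1.0 / lam, 0.0).sum())

# Candidate parameters, ordered from most to least regularized.
alphas = np.geomspace(lam[0], lam[-1], 40)

# Generic Lepskii-type balancing (kappa = 4 is an arbitrary illustrative
# constant, NOT the sharp SOLIT calibration): accept the largest alpha whose
# estimate agrees with every less regularized one up to the noise bounds.
kappa = 4.0
estimates = [f_hat(a) for a in alphas]
alpha_hat = alphas[-1]
for i, a in enumerate(alphas):
    if all(np.linalg.norm(estimates[i] - estimates[j])
           <= kappa * (noise_std(a) + noise_std(alphas[j]))
           for j in range(i + 1, len(alphas))):
        alpha_hat = a
        break

err = np.linalg.norm(f_hat(alpha_hat) - f_true) ** 2
print(f"chosen alpha = {alpha_hat:.3e}, squared error = {err:.3e}")

The loop realizes the usual balancing idea: starting from the most regularized estimate, it stops at the first α that is statistically indistinguishable from all less regularized candidates, trading bias against the stochastic error bound.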

DOI

10.1088/1361-6420/ad12e0