Abstract
In this paper, we consider high-dimensional regression with an ℓ0 constraint. Such optimization problems were once thought to be computationally intractable, but recent advances in optimization have produced efficient solvers with convergence guarantees, including iterative hard thresholding (IHT). In particular, linear (geometric) convergence has been shown for strongly smooth and strongly convex loss functions. This means the current methodology and theory do not directly apply to the non-smooth loss functions that appear in some popular statistical models. Here we consider non-smooth losses, taking the quantile check loss and the hinge loss as our examples. These two loss functions are neither strongly smooth nor strongly convex, even in fixed-dimensional problems. We establish the geometric convergence of IHT up to the statistical precision of the model. Numerical results are presented to illustrate the convergence.
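To make the setting concrete, the following is a minimal sketch of IHT applied to sparse quantile (median) regression with the check loss. The step size, iteration count, and data-generating setup are illustrative assumptions, not the paper's actual configuration; since the check loss is non-smooth, a subgradient replaces the gradient in each step.

```python
import numpy as np

def hard_threshold(v, s):
    """Keep the s largest-magnitude entries of v; zero out the rest."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-s:]
    out[keep] = v[keep]
    return out

def quantile_subgrad(X, y, beta, tau):
    """A subgradient of the check loss (1/n) sum_i rho_tau(y_i - x_i' beta)."""
    r = y - X @ beta
    # rho_tau'(r) = tau for r > 0 and tau - 1 for r < 0; the loss gradient
    # in beta carries a minus sign from the residual.
    g = np.where(r > 0, -tau, 1.0 - tau)
    return X.T @ g / len(y)

def iht_quantile(X, y, s, tau=0.5, eta=0.5, iters=500):
    """IHT: a (sub)gradient step followed by projection onto s-sparse vectors."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        beta = hard_threshold(beta - eta * quantile_subgrad(X, y, beta, tau), s)
    return beta

# Illustrative synthetic example: 3-sparse signal, n = 200, p = 50.
rng = np.random.default_rng(0)
n, p, s = 200, 50, 3
beta_star = np.zeros(p)
beta_star[:s] = [3.0, -2.0, 1.5]
X = rng.standard_normal((n, p))
y = X @ beta_star + 0.1 * rng.standard_normal(n)
beta_hat = iht_quantile(X, y, s)
```

With a strong signal and moderate noise, the iterates remain supported on the true coordinates and settle in a neighborhood of the true coefficients, consistent with convergence up to the statistical precision of the model.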
Affiliation: Nankai University