Bio/Interests:

Penalized maximum likelihood methods that perform automatic variable selection are now ubiquitous in statistical research. It is well known, however, that these estimators are nonregular and consequently have limiting distributions that can be highly sensitive to small perturbations of the underlying generative model. This is the case even in the fixed "p" framework. Hence, the usual asymptotic methods for inference, like the bootstrap and series approximations, often perform poorly in small samples and require modification. Here, we develop locally asymptotically consistent confidence intervals for regression coefficients when estimation is done using the Adaptive LASSO (Zou, 2006) in the fixed "p" framework. We construct the confidence intervals by sandwiching the nonregular functional of interest between two smooth, data-driven upper and lower bounds and then approximating the distribution of the bounds using the bootstrap. We leverage the smoothness of the bounds to obtain consistent inference for the nonregular functional under both fixed and local alternatives. The bounds are adaptive to the amount of underlying nonregularity in the sense that they deliver asymptotically exact coverage whenever the underlying generative model is such that the Adaptive LASSO estimators are consistent and asymptotically normal, and conservative coverage otherwise. The resultant confidence intervals possess a certain tightness property among all regular bounds. Although we focus on the Adaptive LASSO, our approach generalizes to other penalized methods. (Originally published as a technical report in 2014.)
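As a rough illustration of the estimator the abstract discusses, the sketch below implements the Adaptive LASSO (Zou, 2006) by plain coordinate descent in NumPy, together with a naive pairs-bootstrap percentile interval — i.e., the standard bootstrap that the abstract notes can perform poorly for nonregular estimators, shown here only as a baseline. All function names, the penalty parameter `lam`, and the weight exponent `gamma` are my own choices for illustration; this is not the bound-sandwiching procedure of the report.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator used in each coordinate update."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def adaptive_lasso(X, y, lam, gamma=1.0, n_iter=200):
    """Adaptive LASSO via coordinate descent (illustrative sketch).

    Minimizes 0.5 * ||y - X b||^2 + lam * sum_j w_j |b_j|, where the
    data-driven weights w_j = 1 / |b_ols_j|**gamma come from a pilot
    OLS fit, as in Zou (2006).
    """
    b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)   # pilot estimator
    w = 1.0 / (np.abs(b_ols) ** gamma + 1e-12)      # adaptive weights
    p = X.shape[1]
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r_j = y - X @ b + X[:, j] * b[j]
            b[j] = soft_threshold(X[:, j] @ r_j, lam * w[j]) / col_sq[j]
    return b

def naive_bootstrap_ci(X, y, lam, j=0, B=200, alpha=0.05, seed=1):
    """Pairs-bootstrap percentile interval for coefficient j.

    This is the conventional bootstrap whose failure under
    nonregularity motivates the bound-based intervals in the report.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    stats = [adaptive_lasso(X[idx], y[idx], lam)[j]
             for idx in (rng.integers(0, n, size=n) for _ in range(B))]
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])
```

On a well-separated design (strong signals, fixed small p), the pilot-weighted penalty zeroes out null coefficients while leaving large ones nearly unshrunk — the "consistent and asymptotically normal" regime in which, per the abstract, the proposed intervals are asymptotically exact.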