CIRJE-F-709 "Selection of Variables in Multivariate Regression Models for Large Dimensions"
Authors Srivastava, Muni S. and Tatsuya Kubokawa
Date January 2010
Remarks Subsequently published in Communications in Statistics - Theory and Methods, Vol. 41, 2465-2489 (2012).
Abstract

The Akaike information criterion (AIC) and Mallows' Cp statistic have been proposed for selecting a smaller number of regressor variables in multivariate regression models with a fully unknown covariance matrix. Both criteria, however, rest on the implicit assumption that the sample size is substantially larger than the dimension of the covariance matrix. To obtain a stable estimator of the covariance matrix, the dimension must be much smaller than the sample size; when the dimension is close to the sample size, a ridge-type estimator of the covariance matrix is needed. In this paper, we use a ridge-type estimator of the covariance matrix and derive the modified AIC and modified Cp statistic under an asymptotic theory in which both the sample size and the dimension go to infinity. Numerical results show that these modified procedures perform very well, in the sense of selecting the true model, in high-dimensional cases.
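To fix ideas, the sketch below illustrates the general approach in Python: a ridge-type covariance estimator of the simple form S + λI stays invertible when the dimension p is close to the sample size n, and an information-criterion-style score built from it can be compared across candidate subsets of regressors. This is a minimal illustration under assumed simplifications; the function names (`ridge_covariance`, `aic_like_score`), the choice of λ, and the penalty term are hypothetical and do not reproduce the modified AIC and Cp criteria derived in the paper.

```python
import numpy as np
from itertools import combinations

def ridge_covariance(residuals, lam):
    """Ridge-type covariance estimator S + lam * I (illustrative form only;
    the paper's estimator and choice of lam may differ)."""
    n, p = residuals.shape
    S = residuals.T @ residuals / n
    return S + lam * np.eye(p)

def aic_like_score(Y, X, subset, lam):
    """Generic AIC-style score for a candidate subset of regressors.

    Fits the multivariate regression Y = X[:, subset] B + E by least squares,
    measures fit through the log-determinant of the ridge-regularized
    residual covariance, and adds a penalty on the number of regression
    coefficients. A sketch of the idea, not the paper's modified criteria.
    """
    n, p = Y.shape
    Xs = X[:, list(subset)]
    B_hat, *_ = np.linalg.lstsq(Xs, Y, rcond=None)
    E = Y - Xs @ B_hat
    Sigma_lam = ridge_covariance(E, lam)
    _, logdet = np.linalg.slogdet(Sigma_lam)   # stable log|Sigma_lam|
    fit = n * logdet
    penalty = 2 * p * len(subset)              # AIC-like penalty (illustrative)
    return fit + penalty

# Toy usage: dimension p close to sample size n, true model uses regressors {0, 1}.
rng = np.random.default_rng(0)
n, k, p = 50, 5, 40
X = rng.normal(size=(n, k))
B_true = np.zeros((k, p))
B_true[:2] = 1.0
Y = X @ B_true + rng.normal(size=(n, p))
candidates = [s for r in range(1, 4) for s in combinations(range(k), r)]
best = min(candidates, key=lambda s: aic_like_score(Y, X, s, lam=1.0))
print("selected regressors:", best)
```

Without the ridge term (lam = 0) the residual covariance would be singular whenever p exceeds the residual degrees of freedom, so the log-determinant criterion would break down; adding λI is what keeps the comparison workable when p is of the same order as n.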