Advances and Applications in Statistics
Volume 14, Issue 2, Pages 173-189
(February 2010)
THE BOOTSTRAP-BASED SELECTION CRITERIA: AN OPTIMAL CHOICE FOR MODEL SELECTION IN LINEAR REGRESSION
Junfeng Shang
Abstract: This note focuses on a performance comparison between the bootstrap-based selection criteria, AICb1 and AICb2, and the corrected Akaike information criterion, AICc, in linear regression. Because of its extensive applicability, the Akaike information criterion, AIC, is the most recognized and widely utilized tool for model selection. However, its weakness lies in its poor behavior in small-sample applications. To lessen this weakness, AICc was proposed (see Sugiura [9], Hurvich and Tsai [6]), and in linear regression AICc serves as an exactly unbiased estimator of the expected Kullback-Leibler discrepancy between the model generating the data and a fitted candidate model (see Cavanaugh [3]). The bootstrap-based selection criteria, AICb1 and AICb2, are justified as variants of AIC (see Shibata [8], Shang and Cavanaugh [7]), and serve as asymptotically unbiased estimators of the expected Kullback-Leibler discrepancy between the generating model and a fitted candidate model. Our simulation study demonstrates that the bootstrap-based criteria outperform AICc to some extent in selecting an appropriate model from a candidate class in small- to medium-sample applications. Relying upon the simulation results, the optimal properties of the bootstrap-based selection criteria are discussed, suggesting that, apart from AICc, the bootstrap-based criteria are also an optimal choice for model selection and for discrepancy estimation between the generating model and a fitted candidate model.
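As a brief illustration of the small-sample setting the abstract describes (this sketch is not from the paper itself), the standard AICc for a Gaussian linear regression model adds the correction term 2k(k+1)/(n-k-1) to AIC, where n is the sample size and k the number of estimated parameters. The simulated generating model, candidate class, and sample size below are illustrative assumptions, not the paper's actual simulation design:

```python
import numpy as np

def aicc(rss, n, k):
    """AICc for a Gaussian linear model, up to an additive constant.

    AIC = n*log(RSS/n) + 2k (Gaussian log-likelihood, constants dropped);
    AICc adds the small-sample correction 2k(k+1)/(n-k-1).
    """
    aic = n * np.log(rss / n) + 2 * k
    return aic + 2 * k * (k + 1) / (n - k - 1)

rng = np.random.default_rng(0)
n = 25  # deliberately small sample, where AIC's weakness appears

# Illustrative generating model: intercept plus two active regressors.
X_full = np.column_stack([np.ones(n), rng.normal(size=(n, 4))])
beta = np.array([1.0, 2.0, -1.5, 0.0, 0.0])
y = X_full @ beta + rng.normal(size=n)

# Nested candidate class: model p uses the first p columns of X_full.
scores = {}
for p in range(1, 6):
    X = X_full[:, :p]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ coef) ** 2)
    scores[p] = aicc(rss, n, k=p + 1)  # p coefficients + error variance

best = min(scores, key=scores.get)
print("AICc-selected model order:", best)
```

The bootstrap-based criteria AICb1 and AICb2 would replace the closed-form penalty above with a penalty estimated by refitting each candidate model to parametric or nonparametric bootstrap resamples; their exact definitions are given in Shibata [8] and Shang and Cavanaugh [7].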
Keywords and phrases: AIC, AICc, small-sample model selection, bootstrap-based selection criteria, parametric bootstrapping, nonparametric bootstrapping.