e-Journal
Paper Type: Contributed Paper
Title: Comparison of the Model Selection Criteria for Multiple Regression Based on Kullback-Leibler’s Information
Author: Warangkhana Keerativibool* [a] and Pachitjanut Siripanich [b]
Email: warang27@gmail.com
Abstract: This paper presents derivations that unify the justifications of the model selection criteria based on Kullback’s divergence: AIC, AICc, KIC, KICcC, KICcSB, and KICcHM. The results show that KICcC has the strongest penalty function under some condition, followed, respectively, by KICcSB, KICcHM, KIC, and AIC. In addition, KIC is greater than AICc under some condition, while AICc is always greater than AIC. The performances of all model selection criteria are examined in an extensive simulation study. It can be concluded that a model selection criterion with a larger penalty term may lead to underfitting and slow convergence, while a smaller penalty term may lead to overfitting and inconsistency. When the sample size is small to moderate and the true model is somewhat difficult to identify, AIC and AICc perform better than the other criteria, although they still identify the true model with low accuracy. When the sample size is large, the performances of all model selection criteria do not differ significantly, but all criteria still identify the true model with low accuracy. As a result, we used the observed
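The closed forms of the three best-known criteria mentioned in the abstract can be evaluated directly from an ordinary least-squares fit. Below is a minimal Python sketch, not taken from the paper, that computes AIC, AICc, and KIC for a Gaussian multiple regression model under the common parameterization in which the parameter count includes the regression coefficients plus the error variance; the corrected variants compared in the paper (KICcC, KICcSB, KICcHM) are not reproduced here.

```python
import numpy as np

def regression_criteria(X, y):
    """Compute AIC, AICc, and KIC for a Gaussian linear regression fit by OLS.

    Uses the common forms (constant terms of the log-likelihood dropped):
        AIC  = n*ln(SSE/n) + 2k
        AICc = AIC + 2k(k+1)/(n-k-1)
        KIC  = n*ln(SSE/n) + 3k
    where n is the sample size and k counts the regression coefficients
    plus one for the error variance.
    """
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS coefficient estimates
    sse = np.sum((y - X @ beta) ** 2)              # residual sum of squares
    k = p + 1                                      # coefficients + error variance
    aic = n * np.log(sse / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    kic = n * np.log(sse / n) + 3 * k
    return {"AIC": aic, "AICc": aicc, "KIC": kic}

# Illustrative use: score nested candidate models and pick the minimizer of each criterion.
rng = np.random.default_rng(0)
n = 30
Z = rng.normal(size=(n, 4))
y = 1.0 + 2.0 * Z[:, 0] - 1.5 * Z[:, 1] + rng.normal(size=n)   # true model uses 2 predictors
for m in range(1, 5):
    X = np.column_stack([np.ones(n), Z[:, :m]])
    print(m, regression_criteria(X, y))
```

Because KIC charges 3 units per parameter against AIC’s 2, it tends to favor smaller candidate models, which illustrates the penalty ordering discussed in the abstract.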
Start & End Page: 699 - 714
Received Date: 2015-03-26
Revised Date:
Accepted Date: 2016-10-03
Keyword: Kullback’s directed divergence, Kullback’s symmetric divergence, model selection, multiple regression
Volume: Vol.44 No.2 (April 2017)
DOI:
Citation: Keerativibool W. [a] and Siripanich P. [b], Comparison of the Model Selection Criteria for Multiple Regression Based on Kullback-Leibler’s Information, Chiang Mai Journal of Science, 2017; 44(2): 699-714.
SDGs: