JOURNAL DETAIL



Comparison of the Model Selection Criteria for Multiple Regression Based on Kullback-Leibler’s Information


Paper Type: Contributed Paper
Author: Warangkhana Keerativibool* [a] and Pachitjanut Siripanich [b]
Email: warang27@gmail.com
Abstract:
This paper presents derivations that unify the justifications of the model selection criteria based on Kullback’s divergence: AIC, AICc, KIC, KICcC, KICcSB, and KICcHM. The results show that, under certain conditions, KICcC has the strongest penalty function, followed in order by KICcSB, KICcHM, KIC, and AIC. KIC also exceeds AICc under certain conditions, while AICc is always greater than AIC. The performances of all the criteria are examined in an extensive simulation study. It can be concluded that model selection with a larger penalty term may lead to underfitting and slow convergence, while a smaller penalty term may lead to overfitting and inconsistency. When the sample size is small to moderate and the true model is somewhat difficult to identify, AIC and AICc perform better than the other criteria, although they still identify the true model with low accuracy. When the sample size is large, the criteria do not differ significantly in performance, but all of them still identify the true model with low accuracy. We therefore used observed efficiency to assess the performance of the criteria. On average, this measure suggests that when the true model is weakly identifiable, whether the sample size is small or large, KICcC is the best criterion. When the sample size is small and the true model can be specified more easily, every criterion retains the ability to select the correct model if the error variance is small; if the error variance increases, all criteria perform poorly. When the sample size is moderate to large, KICc performs best, identifying the true model frequently when the error variance is small; but if the error variance increases and the sample size is not large enough, all criteria identify the true model only rarely.
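For concreteness, the sketch below (not the authors' code) computes AIC, AICc, and KIC for an ordinary least squares fit of a multiple regression, using commonly cited closed forms for Gaussian errors. The small-sample corrected variants KICcC, KICcSB, and KICcHM compared in the paper carry penalty terms derived there and are not reproduced; parameter-counting conventions also vary (some definitions include the error variance in k), so the exact penalties here are assumptions for illustration only.

import numpy as np

def regression_criteria(X, y):
    # Commonly cited forms (additive constants dropped), with k = number of
    # regression coefficients; these are assumed conventions, not the
    # paper's exact definitions:
    #   AIC  = n*log(RSS/n) + 2k                 (Akaike)
    #   AICc = AIC + 2k(k+1)/(n-k-1)             (Hurvich and Tsai)
    #   KIC  = n*log(RSS/n) + 3k                 (Cavanaugh)
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS coefficients
    rss = float(np.sum((y - X @ beta) ** 2))       # residual sum of squares
    base = n * np.log(rss / n)                     # -2 log-likelihood up to a constant
    aic = base + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    kic = base + 3 * k
    return {"AIC": aic, "AICc": aicc, "KIC": kic}

# Usage: score nested candidate models and pick the smallest criterion value.
rng = np.random.default_rng(0)
n = 40
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])   # intercept + 3 regressors
y = X[:, :3] @ np.array([1.0, 2.0, -1.5]) + rng.normal(scale=0.5, size=n)
for p in range(2, 5):                                        # candidate: first p columns
    print(p, regression_criteria(X[:, :p], y))

Because KIC's penalty (3k) grows faster than AIC's (2k), the sketch illustrates the ordering discussed in the abstract: criteria with heavier penalties favor smaller candidate models.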

Start & End Page: 699 - 714
Received Date: 2015-03-26
Revised Date:
Accepted Date: 2016-10-03
Keywords: Kullback’s directed divergence, Kullback’s symmetric divergence, model selection, multiple regression
Volume: Vol.44 No.2 (April 2017)
DOI:
Citation: Keerativibool W. and Siripanich P., Comparison of the Model Selection Criteria for Multiple Regression Based on Kullback-Leibler’s Information, Chiang Mai J. Sci., 2017; 44(2): 699-714.



