Kernel methods for regression model based on variable selection. International Journal of Knowledge Engineering and Soft Data Paradigms.
A variable selection method in principal canonical correlation analysis. Computational Statistics & Data Analysis.
Blind recognition of linear space-time block codes: a likelihood-based approach. IEEE Transactions on Signal Processing.
Prospective scientific methodology in knowledge society. PAKDD'08: Proceedings of the 12th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining.
Conditional information criteria for selecting variables in linear mixed models. Journal of Multivariate Analysis.
Conditional and unconditional methods for selecting variables in linear mixed models. Journal of Multivariate Analysis.
Estimation of prediction error by using K-fold cross-validation. Statistics and Computing.
Kernel regression in the presence of correlated errors. The Journal of Machine Learning Research.
A non-iterative optimization method for smoothness in penalized spline regression. Statistics and Computing.
A global-local optimization approach to parameter estimation of RBF-type models. Information Sciences: An International Journal.
Predictive analytics in information systems research. MIS Quarterly.
Tuning parameter selection in sparse regression modeling. Computational Statistics & Data Analysis.
Knowledge discovery from massive healthcare claims data. Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.
Information criteria: How do they behave in different models? Computational Statistics & Data Analysis.
The Akaike information criterion (AIC), derived as an estimator of the Kullback-Leibler information discrepancy, provides a useful tool for evaluating statistical models, and numerous successful applications of the AIC have been reported in various fields of the natural sciences, social sciences and engineering. One of the main objectives of this book is to provide comprehensive explanations of the concepts and derivations of the AIC and related criteria, including Schwarz's Bayesian information criterion (BIC), together with a wide range of practical examples of model selection and evaluation criteria. A secondary objective is to provide a theoretical basis for the analysis and extension of information criteria via a statistical functional approach. A generalized information criterion (GIC) and a bootstrap information criterion are presented, which provide unified tools for modeling and model evaluation across a diverse range of models, including various types of nonlinear models and model estimation procedures such as robust estimation, the maximum penalized likelihood method and the Bayesian approach.
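As a concrete illustration of the criteria discussed above, the sketch below computes the AIC and BIC for a Gaussian linear regression fitted by ordinary least squares, using the standard definitions AIC = -2 ln L + 2k and BIC = -2 ln L + k ln n. This is a minimal example for intuition only, not code from the book; the function name and the choice of counting the error variance in k are the author's own illustrative conventions.

```python
import numpy as np

def gaussian_aic_bic(y, X):
    """Fit OLS under a Gaussian likelihood and return (aic, bic).

    The parameter count k includes the regression coefficients
    plus one for the error variance (an illustrative convention).
    """
    n, p = X.shape
    # Least-squares coefficients and residuals
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    # Maximum-likelihood estimate of the error variance
    sigma2 = resid @ resid / n
    # Maximized Gaussian log-likelihood: -n/2 * (ln(2*pi*sigma2) + 1)
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = p + 1
    aic = -2 * loglik + 2 * k            # AIC = -2 ln L + 2k
    bic = -2 * loglik + k * np.log(n)    # BIC = -2 ln L + k ln n
    return aic, bic
```

In model selection one computes these values for each candidate set of regressors and prefers the model with the smaller criterion; since BIC's penalty k ln n exceeds AIC's 2k once n > e^2, BIC tends to favor more parsimonious models on larger samples.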