Analyzing the balancing of error rates for multi-group classification
Expert Systems with Applications: An International Journal
This study evaluates the relative performance of several well-known classification techniques alongside a proposed hybrid method that combines the k-nearest neighbor (k-NN) and linear programming (LP) approaches for four-group classification. Computational experiments are conducted to compare these techniques, with Monte Carlo simulation used to generate datasets with varying characteristics such as multicollinearity and nonlinearity. The experimental results indicate that the LP approaches in general, and the proposed hybrid method in particular, consistently achieve lower misclassification rates across most data characteristics. By exploiting the strengths of both k-NN and LP, the hybrid method yields a considerable improvement in classification accuracy. These findings can inform the design of other hybrid techniques that combine the strengths of different methods to improve classification accuracy and reliability.
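To make the k-NN component of the hybrid concrete, the sketch below shows a plain majority-vote k-NN classifier on a toy four-group dataset. This is only one building block of the hybrid method; the LP scoring step and the exact way the two are combined are not specified in the abstract, so they are omitted here, and the toy data and function names are illustrative assumptions.

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, x, k=3):
    """Classify point x by majority vote among its k nearest training points.

    Illustrative k-NN only; the paper's hybrid additionally uses an LP
    component, which is not shown here.
    """
    # Euclidean distance from x to every training point, smallest first
    dists = sorted(
        (math.dist(p, x), label) for p, label in zip(train_X, train_y)
    )
    # Majority vote among the k closest neighbors
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy four-group example: groups 0-3 clustered along the x-axis
X = [(0.0, 0.0), (0.2, 0.1), (1.0, 0.0), (1.1, 0.2),
     (2.0, 0.0), (2.2, 0.1), (3.0, 0.0), (3.1, 0.1)]
y = [0, 0, 1, 1, 2, 2, 3, 3]

print(knn_predict(X, y, (2.1, 0.05), k=3))  # → 2
```

In a hybrid scheme of the kind the study describes, such local neighborhood votes could be combined with scores from an LP-based discriminant to decide the final group assignment.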