Instance-based learning (IBL), also known as memory-based reasoning (MBR), is a commonly used non-parametric learning algorithm, and k-nearest neighbor (k-NN) learning is its most popular realization. Owing to its usability and adaptability, k-NN has been successfully applied to a wide range of applications. In practice, however, two important model parameters can only be set empirically: the number of neighbors (k) and the weights assigned to those neighbors. In this paper, we propose structured ways to set these parameters based on locally linear reconstruction (LLR). We then employ sequential minimal optimization (SMO) to solve the quadratic programming step involved in LLR for classification, which reduces the computational complexity. Experimental results from 11 classification and eight regression tasks were promising enough to merit further investigation: not only did LLR outperform conventional weight allocation methods without much additional computational cost, but it was also found to be robust to changes in k.
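The core idea can be illustrated with a minimal sketch of LLE-style reconstruction weights: a query point is expressed as a linear combination of its k nearest neighbors under a sum-to-one constraint, and those reconstruction coefficients serve as the neighbor weights in a k-NN prediction. This is an assumption-laden simplification — the function names are hypothetical, and the paper's classification variant additionally enforces non-negativity on the weights, solving the resulting quadratic program with SMO, which this closed-form sketch omits.

```python
import numpy as np

def llr_weights(query, neighbors, reg=1e-3):
    """Reconstruction weights for `query` from its neighbors
    (sum-to-one constraint only; a simplified stand-in for LLR).
    neighbors: (k, d) array of the k nearest training points."""
    Z = neighbors - query                 # center neighbors on the query
    G = Z @ Z.T                           # (k, k) local Gram matrix
    k = len(neighbors)
    G = G + reg * np.trace(G) * np.eye(k) # regularize (G may be singular)
    w = np.linalg.solve(G, np.ones(k))    # solve G w = 1
    return w / w.sum()                    # enforce sum(w) == 1

def llr_knn_predict(query, X, y, k=5):
    """Weighted k-NN regression using the reconstruction weights."""
    dists = np.linalg.norm(X - query, axis=1)
    idx = np.argsort(dists)[:k]           # indices of k nearest neighbors
    w = llr_weights(query, X[idx])
    return w @ y[idx]                     # weighted average of neighbor targets
```

Note that unlike fixed schemes (uniform or inverse-distance weights), the weights here adapt to the local geometry: a query lying midway between two neighbors receives weights of 0.5 each regardless of the absolute distances.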