The ability to predict a student's performance could be useful in many ways in university-level distance learning. A student's marks on a few written assignments can serve as the training set for a supervised machine learning algorithm. With the explosive growth of data, the ability to learn incrementally has become increasingly important for machine learning methods. Unlike classic batch algorithms, which synthesize all available information at once, online algorithms discard irrelevant information as they process the stream. Combining classifiers has emerged as a promising direction for improving classification accuracy; however, most ensemble algorithms operate in batch mode. This work therefore proposes an online ensemble that combines incremental versions of the Naive Bayes, 1-NN, and WINNOW algorithms using a voting methodology. Among other significant conclusions, the proposed algorithm was found to be the most appropriate one for building a software support tool.
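The voting ensemble described above (incremental Naive Bayes, 1-NN, and WINNOW) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the class structure, the Gaussian Naive Bayes variant, and the Winnow parameters (promotion factor 2, threshold equal to the number of features) are assumptions made for the sketch.

```python
import math

class IncrementalNB:
    """Gaussian Naive Bayes updated one example at a time (Welford's running stats)."""
    def __init__(self):
        self.stats = {}  # label -> [count, running means, running M2 sums]

    def update(self, x, y):
        if y not in self.stats:
            self.stats[y] = [0, [0.0] * len(x), [0.0] * len(x)]
        s = self.stats[y]
        s[0] += 1
        for i, xi in enumerate(x):
            delta = xi - s[1][i]
            s[1][i] += delta / s[0]
            s[2][i] += delta * (xi - s[1][i])

    def predict(self, x):
        total = sum(s[0] for s in self.stats.values())
        best, best_lp = None, -math.inf
        for y, (n, mean, m2) in self.stats.items():
            lp = math.log(n / total)  # class prior
            for i, xi in enumerate(x):
                var = m2[i] / n + 1e-6  # smoothed per-feature variance
                lp -= 0.5 * math.log(2 * math.pi * var) + (xi - mean[i]) ** 2 / (2 * var)
            if lp > best_lp:
                best, best_lp = y, lp
        return best

class OneNN:
    """1-nearest-neighbour: an incremental 'update' simply stores the example."""
    def __init__(self):
        self.memory = []

    def update(self, x, y):
        self.memory.append((x, y))

    def predict(self, x):
        return min(self.memory,
                   key=lambda xy: sum((a - b) ** 2 for a, b in zip(xy[0], x)))[1]

class Winnow:
    """Classic Winnow for binary labels {0, 1} over non-negative features."""
    def __init__(self, n_features, alpha=2.0):
        self.w = [1.0] * n_features
        self.theta = float(n_features)  # standard threshold choice (assumption)
        self.alpha = alpha

    def predict(self, x):
        return 1 if sum(w * xi for w, xi in zip(self.w, x)) >= self.theta else 0

    def update(self, x, y):
        if self.predict(x) == y:
            return
        # Promote weights on a missed positive, demote on a missed negative.
        factor = self.alpha if y == 1 else 1.0 / self.alpha
        self.w = [w * factor if xi > 0 else w for w, xi in zip(self.w, x)]

class VotingEnsemble:
    """Unweighted majority vote over incrementally trained base learners."""
    def __init__(self, members):
        self.members = members

    def update(self, x, y):
        for m in self.members:
            m.update(x, y)

    def predict(self, x):
        votes = [m.predict(x) for m in self.members]
        return max(set(votes), key=votes.count)

# Illustrative toy stream: the label equals the first binary feature.
ens = VotingEnsemble([IncrementalNB(), OneNN(), Winnow(3)])
stream = [((1, 0, 0), 1), ((0, 1, 0), 0), ((1, 1, 0), 1),
          ((0, 0, 1), 0), ((1, 0, 1), 1), ((0, 1, 1), 0)]
for x, y in stream:
    ens.update(x, y)
```

Each base learner processes one example at a time and keeps only its own summary state (running statistics, a case memory, or a weight vector), so the ensemble never needs the full training set in memory, which is the point of the online setting.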