Decision trees have been widely used for online learning classification. Many existing approaches require a large data stream to complete decision tree induction, and they show notable limitations, or even fail, when the data stream is small. In practice, many real applications yield only small data streams. In this paper, we propose a novel incremental extremely random forest algorithm for online learning classification with small streaming data. In our method, arriving examples are stored at the leaf nodes and, combined with the Gini index, are used to determine when to split those nodes, so the trees can be expanded efficiently from only a few examples. We apply the algorithm to both online learning and video object tracking, and results on UCI datasets and challenging video sequences demonstrate its effectiveness and robustness.
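The core mechanism described above, buffering arriving examples at leaf nodes and splitting a leaf once a Gini-based test justifies it, can be illustrated with a minimal sketch. This is not the authors' implementation; the class names, the `min_examples` buffer threshold, and the extremely randomized choice of a random feature and random threshold per split attempt are illustrative assumptions.

```python
import random
from collections import Counter

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

class Leaf:
    def __init__(self):
        self.examples = []  # buffered (features, label) pairs

class Node:
    def __init__(self, feature, threshold, left, right):
        self.feature, self.threshold = feature, threshold
        self.left, self.right = left, right

class IncrementalExtraTree:
    """Sketch of one tree: arriving examples are stored at the leaves;
    a leaf splits once it holds `min_examples` (hypothetical threshold)
    and a random split reduces the weighted Gini impurity."""

    def __init__(self, min_examples=4, rng=None):
        self.root = Leaf()
        self.min_examples = min_examples
        self.rng = rng or random.Random(0)

    def _route(self, x):
        node, parent, side = self.root, None, None
        while isinstance(node, Node):
            parent = node
            side = 'left' if x[node.feature] <= node.threshold else 'right'
            node = getattr(node, side)
        return node, parent, side

    def update(self, x, y):
        leaf, parent, side = self._route(x)
        leaf.examples.append((x, y))
        labels = [l for _, l in leaf.examples]
        if len(leaf.examples) < self.min_examples or gini(labels) == 0.0:
            return  # too few examples, or leaf already pure
        # extremely randomized split: random feature, random threshold
        f = self.rng.randrange(len(x))
        vals = [e[0][f] for e in leaf.examples]
        lo, hi = min(vals), max(vals)
        if lo == hi:
            return
        t = self.rng.uniform(lo, hi)
        left, right = Leaf(), Leaf()
        for ex in leaf.examples:
            (left if ex[0][f] <= t else right).examples.append(ex)
        # accept the split only if it reduces weighted Gini impurity
        n = len(leaf.examples)
        child_gini = (len(left.examples) * gini([l for _, l in left.examples]) +
                      len(right.examples) * gini([l for _, l in right.examples])) / n
        if child_gini >= gini(labels):
            return
        node = Node(f, t, left, right)
        if parent is None:
            self.root = node
        else:
            setattr(parent, side, node)

    def predict(self, x):
        leaf, _, _ = self._route(x)
        labels = [l for _, l in leaf.examples]
        return Counter(labels).most_common(1)[0][0] if labels else None
```

A forest would hold several such trees and combine their leaf-majority votes; because splits are triggered per leaf as soon as the buffered examples support them, the trees grow from only a few arriving examples rather than waiting for a large stream.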