Regression is one of the most fundamental problems in data mining. For regression problems, the extreme learning machine (ELM) achieves good generalization performance at a much faster learning speed than traditional methods. However, the growing volume of datasets makes regression with ELM on very large datasets a challenging task. By analyzing the mechanism of the ELM algorithm, an efficient parallel ELM for regression is designed and implemented on the MapReduce framework, a simple yet powerful parallel programming model. Experimental results demonstrate that the proposed parallel ELM can efficiently handle very large datasets on commodity hardware, performing well on several evaluation criteria, including speedup, scaleup, and sizeup.
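The abstract does not spell out how ELM training decomposes onto MapReduce, but the standard route is this: the ELM output weights solve the least-squares system (HᵀH)β = HᵀT, and both HᵀH and HᵀT are sums over data rows, so each mapper can compute partial products over its data split and a reducer sums them. The sketch below simulates that map/reduce decomposition locally with NumPy; the function names (`elm_map`, `elm_reduce`) and the toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hidden_layer(X, W, b):
    """Random-feature ELM hidden layer with sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def elm_map(X_part, T_part, W, b):
    """Mapper: emit the partial H^T H and H^T T for one data split."""
    H = hidden_layer(X_part, W, b)
    return H.T @ H, H.T @ T_part

def elm_reduce(partials):
    """Reducer: sum the partial matrices from all mappers."""
    U = sum(p[0] for p in partials)
    V = sum(p[1] for p in partials)
    return U, V

rng = np.random.default_rng(0)
n, d, L = 1000, 5, 20                      # samples, features, hidden nodes
X = rng.normal(size=(n, d))
T = X.sum(axis=1, keepdims=True)           # toy regression target
W = rng.normal(size=(d, L))                # random input weights
b = rng.normal(size=(1, L))                # random hidden biases

# Split the rows into 4 "map" partitions and run map/reduce locally.
partials = [elm_map(Xp, Tp, W, b)
            for Xp, Tp in zip(np.array_split(X, 4), np.array_split(T, 4))]
U, V = elm_reduce(partials)
beta = np.linalg.solve(U, V)               # output weights from the summed system

# Cross-check against the single-machine least-squares solution.
H = hidden_layer(X, W, b)
beta_single = np.linalg.lstsq(H, T, rcond=None)[0]
print(np.allclose(beta, beta_single, atol=1e-6))
```

Because only the small L×L and L×1 accumulators cross the network, the shuffle cost is independent of the dataset size, which is what makes this decomposition attractive for the speedup, scaleup, and sizeup behavior the abstract reports.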