Parallel machine learning on big data
XRDS: Crossroads, The ACM Magazine for Students - Big Data
This book presents an integrated collection of representative approaches for scaling up machine learning and data mining methods on parallel and distributed computing platforms. Demand for parallelizing learning algorithms is highly task-specific: in some settings it is driven by enormous dataset sizes, in others by model complexity or by real-time performance requirements. Making task-appropriate algorithm and platform choices for large-scale machine learning requires understanding the benefits, trade-offs, and constraints of the available options. Solutions presented in the book cover a range of parallelization platforms, from FPGAs and GPUs to multi-core systems and commodity clusters; concurrent programming frameworks, including CUDA, MPI, MapReduce, and DryadLINQ; and learning settings, spanning supervised, unsupervised, semi-supervised, and online learning. Extensive coverage of parallelizing boosted trees, SVMs, spectral clustering, belief propagation, and other popular learning algorithms, along with deep dives into several applications, makes the book equally useful for researchers, students, and practitioners.
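To make the MapReduce-style data parallelism surveyed in the book concrete, the following is a minimal sketch (not taken from the book; all names are illustrative): each worker computes a partial gradient on its own data shard (the map step), and the shard gradients are averaged into a single model update (the reduce step). A thread pool stands in here for a cluster of workers.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_gradient(task):
    """Map step: partial gradient of squared error on one data shard."""
    w, shard = task
    return sum(2.0 * (w * x - y) * x for x, y in shard), len(shard)

def parallel_step(w, shards, lr=0.01):
    """Reduce step: average the shard gradients, then apply one update."""
    with ThreadPoolExecutor(max_workers=len(shards)) as pool:
        results = list(pool.map(partial_gradient, [(w, s) for s in shards]))
    grad = sum(g for g, _ in results) / sum(n for _, n in results)
    return w - lr * grad

# Fit y = 3x from synthetic data split across two "workers".
data = [(float(x), 3.0 * x) for x in range(1, 9)]
shards = [data[:4], data[4:]]
w = 0.0
for _ in range(50):
    w = parallel_step(w, shards)
print(round(w, 3))  # approaches the true slope 3.0
```

The same map/reduce decomposition carries over to frameworks such as MPI or MapReduce: only the communication layer that gathers the partial gradients changes, not the learning logic.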