Classification and regression are fundamental data mining techniques: their goal is to build predictors from a training dataset and use them to predict the properties of new data. For a wide range of techniques, combining predictors built on samples of the training dataset achieves lower error rates, faster construction, or both, compared with a single predictor built from the entire training dataset. This suggests a natural parallelization strategy in which the sample-based predictors are built independently, and hence concurrently. We discuss the performance implications for two subclasses of techniques: those in which the predictors are independent, and those in which knowing a set of predictors reduces the difficulty of finding a new one.
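The strategy in the first subclass — independent sample-based predictors combined by voting — can be sketched as follows. This is an illustrative bagging-style example, not code from the paper: the toy 1-nearest-neighbour predictor, the data, and all function names are assumptions chosen to keep the sketch self-contained; any real base learner could take its place.

```python
import random
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def build_1nn(sample):
    """Train a trivial 1-nearest-neighbour predictor on one sample
    of (feature, label) pairs. Stands in for any base learner."""
    def predict(x):
        _, label = min(sample, key=lambda p: abs(p[0] - x))
        return label
    return predict

def bagged_predictor(training, n_predictors=5, sample_frac=0.8, seed=0):
    """Build predictors on independent samples concurrently and
    combine them by majority vote."""
    rng = random.Random(seed)
    size = max(1, int(len(training) * sample_frac))
    samples = [rng.sample(training, size) for _ in range(n_predictors)]
    # The predictors depend only on their own samples, so their
    # construction parallelizes with no communication between tasks.
    with ThreadPoolExecutor() as pool:
        predictors = list(pool.map(build_1nn, samples))
    def predict(x):
        votes = Counter(p(x) for p in predictors)
        return votes.most_common(1)[0][0]
    return predict

# Toy 1-D dataset: label 'a' below 0.5, 'b' at 0.5 and above.
training = [(i / 10, 'a' if i < 5 else 'b') for i in range(10)]
model = bagged_predictor(training)
```

The second subclass (where knowing existing predictors makes finding the next one easier, as in boosting) breaks this independence: each construction step then depends on its predecessors, which serializes that part of the work.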