Reports of experiments conducted with an Inductive Logic Programming (ILP) system rarely describe how specific values of the system's parameters were arrived at when constructing models. Usually, no attempt is made to identify sensitive parameters, and those that are used are often given "factory-supplied" default values, or values obtained from some non-systematic exploratory analysis. The immediate consequence of this is, of course, that it is not clear whether better models could have been obtained if some form of parameter selection and optimisation had been performed. Questions inevitably follow about the experiments themselves: specifically, are all algorithms being treated fairly, and is the exploratory phase sufficiently well defined to allow the experiments to be replicated?

In this paper, we investigate the use of parameter selection and optimisation techniques grouped under the study of experimental design. Screening and response surface methods determine, in turn, the sensitive parameters and good values for these parameters. Screening is done here by constructing a stepwise regression model relating the utility of an ILP system's hypothesis to its input parameters, using systematic combinations of values of those parameters (technically speaking, we use a two-level fractional factorial design of the input parameters). The parameters retained by the regression model are taken to be the sensitive parameters of the system for that application. We then seek an assignment of values to these sensitive parameters that maximises the utility of the ILP model. This is done by constructing a local "response surface": the parameters are then changed along the path of steepest ascent until a locally optimal value is reached. This combined use of parameter selection and response surface-driven optimisation has a long history of application in industrial engineering, and its role in ILP is demonstrated using well-known benchmarks.

The results suggest that the computational overheads of this preliminary phase are not substantial, and that much can be gained, both in improving system performance and in enabling controlled experimentation, by adopting well-established procedures such as the ones proposed here.
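The screening-and-ascent procedure described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the parameter names (`c`, `nodes`, `noise`), their ranges, and the synthetic `utility` function are hypothetical stand-ins for running an ILP system and scoring its hypothesis, and a plain least-squares fit stands in for stepwise regression.

```python
import numpy as np

# Hypothetical stand-in for running the ILP system with the given
# parameter values and measuring the utility of its hypothesis.
def utility(c, nodes, noise):
    return -(c - 8.0) ** 2 - 0.2 * (nodes - 12.0) ** 2 - 0.05 * noise

# Illustrative parameter ranges (low, high); not from the paper.
levels = {"c": (2, 6), "nodes": (2, 10), "noise": (0, 20)}
names = list(levels)

# --- Screening: a 2^(3-1) two-level fractional factorial design in coded
# units (-1/+1), with generator x3 = x1 * x2; main effects are therefore
# aliased with two-factor interactions, which screening accepts as a
# trade-off for fewer runs.
design = np.array([[-1, -1,  1],
                   [ 1, -1, -1],
                   [-1,  1, -1],
                   [ 1,  1,  1]], dtype=float)

def decode(coded):
    """Map coded -1/+1 settings back to natural parameter values."""
    return {n: (lo + hi) / 2 + coded[i] * (hi - lo) / 2
            for i, (n, (lo, hi)) in enumerate(levels.items())}

y = np.array([utility(**decode(row)) for row in design])

# Fit the first-order model y ~ b0 + sum_i(b_i * x_i) by least squares
# (a stand-in for the paper's stepwise regression).
X = np.column_stack([np.ones(len(design)), design])
coef = np.linalg.lstsq(X, y, rcond=None)[0]
effects = dict(zip(names, coef[1:]))

# Parameters with large absolute effects are taken as "sensitive";
# the threshold here is arbitrary.
sensitive = [n for n in names if abs(effects[n]) > 1.0]

# --- Optimisation: follow the path of steepest ascent in coded units,
# stepping along the gradient of the fitted surface until the utility
# stops improving (a local optimum along the path).
x = np.zeros(len(names))
step = coef[1:] / np.linalg.norm(coef[1:])
best = utility(**decode(x))
while True:
    candidate = x + 0.25 * step
    value = utility(**decode(candidate))
    if value <= best:
        break
    x, best = candidate, value
```

In a real application the response surface would be re-fitted periodically along the ascent path, since the first-order model only approximates the utility locally; the sketch keeps a single fit for brevity.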