Scientific discovery: computational explorations of the creative process
A robust approach to numeric discovery
Proceedings of the Seventh International Conference on Machine Learning (1990)
Learning internal representations by error propagation
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1
Neural Networks for Pattern Recognition
Stochastic Complexity in Statistical Inquiry
Law discovery using neural networks
IJCAI'97 Proceedings of the Fifteenth International Joint Conference on Artificial Intelligence - Volume 2
Discovery of a Set of Nominally Conditioned Polynomials
DS '99 Proceedings of the Second International Conference on Discovery Science
Discovering Polynomials to Fit Multivariate Data Having Numeric and Nominal Variables
Progress in Discovery Science, Final Report of the Japanese Discovery Science Project
Finding Polynomials to Fit Multivariate Data Having Numeric and Nominal Variables
IDA '01 Proceedings of the 4th International Conference on Advances in Intelligent Data Analysis
The authors have recently proposed a law discovery method called RF5 based on neural networks: law candidates (neural networks) are trained with a second-order learning algorithm, and an information criterion selects the most suitable among the candidates. Previous experiments showed that RF5 works well on relatively small problems. This paper evaluates how the method scales up, and analyzes whether it is invariant under normalization of the input and output variables. Since many real data sets are of medium or large size, the scalability of any law discovery method is highly important. Moreover, since in most real data sets different variables have typical values that can differ significantly, invariance under variable normalization is also important.
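The candidate-plus-criterion scheme the abstract describes can be illustrated with a minimal sketch. This is not the authors' RF5 implementation (which trains neural networks with a second-order algorithm): here, polynomial law candidates of increasing degree are fitted by ordinary least squares instead, and a BIC-style criterion stands in for the information criterion used to pick the best candidate. All function names and parameters below are illustrative assumptions.

```python
import numpy as np

def select_law_candidate(x, y, max_degree=6):
    """Fit polynomial law-candidates of degree 1..max_degree and score
    each with a BIC-style criterion; return the best (score, degree, coeffs).
    Stand-in for RF5's train-candidates-then-select scheme."""
    n = len(x)
    best = None
    for d in range(1, max_degree + 1):
        coeffs = np.polyfit(x, y, d)              # least-squares fit of this candidate
        resid = y - np.polyval(coeffs, x)
        mse = np.mean(resid ** 2)
        k = d + 1                                  # number of free parameters
        bic = n * np.log(mse + 1e-12) + k * np.log(n)  # fit term + complexity penalty
        if best is None or bic < best[0]:
            best = (bic, d, coeffs)
    return best

# Noisy samples of a cubic law: the criterion should prefer degree 3,
# penalizing higher-degree candidates that only fit the noise.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 200)
y = 0.5 * x**3 - x + rng.normal(scale=0.05, size=x.size)
bic, degree, coeffs = select_law_candidate(x, y)
print(degree)
```

The complexity penalty `k * log(n)` is what keeps the selection from always preferring the highest-degree candidate; RF5 plays the same trade-off with its information criterion over trained networks.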