Goodness-of-fit techniques
The problem of assessing whether a sample comes from one of two probability distributions is probably one of the oldest in the theory of testing statistical hypotheses, and many papers have been produced over the years without a uniformly most powerful test being found for this purpose. In financial and insurance risk modeling, the problem often arises when identifying the best extreme-value model among a battery of alternatives or when assessing the heaviness of the tail of the underlying distribution. Taking advantage of the well-known performance of neural networks in classification problems, the use of feedforward neural networks for discriminating between two distributions is proposed here, and the power of the resulting neural goodness-of-fit test is estimated for small, moderate, and large sample sizes over a wide range of symmetric and skewed alternatives. The empirical power of the procedure is compared with that of eight classic, well-known tests of normality against each of twelve close-to-normal alternatives. The neural test proved to be the most powerful in the whole battery, and its behavior was consistent with the expected statistical properties.
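The abstract does not give implementation details, so the following is only an illustrative sketch of the general idea, not the authors' method: train a small feedforward network to discriminate sorted, standardized samples drawn under the null hypothesis (normal) from samples drawn under one alternative (Laplace is assumed here purely for illustration), then estimate the test's empirical size and power by Monte Carlo simulation. All architecture and training settings below are assumptions.

```python
import numpy as np

def make_sample(n, alt, rng):
    """One sorted, standardized sample: null = N(0,1), alternative = Laplace (assumed)."""
    x = rng.laplace(size=n) if alt else rng.normal(size=n)
    x = (x - x.mean()) / x.std()        # standardize: the test is location-scale free
    return np.sort(x)                   # feed the order statistics to the network

def make_dataset(m, n, rng):
    """m simulated samples per class; label 1 = alternative distribution."""
    X = np.vstack([make_sample(n, i >= m, rng) for i in range(2 * m)])
    y = np.concatenate([np.zeros(m), np.ones(m)])
    return X, y

def train(X, y, hidden=16, epochs=1500, lr=0.5, seed=1):
    """One-hidden-layer tanh network, full-batch gradient descent on cross-entropy."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W1 = rng.normal(scale=1 / np.sqrt(d), size=(d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=1 / np.sqrt(hidden), size=hidden); b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)                     # hidden activations
        p = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))     # P(alternative | sample)
        g = (p - y) / len(y)                         # grad of mean cross-entropy wrt logit
        gH = np.outer(g, W2) * (1.0 - H ** 2)        # backprop through tanh
        W2 -= lr * (H.T @ g); b2 -= lr * g.sum()
        W1 -= lr * (X.T @ gH); b1 -= lr * gH.sum(axis=0)
    return W1, b1, W2, b2

def predict(params, X):
    W1, b1, W2, b2 = params
    H = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))

def empirical_size_power(params, n, reps, rng):
    """Monte Carlo rejection rates: size (under H0) and power (under the alternative)."""
    Xn = np.vstack([make_sample(n, False, rng) for _ in range(reps)])
    Xa = np.vstack([make_sample(n, True, rng) for _ in range(reps)])
    return (float(np.mean(predict(params, Xn) > 0.5)),
            float(np.mean(predict(params, Xa) > 0.5)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    params = train(*make_dataset(500, 50, rng))      # samples of size n = 50
    size, power = empirical_size_power(params, 50, 300, rng)
    print(f"rejection rate under H0: {size:.2f}, power vs Laplace: {power:.2f}")
```

The classification threshold of 0.5 is arbitrary; in a calibrated test it would instead be chosen from the null distribution of the network output so that the rejection rate under H0 matches a target significance level.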