It is shown that partial classification, which allows indecision in certain regions of the data space, can increase a benefit function, defined as the difference between the probabilities of a correct and an incorrect decision, each taken jointly with the event that a decision is made at all. This is particularly true for small data samples, where the estimated separation surface may deviate substantially from the intersection surface of the corresponding probability density functions. Under a particular density-estimation method, an indecision domain is naturally defined by a single parameter, whose optimal size, the one maximizing the benefit function, is derived from the data. The benefit function is shown to translate into profit in stock trading. On medical and economic data, partial classification produces, on average, higher benefit values than full classification, which assigns every new object to a class, and the marginal benefit of partial classification diminishes as the data size grows.
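The idea can be sketched numerically. The following is a minimal illustration, not the paper's actual method: it uses a hypothetical 1-D two-Gaussian sample, plug-in Gaussian density estimates in place of the paper's particular density-estimation method, and an indecision band parameterized by a single width, chosen to maximize the empirical benefit, i.e. the sample frequency of correct decisions minus that of incorrect decisions, counting only points where a decision is made.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-class sample (illustrative, not the paper's data):
# class 0 ~ N(0, 1), class 1 ~ N(2, 1).
n = 200
x = np.concatenate([rng.normal(0.0, 1.0, n), rng.normal(2.0, 1.0, n)])
y = np.concatenate([np.zeros(n, dtype=int), np.ones(n, dtype=int)])

def gaussian_pdf(v, mu, sigma):
    return np.exp(-0.5 * ((v - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Plug-in density estimates from the sample (a stand-in for the paper's
# density-estimation method).
mu0, s0 = x[y == 0].mean(), x[y == 0].std()
mu1, s1 = x[y == 1].mean(), x[y == 1].std()

def benefit(width):
    """Empirical benefit: P(correct and decided) - P(incorrect and decided),
    with an indecision domain where the density estimates are within `width`
    of each other (no decision is made there)."""
    p0 = gaussian_pdf(x, mu0, s0)
    p1 = gaussian_pdf(x, mu1, s1)
    decided = np.abs(p1 - p0) > width        # outside the indecision domain
    pred = (p1 > p0).astype(int)
    correct = decided & (pred == y)
    incorrect = decided & (pred != y)
    return correct.mean() - incorrect.mean()

# The single parameter controlling the indecision domain is chosen by
# maximizing the estimated benefit on the data; width 0 is full classification.
widths = np.linspace(0.0, 0.2, 41)
best = max(widths, key=benefit)
print(f"full-classification benefit: {benefit(0.0):.3f}")
print(f"best width {best:.3f}, partial benefit: {benefit(best):.3f}")
```

Since width 0 (full classification) is included in the search, the chosen width can never do worse on the training sample, which mirrors the abstract's claim that partial classification yields at least the benefit of full classification on average.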