Classification-based objective functions
Machine Learning
Effective backpropagation training of multilayer perceptrons depends on incorporating an appropriate error (objective) function. Classification-Based (CB) error functions are heuristic approaches that attempt to guide the network directly toward correct pattern classification, rather than relying on conventional error-minimization metrics such as Sum-Squared Error (SSE) and Cross-Entropy (CE), which do not explicitly minimize classification error. This work presents CB3, a novel CB approach that learns the error function to be used while training. This is accomplished by learning pattern confidence margins during training, which are used to dynamically set output target values for each training pattern. On 11 applications, CB3 significantly outperforms previous CB error functions; it also reduces average test error by 1.8% over conventional error metrics using 0–1 targets without weight decay, and by 1.3% over those metrics with weight decay. CB3 also exhibits lower model variance and a tighter mean confidence interval.
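To illustrate the general idea of a classification-based error with dynamically set targets, the following is a minimal sketch, not the authors' exact CB3 algorithm. It assumes a per-pattern confidence margin that grows when a pattern is confidently correct; when the correct output does not exceed the competing output by that margin, output targets are set just far enough apart to enforce the margin, and only then is error backpropagated. All names (`margin_step`, the toy data, the single-layer network) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Single-layer network: one sigmoid output per class.
W = rng.normal(0, 0.1, (2, 2))   # (inputs, classes)
b = np.zeros(2)

margins = np.zeros(len(X))       # per-pattern confidence margins, learned online
lr, margin_step = 0.1, 0.01      # illustrative hyperparameters

for epoch in range(200):
    for i in rng.permutation(len(X)):
        out = sigmoid(X[i] @ W + b)
        c = y[i]                  # correct class index
        other = out[1 - c]        # competing (incorrect) output
        if out[c] - other >= margins[i]:
            # Pattern classified with sufficient confidence:
            # grow its margin and backpropagate no error.
            margins[i] += margin_step
            continue
        # Dynamic targets: push the correct output just above the
        # competitor by the current margin, and vice versa.
        target = np.zeros(2)
        target[c] = min(1.0, other + margins[i])
        target[1 - c] = max(0.0, out[c] - margins[i])
        err = target - out
        grad = err * out * (1 - out)   # sigmoid derivative
        W += lr * np.outer(X[i], grad)
        b += lr * grad

preds = sigmoid(X @ W + b).argmax(axis=1)
print("training accuracy:", (preds == y).mean())
```

Note the contrast with fixed 0–1 targets: patterns already classified with margin contribute no error, so training effort concentrates on patterns near or across the decision boundary, which is the intuition behind CB error functions.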