Feature Space Interpretation of SVMs with Indefinite Kernels
IEEE Transactions on Pattern Analysis and Machine Intelligence
A new theory is developed for the feature spaces of the hyperbolic tangent function used as a kernel for non-linear support vector machines. The theory is based on distinctive properties of hyperbolic geometry, which lead to an interesting geometrical interpretation of the higher-dimensional feature spaces of neural networks that use the hyperbolic tangent as their activation function. The new theory is used to explain the separability of hyperbolic tangent kernels, where we show that separability is possible only for a certain class of hyperbolic kernels. Simulation results supporting the separability theory are given.
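As a minimal illustration of why separability holds only for certain parameter choices (this is a sketch, not the paper's construction), the hyperbolic tangent kernel k(x, y) = tanh(alpha * <x, y> + c) can fail to be positive semidefinite, in which case no valid feature-space embedding exists. The numpy sketch below builds the Gram matrix for two hypothetical parameter settings and inspects its eigenvalues; negative eigenvalues signal an indefinite kernel. The names `alpha` and `c` are illustrative, not taken from the paper.

```python
import numpy as np

def tanh_kernel(X, alpha, c):
    # Gram matrix of the sigmoid (hyperbolic tangent) kernel:
    # K[i, j] = tanh(alpha * <x_i, x_j> + c)
    return np.tanh(alpha * X @ X.T + c)

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))  # 20 sample points in 3 dimensions

# Two hypothetical parameter settings; whether the resulting Gram
# matrix is positive semidefinite depends on (alpha, c) and the data.
K_a = tanh_kernel(X, alpha=0.5, c=-1.0)
K_b = tanh_kernel(X, alpha=0.5, c=1.0)

# Eigenvalues of a symmetric matrix: any negative value means the
# kernel is indefinite on this sample, so it induces no inner-product
# feature space there.
eig_a = np.linalg.eigvalsh(K_a)
eig_b = np.linalg.eigvalsh(K_b)
print("min eigenvalue, setting a:", eig_a.min())
print("min eigenvalue, setting b:", eig_b.min())
```

Checking the smallest Gram-matrix eigenvalue on sampled data is a quick empirical proxy for the definiteness question the separability theory addresses analytically.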