Differentiability of family of generalized-inverses of a differentiable family of matrices
MATH'05 Proceedings of the 8th WSEAS International Conference on Applied Mathematics
A geometric Newton method for Oja's vector field
Neural Computation
A truncated-CG style method for symmetric generalized eigenvalue problems
Journal of Computational and Applied Mathematics
The omni macroprogramming environment for sensor networks
ICCS'06 Proceedings of the 6th International Conference on Computational Science - Volume Part III
Adaptive model trust region methods for generalized eigenvalue problems
ICCS'05 Proceedings of the 5th International Conference on Computational Science - Volume Part I
We propose a Newton-like iteration that evolves on the set of fixed-dimensional subspaces of $\mathbb{R}^n$ and converges locally cubically to the invariant subspaces of a symmetric matrix. This iteration is compared, in terms of numerical cost and global behavior, with three other methods that display the same property of cubic convergence. Moreover, we consider heuristics that greatly improve the global behavior of the iterations.
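As an illustration of the kind of iteration the abstract describes, the sketch below implements the well-known Grassmann (block) Rayleigh quotient iteration, which also evolves on the set of fixed-dimensional subspaces and converges locally cubically to invariant subspaces of a symmetric matrix. This is a hedged example of the general technique, not necessarily the paper's exact method; the function name `grassmann_rqi` and all test data are illustrative.

```python
import numpy as np
from scipy.linalg import solve_sylvester

def grassmann_rqi(A, Y0, iters=2):
    """Block Rayleigh quotient iteration on the set of p-dimensional subspaces.

    Given symmetric A (n x n) and Y0 (n x p) with orthonormal columns spanning
    an approximate invariant subspace, each step solves the Sylvester equation
    A Z - Z (Y^T A Y) = Y and re-orthonormalizes Z.  For symmetric A the
    convergence to an invariant subspace is locally cubic.
    """
    Y = Y0
    for _ in range(iters):
        R = Y.T @ A @ Y                 # block Rayleigh quotient (p x p)
        Z = solve_sylvester(A, -R, Y)   # solves A Z - Z R = Y
        Y, _ = np.linalg.qr(Z)          # represent the new subspace orthonormally
    return Y

# Illustrative data: symmetric matrix, perturbed exact invariant subspace.
rng = np.random.default_rng(0)
n, p = 12, 3
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
V = np.linalg.eigh(A)[1][:, :p]                               # exact subspace
Y0, _ = np.linalg.qr(V + 1e-2 * rng.standard_normal((n, p)))  # perturbed start
Y = grassmann_rqi(A, Y0)
res = np.linalg.norm(A @ Y - Y @ (Y.T @ A @ Y))               # invariance residual
```

With the initial subspace within about 1e-2 of the exact one, two iterations suffice to drive the invariance residual down to roundoff levels, which is the practical signature of cubic local convergence.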