The nature of statistical learning theory
Matrix computations (3rd ed.)
An introduction to support vector machines: and other kernel-based learning methods
Adiabatic quantum state generation and statistical zero knowledge
Proceedings of the thirty-fifth annual ACM symposium on Theory of computing
A Probabilistic Algorithm for k-SAT and Constraint Satisfaction Problems
FOCS '99 Proceedings of the 40th Annual Symposium on Foundations of Computer Science
How Powerful is Adiabatic Quantum Computation?
FOCS '01 Proceedings of the 42nd IEEE symposium on Foundations of Computer Science
Elements of Forecasting
Instead of the traditional (global) adiabatic evolution algorithm for unstructured search proposed by Farhi or Van Dam, a high-efficiency search using a nested local adiabatic evolution algorithm for structured search is introduced here into the quantum-like neurons of a Hopfield neural net: several local adiabatic quantum searches are performed and then nested together so that optimal or near-optimal solutions can be found efficiently. In particular, this approach is applied to optimally training support vector regression (SVR), so that tuning the three free parameters of SVR toward an optimal regression is obtained quickly, yielding a kind of adaptive support vector regression (ASVR). Hence, we focus on structured adiabatic quantum search by nesting a partial search over a reduced set of variables into a global search for solving an optimization problem on SVR, yielding an average complexity of order N^α, with α
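The nesting idea above can be illustrated classically: a partial search over a reduced subset of variables is run inside a global search over the remaining variable, instead of one flat (unstructured) search over all three SVR free parameters at once. The sketch below is a minimal classical analogue, not the quantum algorithm; the parameter names (C, epsilon, gamma) and the surrogate objective are illustrative assumptions standing in for an actual SVR validation error.

```python
import itertools

def surrogate_error(C, eps, gamma):
    # Hypothetical stand-in for SVR cross-validation error: a smooth
    # function with a single minimum at (C, eps, gamma) = (10, 0.1, 0.5).
    return (C - 10.0) ** 2 + (eps - 0.1) ** 2 + (gamma - 0.5) ** 2

def nested_search(C_grid, eps_grid, gamma_grid):
    # Global (outer) search over C; a partial search over the reduced
    # variable set (epsilon, gamma) is nested inside each C candidate.
    best = None
    for C in C_grid:  # global search
        # nested partial search over the reduced set of variables
        for eps, gamma in itertools.product(eps_grid, gamma_grid):
            err = surrogate_error(C, eps, gamma)
            if best is None or err < best[0]:
                best = (err, C, eps, gamma)
    return best

best = nested_search([1.0, 10.0, 100.0], [0.05, 0.1, 0.2], [0.1, 0.5, 1.0])
print(best)  # -> (0.0, 10.0, 0.1, 0.5)
```

Classically this exhaustive nesting still visits every grid point; the paper's point is that performing each nested stage as a local adiabatic quantum search reduces the average cost below that of a single global unstructured search.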