We address a fundamental problem of complexity theory: the inadequacy of worst-case complexity for evaluating the computational resources required by real-life problems. While it is the best-known measure and enjoys the support of a rich and elegant theory, worst-case complexity seems to give rise to over-pessimistic complexity values. Many standard tasks that are carried out routinely in machine learning applications are NP-hard, and hence infeasible from the worst-case-complexity perspective. In this work we offer an alternative measure of complexity for approximation-optimization tasks. Our approach is to define a hierarchy on the set of inputs to a learning task, so that natural ('real data') inputs occupy only bounded levels of this hierarchy, and so that there are algorithms that handle each such bounded level in polynomial time.
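To make the hierarchy concrete, here is one plausible formalization; the notation ($X$, $X_k$, $A_k$, $p_k$) is ours and is not taken from the paper. Write the set of all inputs as $X$ and fix a chain of subsets

$$X_1 \subseteq X_2 \subseteq X_3 \subseteq \cdots, \qquad \bigcup_{k \in \mathbb{N}} X_k = X,$$

together with, for every level $k$, an algorithm $A_k$ and a polynomial $p_k$ such that $A_k$ solves the approximation-optimization task on every input $x \in X_k$ of size $n$ in time at most $p_k(n)$. Under this reading, the complexity assigned to an input $x$ is the least $k$ with $x \in X_k$, and the claim is that natural ('real data') inputs have small $k$, even when the task over all of $X$ is NP-hard in the worst case.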