Recent years have witnessed the overwhelming success of algorithms that operate on massive data. Several computing paradigms have been proposed for massive data set algorithms, such as data streams, sketching, and sampling, and understanding their limitations is a fundamental theoretical challenge. In this survey, we describe the information complexity paradigm, which has proved successful in obtaining tight lower bounds for several well-known problems. Information complexity quantifies the amount of information about the inputs that any algorithm solving a problem must necessarily reveal. We describe the key ideas of this paradigm and highlight the beautiful interplay of techniques arising from diverse areas such as information theory, statistics, and geometry.
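To make the notion concrete, here is a sketch of the standard formulation from the communication-complexity literature (the survey may use a conditional or internal variant; the symbols below are the usual ones, not taken verbatim from this text). For two players holding inputs $X$ and $Y$ drawn from a distribution $\mu$, the information cost of a randomized protocol $\Pi$ is the mutual information between the inputs and the protocol's transcript:

```latex
% Information cost of a protocol \Pi on inputs (X, Y) \sim \mu:
\mathrm{IC}_{\mu}(\Pi) \;=\; I\bigl(XY \,;\, \Pi(X, Y)\bigr)

% Information complexity of a function f: the least information
% any protocol computing f with error at most \epsilon must reveal:
\mathrm{IC}_{\mu}(f, \epsilon) \;=\; \min_{\Pi \text{ computes } f \text{ with error } \le \epsilon} I\bigl(XY \,;\, \Pi(X, Y)\bigr)

% A transcript cannot convey more information than its length,
% so information complexity lower-bounds communication complexity:
R_{\epsilon}(f) \;\ge\; \mathrm{IC}_{\mu}(f, \epsilon)
```

Lower bounds proved this way transfer to streaming and sketching algorithms via standard reductions, since a small-space algorithm yields a low-communication protocol.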