External memory algorithms play a key role in database management systems and large-scale data processing systems. They are typically tuned for efficient performance given a fixed, statically allocated amount of internal memory. However, with the advent of real-time database systems and database systems based upon administratively defined goals, algorithms must increasingly be able to adapt in an online manner when the amount of internal memory allocated to them changes dynamically and unpredictably. In this paper, we present a theoretical and applicable framework for memory-adaptive algorithms (or simply MA algorithms). We define the competitive worst-case notion of what it means for an MA algorithm to be dynamically optimal, and we prove fundamental lower bounds on the performance of MA algorithms for problems such as sorting, standard matrix multiplication, and several related problems. Our main tool for proving dynamic optimality is the notion of resource consumption, which measures how efficiently an MA algorithm adapts itself to memory fluctuations. We present the first dynamically optimal algorithms for sorting (based upon mergesort), permuting, FFT, permutation networks, buffer trees, (standard) matrix multiplication, and LU decomposition. In each case, dynamic optimality is demonstrated via a potential function argument showing that the algorithm's resource consumption is within a constant factor of optimal.
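To illustrate the core idea of adapting to memory fluctuations, here is a toy sketch of run formation for an external mergesort whose run sizes track a changing memory allocation. The `memory_level` callback is a hypothetical stand-in for the memory broker's current allocation; this is an illustrative sketch, not the paper's dynamically optimal algorithm or its resource-consumption analysis.

```python
import heapq

def adaptive_run_formation(records, memory_level):
    """Form sorted runs whose sizes follow a fluctuating memory allocation.

    `records` is any iterable of sortable items; `memory_level(i)` is a
    hypothetical callback giving how many records fit in internal memory
    when run i is formed. A real memory-adaptive sort would also adapt
    during the merge phase; this sketch only adapts run formation.
    """
    runs = []
    it = iter(records)
    done = False
    i = 0
    while not done:
        capacity = max(1, memory_level(i))  # current allocation, in records
        buf = []
        for _ in range(capacity):
            try:
                buf.append(next(it))
            except StopIteration:
                done = True
                break
        if buf:
            runs.append(sorted(buf))  # sort one memory-load, emit as a run
        i += 1
    return runs

def merge_runs(runs):
    # Final multiway merge of the sorted runs into one sorted output.
    return list(heapq.merge(*runs))
```

For example, with an allocation that fluctuates between 5, 3, and 7 records, a reversed input of 20 records yields runs of sizes 5, 3, 7, and 5, and the final merge recovers the fully sorted sequence.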