A weak heap is a priority queue that supports the operations construct, minimum, insert, and extract-min. To store n elements, it uses an array of n elements and an array of n bits. In this paper we study different possibilities for optimizing construct and insert such that minimum and extract-min are not made slower. We provide a catalogue of algorithms that improve on the standard algorithms in various ways. As optimization criteria, we consider the worst-case running time, the number of instructions, branch mispredictions, cache misses, element comparisons, and element moves. Our contributions are summarized as follows:

1. The standard algorithm for construct runs in O(n) worst-case time and performs n - 1 element comparisons. Our improved algorithms reduce the number of instructions, branch mispredictions, element moves, and cache misses.

2(a). Even though the worst-case running time of the standard insert algorithm is logarithmic, we show that, in contrast to binary heaps, n repeated insert operations require at most 3.5n + O(lg^2 n) element comparisons.

2(b). We improve a recent result of ours, in which we achieved O(1) amortized time per insertion, to guarantee O(1) worst-case time per insertion. After the deamortization, minimum still takes O(1) worst-case time and involves no element comparisons, and extract-min takes O(lg n) worst-case time and involves at most lg n + O(1) element comparisons. This constant-factor optimality in the number of element comparisons has previously been achieved only by pointer-based multipartite priority queues.

We have implemented most of the proposed algorithms and tested their practical behaviour. Interestingly, for integer data, reducing the number of branch mispredictions turned out to be an effective optimization in our experiments.
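To make the interface concrete, the following is a minimal sketch of a textbook weak heap (not the paper's optimized variants): the elements live in an array a and the reverse bits in an array r, the children of slot j are 2j + r[j] and 2j + 1 - r[j], and the standard construct performs exactly n - 1 element comparisons. Names and layout here are illustrative assumptions, not the authors' implementation.

```python
class WeakHeap:
    """Textbook weak heap with min-ordering: array a plus one reverse
    bit per element in r. Invariant: a[d-ancestor(j)] <= a[j]."""

    def __init__(self, items=()):
        self.a = list(items)
        self.r = [0] * len(self.a)
        # Standard construct: n - 1 joins, one comparison each.
        for j in range(len(self.a) - 1, 0, -1):
            self._join(self._d_ancestor(j), j)

    def _d_ancestor(self, j):
        # Climb while j is the "left" child of its parent, then step up.
        while (j & 1) == self.r[j >> 1]:
            j >>= 1
        return j >> 1

    def _join(self, i, j):
        # One comparison; restores the ordering between i = d-ancestor(j)
        # and j, flipping r[j] (swapping j's subtrees) when elements swap.
        if self.a[i] > self.a[j]:
            self.a[i], self.a[j] = self.a[j], self.a[i]
            self.r[j] ^= 1
            return False
        return True

    def minimum(self):
        return self.a[0]

    def insert(self, x):
        j = len(self.a)
        self.a.append(x)
        self.r.append(0)
        # Sift up; stop at the first join that needs no swap.
        while j != 0:
            i = self._d_ancestor(j)
            if self._join(i, j):
                break
            j = i

    def extract_min(self):
        m = self.a[0]
        last = self.a.pop()
        self.r.pop()
        if self.a:
            self.a[0] = last
            if len(self.a) > 1:
                # Descend the left spine of the root's only subtree,
                # then join with the root on the way back up.
                j = 1
                while 2 * j + self.r[j] < len(self.a):
                    j = 2 * j + self.r[j]
                while j != 0:
                    self._join(0, j)
                    j >>= 1
        return m
```

Extracting all elements in order is a simple sanity check: construct followed by repeated extract-min yields a sorted sequence (this is WEAK-HEAPSORT in miniature).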