References:
- Digital search trees revisited. SIAM Journal on Computing.
- Some results on V-ary asymmetric tries. Journal of Algorithms.
- Elements of information theory.
- Improved behaviour of tries by adaptive branching. Information Processing Letters.
- Scalable high speed IP routing lookups. SIGCOMM '97: Proceedings of the ACM SIGCOMM '97 Conference on Applications, Technologies, Architectures, and Protocols for Computer Communication.
- The art of computer programming, volume 3: Sorting and searching (2nd ed.).
- The analysis of hybrid trie structures. Proceedings of the Ninth Annual ACM-SIAM Symposium on Discrete Algorithms.
- Average Case Analysis of Algorithms on Sequences.
- Random Structures & Algorithms, special issue on analysis of algorithms dedicated to Don Knuth on the occasion of his (100)_8th birthday.
- Fast address look-up for internet routers. BC '98: Proceedings of the IFIP TC6/WG6.2 Fourth International Conference on Broadband Communications: The Future of Telecommunications.
- Faster searching in tries and quadtrees: an analysis of level compression. ESA '94: Proceedings of the Second Annual European Symposium on Algorithms.
- Improved behaviour of tries by the "symmetrization" of the source. DCC '02: Proceedings of the Data Compression Conference.
- On the number of full levels in tries. Random Structures & Algorithms.
- Probabilistic behavior of asymmetric level compressed tries. Random Structures & Algorithms.
- IP-address lookup using LC-tries. IEEE Journal on Selected Areas in Communications.
Andersson and Nilsson have already shown that the average depth D_n of random LC-tries is only Θ(log* n) when the keys are produced by a symmetric memoryless process, and that D_n = O(log log n) when the process is asymmetric. In this paper we refine the second estimate by showing that, asymptotically as n → ∞, D_n ∼ (1/η) log log n, where n is the number of keys inserted in the trie, η = −log(1 − h/h_{−∞}), h = −p log p − q log q is the entropy of the binary memoryless source with probabilities p and q = 1 − p (p ≠ q), and h_{−∞} = −log min(p, q).
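To make the asymptotic concrete, the constant 1/η in front of log log n can be computed directly from the abstract's definitions. The sketch below is a minimal illustration (the function name is hypothetical, not from the paper); it evaluates h, h_{−∞}, and η for a given bias p of the binary memoryless source:

```python
import math

def lc_trie_depth_constant(p):
    """Illustrative helper (hypothetical name): the constant 1/eta in
    D_n ~ (1/eta) * log log n for a binary memoryless source with
    P(0) = p, P(1) = q = 1 - p, where p != q (p != 1/2)."""
    q = 1.0 - p
    h = -p * math.log(p) - q * math.log(q)   # entropy of the source
    h_minus_inf = -math.log(min(p, q))       # h_{-inf} = -log min(p, q)
    eta = -math.log(1.0 - h / h_minus_inf)   # eta = -log(1 - h/h_{-inf})
    return 1.0 / eta
```

Note that as p → 1/2 the ratio h/h_{−∞} → 1, so η → ∞ and the constant vanishes, consistent with the Θ(log* n) behavior in the symmetric case; the formula is therefore only meaningful for p ≠ q.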