In this brief announcement we propose a distributed algorithm to assess the connectivity quality of a network, be it physical or logical. In large complex networks, some nodes may play a vital role due to their position (e.g., for routing or network reliability). Assessing global properties of a graph, such as the importance of nodes, usually involves a great deal of communication; doing so while keeping the overhead low is an open challenge. To that end, centrality notions have been introduced by researchers (see e.g. [Fre77]) to rank nodes as a function of their importance in the topology. Some of these techniques are based on computing the fraction of shortest paths that pass through each graph node. This approach has a limitation: nodes "close" to the shortest paths do not get a better score than any other arbitrary nodes. To avoid this drawback, the physicist Newman proposed a centralized measure of betweenness centrality [New03] based on random walks: counting, for each node, the number of visits of a random walk travelling from a source node i to a target node j, and then averaging this value over all source/target pairs of the graph. Yet this approach relies on full knowledge of the graph at each node, as the target node of the random walk must be known in advance; this is not an option in large-scale networks. We propose a distributed solution that relies on a single random walk travelling through the graph; each node only needs to be aware of its topological neighbors to forward the walk.
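The single-random-walk, neighbor-only primitive described above can be sketched as follows. This is an illustrative simulation under assumed conventions (the toy graph, step count, and degree normalization are our choices, not the authors' algorithm): each node sees only its own neighbor list when forwarding the walk, and locally counts how often the walk visits it.

```python
import random

def random_walk_visits(adjacency, start, steps, seed=None):
    """Forward a single random walk for `steps` hops from `start`.

    Each forwarding decision uses only the current node's neighbor
    list, mirroring the neighbor-only knowledge assumption.
    Returns a dict mapping node -> number of visits.
    """
    rng = random.Random(seed)
    visits = {node: 0 for node in adjacency}
    current = start
    visits[current] += 1
    for _ in range(steps):
        # Pick a uniformly random neighbor (local information only).
        current = rng.choice(adjacency[current])
        visits[current] += 1
    return visits

# Hypothetical toy topology: node 2 bridges two triangles.
graph = {
    0: [1, 2], 1: [0, 2],
    2: [0, 1, 3, 4],
    3: [2, 4], 4: [2, 3],
}
counts = random_walk_visits(graph, start=0, steps=10000, seed=42)

# Raw visit counts of an unbiased walk converge to the degree-proportional
# stationary distribution, so a score derived from them would typically be
# normalized by degree (an assumption here, not the paper's exact measure).
scores = {n: counts[n] / len(graph[n]) for n in graph}
```

Note that the walk here has no target node, in contrast with Newman's centralized formulation where the target j must be known in advance.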