New classes of random graphs have recently been shown to exhibit the small world phenomenon: they are clustered like regular lattices and yet have small average path lengths like traditional random graphs. Small world behaviour has been observed in a number of real-life networks, and hence these random graphs represent a useful modelling tool. In particular, Grindrod [Phys. Rev. E 66 (2002) 066702-1] has proposed a class of range-dependent random graphs for modelling proteome networks in bioinformatics. A property of these graphs is that, when suitably ordered, most edges in the graph are short-range, in the sense that they connect near-neighbours, and relatively few are long-range. Grindrod also posed an inverse problem: given a graph that is known to be an instance of a range-dependent random graph, but with its vertices in arbitrary order, can we reorder the vertices so that the short-range/long-range connectivity structure becomes apparent? When the graph is viewed in terms of its adjacency matrix, this becomes a problem in sparse matrix theory: find a symmetric row/column reordering that places most nonzeros close to the diagonal. Algorithms of this general nature have been proposed for other purposes, most notably for reordering to reduce fill-in and for clustering large data sets. Here, we investigate their use in the small world reordering problem. Our numerical results suggest that a spectral reordering algorithm is extremely promising, and we give some theoretical justification for this observation via the maximum likelihood principle.
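The experiment described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: it assumes a range-dependent edge probability of the form alpha * lambda**(|i-j|-1) (with alpha set to 1 so the graph is connected), and it uses the standard Fiedler-vector spectral ordering, i.e. sorting vertices by the eigenvector of the second-smallest Laplacian eigenvalue. All parameter names and values are illustrative choices.

```python
import numpy as np

def range_dependent_graph(n, alpha=1.0, lam=0.5, rng=None):
    """Sample a symmetric 0/1 adjacency matrix with range-dependent edges:
    P(i ~ j) = alpha * lam**(|i-j| - 1).  With alpha = 1, consecutive
    vertices are always joined, so the graph is connected."""
    rng = np.random.default_rng(rng)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < alpha * lam ** (j - i - 1):
                A[i, j] = A[j, i] = 1.0
    return A

def spectral_reordering(A):
    """Return a vertex ordering given by sorting the Fiedler vector
    (eigenvector of the second-smallest eigenvalue of the Laplacian),
    which tends to pull nonzeros towards the diagonal."""
    L = np.diag(A.sum(axis=1)) - A      # graph Laplacian
    w, V = np.linalg.eigh(L)            # eigenvalues in ascending order
    return np.argsort(V[:, 1])          # sort by the Fiedler vector

def mean_bandwidth(M):
    """Average distance of nonzeros from the diagonal."""
    i, j = np.nonzero(M)
    return np.abs(i - j).mean()

n = 100
rng = np.random.default_rng(0)
A = range_dependent_graph(n, rng=rng)   # naturally ordered instance
perm = rng.permutation(n)               # hide the natural ordering
B = A[np.ix_(perm, perm)]               # shuffled adjacency matrix
order = spectral_reordering(B)
C = B[np.ix_(order, order)]             # spectrally reordered matrix

print(mean_bandwidth(B), mean_bandwidth(C))
```

The reordered matrix should have a markedly smaller mean bandwidth than the shuffled one, recovering the short-range/long-range structure up to reflection of the ordering (the Fiedler vector's sign is arbitrary).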