This paper shows how to construct a generative model for graph structure. We commence from a sample of graphs in which the correspondences between nodes are unknown ab initio, and in which structural differences may be present, i.e. variations in the number of nodes and in the edge structure. The idea underpinning the method is to embed the nodes of the graphs into a vector space by performing kernel PCA on the heat kernel. The co-ordinates of the nodes are determined by the eigenvalues and eigenvectors of the Laplacian matrix, together with a time parameter that can be used to scale the embedding. Node correspondences are located by applying the Scott and Longuet-Higgins algorithm to the embedded nodes. We capture variations in graph structure using the covariance matrix of the corresponding embedded point positions, and construct a point-distribution model for the embedded node positions from the eigenvalues and eigenvectors of this covariance matrix. We show how to use the model both to project individual graphs into the eigenspace of the point-position covariance matrix and to fit the model to potentially noisy graphs so as to reconstruct the Laplacian matrix. We illustrate the utility of the resulting method for shape analysis using data from the COIL database.
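The heat-kernel embedding step described above can be sketched in a few lines of numpy. This is an illustrative reconstruction, not the authors' code: the function name is hypothetical, the combinatorial Laplacian L = D - A is assumed, and the coordinates are taken as Y = exp(-tΛ/2)Φᵀ so that YᵀY recovers the heat kernel h(t) = Φ exp(-tΛ)Φᵀ.

```python
import numpy as np

def heat_kernel_embedding(A, t=1.0):
    """Embed graph nodes into a vector space via the heat kernel.

    A : symmetric adjacency matrix of the graph.
    t : time parameter that scales the embedding.
    Returns Y, whose columns are the embedded node co-ordinates.
    """
    D = np.diag(A.sum(axis=1))
    L = D - A                           # combinatorial graph Laplacian
    lam, phi = np.linalg.eigh(L)        # eigenvalues / eigenvectors of L
    # Heat kernel: h(t) = phi @ diag(exp(-t*lam)) @ phi.T.
    # Taking Y = diag(exp(-t*lam/2)) @ phi.T gives Y.T @ Y = h(t),
    # so the columns of Y are kernel-PCA style node co-ordinates.
    Y = np.diag(np.exp(-0.5 * t * lam)) @ phi.T
    return Y

# Example: embed the nodes of a 4-cycle graph.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
Y = heat_kernel_embedding(A, t=1.0)
H = Y.T @ Y                             # recovered heat kernel h(t)
```

Increasing t damps the co-ordinates associated with large Laplacian eigenvalues, which is the sense in which the time parameter scales the embedding; correspondences between two such embedded point sets could then be sought with the Scott and Longuet-Higgins algorithm.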