Global-state networks provide a powerful mechanism for modeling the increasing heterogeneity of data generated by modern systems. Such a network comprises a series of network snapshots with dynamic local states at the nodes, together with a global network state indicating the occurrence of an event. Mining discriminative subgraphs from global-state networks allows us to identify the influential sub-networks that have the greatest impact on the global state and to unearth the complex relationships between the local entities of a network and their collective behavior. In this paper, we explore this problem and design a technique called MINDS to mine minimally discriminative subgraphs from large global-state networks. To combat the exponential subgraph search space, we derive the concept of an edit map and perform Metropolis-Hastings sampling on it to compute the answer set. Furthermore, we formulate the idea of network-constrained decision trees to learn prediction models that adhere to the underlying network structure. Extensive experiments on real datasets demonstrate excellent prediction quality. Additionally, MINDS achieves a speed-up of at least four orders of magnitude over baseline techniques.
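To illustrate the sampling idea at a high level, the sketch below runs Metropolis-Hastings over a space of candidate subgraphs, where each proposal is a single edit (toggling one edge) and moves are accepted with the standard MH ratio on a score. This is only a minimal illustration under assumed names: the toy network `EDGES`, the stand-in `score` function, and the single-edge proposal are all hypothetical and do not reproduce MINDS's actual edit map or discriminative-power objective.

```python
import random

# Hypothetical sketch: Metropolis-Hastings sampling over subgraph "edits".
# States are edge subsets of a toy network; one edit toggles a single edge.

EDGES = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]  # assumed toy network

def score(subgraph):
    """Toy stand-in for a discriminative score (higher = more discriminative)."""
    target = {(0, 1), (1, 2)}   # pretend these edges drive the global state
    return 1.0 + len(subgraph & target)  # strictly positive so ratios are defined

def propose(subgraph):
    """One edit: toggle a random edge in or out of the candidate subgraph."""
    nxt = set(subgraph)
    nxt.symmetric_difference_update({random.choice(EDGES)})
    return nxt

def metropolis_hastings(steps=2000, seed=0):
    random.seed(seed)
    current, best = set(), set()
    for _ in range(steps):
        cand = propose(current)
        # The proposal is symmetric, so the acceptance probability
        # reduces to min(1, score(cand) / score(current)).
        if random.random() < min(1.0, score(cand) / score(current)):
            current = cand
        if score(current) > score(best):
            best = current
    return best

if __name__ == "__main__":
    print(sorted(metropolis_hastings()))
```

Because the proposal only ever changes one edge, the chain explores the neighborhood structure of an edit map rather than enumerating the exponential subgraph space, which is the key efficiency idea the abstract alludes to.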