The Internet has been evolving into a more heterogeneous internetwork, with diverse new applications imposing increasingly stringent bandwidth and QoS requirements. Applications such as YouTube, Hulu, and Netflix already consume a large fraction of total bandwidth. We argue that, in order to engineer future internets so that they can adequately serve their increasingly diverse and complex set of applications while using resources efficiently, it is critical to be able to characterize the load that emerging and future applications place on the underlying network. In this article, we investigate entropy as a metric for characterizing per-flow network traffic complexity. While previous work has analyzed aggregated network traffic, we focus on studying isolated traffic flows. Per-application flow characterization serves network control functions such as traffic scheduling and admission control at the edges of the network, which must differentiate network traffic on a per-application basis. The "entropy fingerprints" that we obtain from our entropy estimator summarize many characteristics of each application's network traffic. Not only can we compare applications on the basis of peak entropy, but we can also categorize them based on a number of other properties of the fingerprints.
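As a minimal sketch of the underlying idea, the snippet below computes the empirical Shannon entropy of one observable of an isolated flow (here, its packet-size sequence, a hypothetical choice for illustration). The article's actual fingerprints come from a dedicated entropy estimator, so this is only an assumed simplification of how a single per-flow entropy value might be derived:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Empirical Shannon entropy, in bits per symbol, of a sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    # H = -sum p(x) log2 p(x), with p(x) estimated by relative frequency
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical flow: a mix of full-size data packets (1500 B) and ACKs (40 B)
packet_sizes = [1500, 1500, 40, 1500, 576, 40, 1500, 1500]
print(round(shannon_entropy(packet_sizes), 3))  # prints 1.299
```

A flow that always sends identically sized packets would score 0 bits, while a flow with a rich mix of packet sizes scores higher; tracking such values over a flow's lifetime is one way a per-flow "fingerprint" could be assembled.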