Botnet-based distributed denial of service (DDoS) attacks represent an emerging and sophisticated threat to today's Internet. Attackers can now mimic the behavior of legitimate users to a great extent, which makes countering these attacks very challenging. In this paper, we propose a simple yet effective scheme that enables an ISP's edge routers to pass a large percentage of legitimate traffic destined for a web server under DDoS attack within that ISP, while filtering all other traffic. The proposed scheme, called JUST-Google, builds on the observation that web search engines (especially Google™) are the entrance to today's web, which places them in a strategic position to defend against such attacks. The main idea is that Google™ can help distinguish human users from bot programs by directing users who want to access a web site under attack to a group of nodes that perform authentication: users must solve a reverse Turing test before they are granted access to the web server. Performance analysis shows that the proposed scheme enables legitimate clients to access a web site under attack with high probability.
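To make the admission flow concrete, the following is a minimal sketch, not taken from the paper, of how the two roles could interact: an authentication node grants a short-lived capability only after the client passes the reverse Turing test, and an edge router forwards traffic to the protected server only when it sees a valid, unexpired capability. The HMAC-based token format, the shared key between authentication nodes and edge routers, the lifetime value, and all names are illustrative assumptions; the paper does not specify these mechanisms.

```python
import hashlib
import hmac
import secrets
import time
from typing import Optional

# Assumed shared secret between the authentication nodes and the ISP's edge
# routers; how this key is distributed is outside the scope of this sketch.
SHARED_KEY = secrets.token_bytes(32)
TOKEN_LIFETIME = 300  # seconds a granted capability stays valid (illustrative)


def issue_capability(client_ip: str, captcha_solved: bool) -> Optional[str]:
    """Authentication node: grant a capability only after the client has
    solved the reverse Turing test (e.g. a CAPTCHA)."""
    if not captcha_solved:
        return None
    expiry = int(time.time()) + TOKEN_LIFETIME
    msg = f"{client_ip}|{expiry}".encode()
    tag = hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()
    return f"{expiry}:{tag}"


def edge_router_allows(client_ip: str, capability: Optional[str]) -> bool:
    """Edge router: while the server is under attack, forward only traffic
    whose source presents a valid, unexpired capability; drop the rest."""
    if not capability:
        return False
    try:
        expiry_str, tag = capability.split(":")
        expiry = int(expiry_str)
    except ValueError:
        return False
    if expiry < time.time():
        return False
    msg = f"{client_ip}|{expiry}".encode()
    expected = hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)


if __name__ == "__main__":
    # Legitimate user: solves the test, receives a capability, is admitted.
    cap = issue_capability("198.51.100.7", captcha_solved=True)
    print(edge_router_allows("198.51.100.7", cap))   # True

    # Bot: cannot solve the reverse Turing test, so its traffic is filtered.
    print(edge_router_allows("203.0.113.9", None))   # False
```

Binding the capability to the client's source address and an expiry time, as in this sketch, is one plausible way for stateless edge routers to verify admission decisions made elsewhere; the actual filtering criteria used by JUST-Google may differ.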