Computer forensics in forensis
ACM SIGOPS Operating Systems Review
Forensic analysis is the process of understanding, re-creating, and analyzing arbitrary events that have previously occurred. It seeks to answer such questions as how an intrusion occurred, what an attacker did during an intrusion, and what the effects of an attack were. Currently, the field of computer forensics is largely ad hoc: data is generally collected because applications log it for debugging purposes or because someone thought it might be important. Practical forensic analysis has traditionally traded off analyzability against the amount of data recorded. Recording less data places a smaller burden both on computer systems and on the humans who analyze them, but recording too little data leaves analysts drawing their conclusions from inference rather than deduction. This dissertation presents a model of forensic analysis, called Laocoön, designed to determine what data is necessary to understand past events. The model builds upon an earlier model used for intrusion detection, the requires/provides model, and is grounded in a set of qualities we believe a good forensic model should possess. Those qualities are in turn influenced by a set of five principles of computer forensic analysis. We apply Laocoön to examples and present the results for a UNIX system. The results demonstrate how the model can be used to record smaller amounts of highly useful data, rather than forcing a choice between an overwhelming amount of data and so little data as to be effectively useless.
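To make the underlying idea concrete, the requires/provides style of reasoning the abstract mentions can be sketched as follows. This is a hypothetical illustration, not the dissertation's actual formalism: event and capability names are invented, and the matching logic is deliberately minimal. Each event *provides* capabilities that later events may *require*, so an analyst can link logged events into a causal chain.

```python
# Hypothetical sketch of requires/provides chaining. All names here
# (Event, link_events, the capability labels) are illustrative and do
# not come from the dissertation itself.
from dataclasses import dataclass, field


@dataclass
class Event:
    name: str
    requires: set = field(default_factory=set)
    provides: set = field(default_factory=set)


def link_events(events):
    """Return (earlier, later) name pairs where an earlier event
    provides a capability that a later event requires."""
    links = []
    for i, earlier in enumerate(events):
        for later in events[i + 1:]:
            if earlier.provides & later.requires:
                links.append((earlier.name, later.name))
    return links


# A toy intrusion trace: a remote login yields a shell, the shell is
# used to obtain credentials, and the credentials enable escalation.
trace = [
    Event("ssh_login", provides={"shell"}),
    Event("read_passwd", requires={"shell"}, provides={"credentials"}),
    Event("su_root", requires={"credentials"}),
]
print(link_events(trace))
# → [('ssh_login', 'read_passwd'), ('read_passwd', 'su_root')]
```

In this toy form, the linked pairs are exactly the data a forensic logger would need to retain to reconstruct the chain, which is the spirit of recording a smaller amount of highly useful data.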