Data integrity is an important aspect of storage and network security: no security strategy is complete without assurance that stored and transmitted data remain unaltered. Data integrity assurance provides the reliability that most computer systems and network applications require. This paper proposes a new technique for improving the detection of data integrity violations. The technique is based on the Check Determinant Factor (CDF) as a measure of data integrity assurance: a Determinant Factor (DF) is appended to each data matrix before the data are stored or transmitted, and the DF is recomputed on retrieval to verify that the data are intact. Simulation results show that the new method outperforms traditional methods such as Hamming codes and RAID.
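The abstract does not specify how the Determinant Factor is computed, so the following is only a minimal sketch of the append/recompute workflow it describes, under the assumption that the DF is the determinant of the data matrix. The function names (`append_df`, `verify_df`) and the 2x2 matrix size are illustrative choices, not part of the paper.

```python
def det2(m):
    """Determinant of a 2x2 matrix: ad - bc (assumed DF computation)."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def append_df(matrix):
    """Before storing/transmitting: pair the data matrix with its DF."""
    return matrix, det2(matrix)

def verify_df(matrix, stored_df):
    """On retrieval: recompute the DF and compare with the stored value."""
    return det2(matrix) == stored_df

# Unmodified data passes the check.
matrix, df = append_df([[3, 1], [4, 2]])   # DF = 3*2 - 1*4 = 2
print(verify_df(matrix, df))               # True

# A single altered element changes the determinant, so the
# recomputed DF no longer matches and the violation is detected.
tampered = [[3, 1], [4, 5]]                # DF would be 3*5 - 1*4 = 11
print(verify_df(tampered, df))             # False
```

Note that, like any checksum-style scheme, a determinant check can in principle miss a modification that happens to preserve the determinant; the sketch only illustrates the store/recompute/compare flow, not the paper's detection-rate claims.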