Underpinning many recent advances in sensing applications (e.g., mHealth) is the ability to safely collect and share mobile sensor data. Research has shown that an ever-expanding set of potentially sensitive user behaviors can be inferred even from seemingly harmless sensors (e.g., accelerometers, gyroscopes, or magnetometers). Providing robust anonymity assurances is a principal mechanism for protecting users when data is shared (e.g., with medical professionals or friends). In this paper, we study the feasibility of user de-anonymization from mobile sensor datasets routinely collected on commodity devices (e.g., smartphones). We perform a systematic investigation to quantify the threat of de-anonymization using existing sparsity-based techniques adapted to exploit the characteristics of mobile sensor data. This preliminary study indicates that significant threats to user anonymity exist within shared mobile sensor data and that further investigation is warranted.
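To make the sparsity-based attack concrete, the following is a minimal sketch of the general scoring idea behind such techniques (in the style of Narayanan and Shmatikov's de-anonymization of sparse datasets): an adversary with auxiliary knowledge of a few of a victim's attributes scores every record in the anonymized dataset, weighting rare matching attributes more heavily, and claims a match only when the top score stands out from the rest. All names, the feature encoding, and the eccentricity threshold here are illustrative assumptions, not details from the paper.

```python
# Sketch of sparsity-based de-anonymization scoring.
# Records are dicts mapping (hypothetical) discretized sensor features to
# values; 'aux' is the adversary's partial auxiliary knowledge of one user.
import math
from collections import Counter

def score(aux, candidate, support):
    """Weighted similarity: matches on rare attributes count more."""
    s = 0.0
    for attr, value in aux.items():
        if candidate.get(attr) == value:
            # Down-weight attributes that appear in many records.
            s += 1.0 / math.log(1 + support[attr])
    return s

def deanonymize(aux, dataset, phi=1.5):
    """Return the best-matching record id, or None if the best score
    does not stand out from the runner-up by phi standard deviations."""
    support = Counter()  # how many records contain each attribute
    for rec in dataset.values():
        support.update(rec.keys())
    scores = {rid: score(aux, rec, support) for rid, rec in dataset.items()}
    ranked = sorted(scores, key=scores.get, reverse=True)
    if len(ranked) < 2:
        return ranked[0] if ranked else None
    vals = list(scores.values())
    mean = sum(vals) / len(vals)
    std = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5 or 1e-9
    best, second = scores[ranked[0]], scores[ranked[1]]
    # Eccentricity test: only claim a match if it is clearly separated.
    return ranked[0] if (best - second) / std >= phi else None
```

In practice, the adaptation to mobile sensor data would replace the exact-match test with a tolerance-based comparison of continuous sensor features; the eccentricity check is what keeps the attacker from matching when several records look equally plausible.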