This paper describes a system for automatically rating content - mainly movies and videos - at multiple granularities. Our key observation is that the rich set of sensors on today's smartphones and tablets can capture a wide spectrum of user reactions while users watch movies on these devices. Examples range from the acoustic signature of laughter, indicating a funny scene, to the stillness of the tablet, suggesting intense drama. Moreover, unlike in most conventional systems, these ratings need not collapse into a single numeric score, but can capture multiple dimensions of the user's experience. We combine these ideas into an Android-based prototype called Pulse, and test it with 11 users, each of whom watched 4 to 6 movies on Samsung tablets. Encouraging results show a consistent correlation between users' actual ratings and those generated by the system. With more rigorous testing and optimization, Pulse could be a candidate for real-world adoption.
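To make the tablet-stillness cue concrete, here is a minimal sketch of one way such a signal might be computed: treating a low variance in recent accelerometer magnitude samples as "stillness." This is an illustration only, not the paper's actual method; the function name, window handling, and threshold value are assumptions.

```python
import statistics

def is_still(accel_magnitudes, threshold=0.05):
    """Heuristically decide whether the device appears still.

    accel_magnitudes: recent accelerometer magnitude samples (m/s^2),
    e.g. one short sliding window of readings.
    threshold: variance cutoff separating "resting" from "handled"
    (a hypothetical value; a real system would calibrate it).
    """
    # A device lying still reads a near-constant ~9.81 m/s^2 (gravity),
    # so the sample variance stays very small.
    return statistics.pvariance(accel_magnitudes) < threshold
```

A near-constant window such as `[9.81, 9.80, 9.82, 9.81]` would be classified as still, while a jittery one such as `[9.2, 10.5, 8.7, 11.0]` would not; a real pipeline would smooth over many windows before attributing stillness to engagement.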