24-11-2010, 10:56 AM
This article is submitted by: Arpita Nayak
ABSTRACT ON SIXTH SENSE
Mobile phones are the most widespread ubiquitous devices. Because of their inherent context awareness, they are increasingly used to interact with the user’s current surroundings and with nearby real-world objects. Sensors integrated into today’s mobile phones, such as GPS receivers, compasses, accelerometers and cameras, allow users not only to view digital information about such objects but also to manipulate it.
The project “SixthSense”, a mobile gesture-controlled system, has attracted considerable interest. Its portable combination of a common webcam, a laptop computer and a tiny projector augments arbitrary surfaces and objects with projected information while triggering actions through natural hand gestures. Because the user’s gestures are detected visually, the computer involved recedes into the background. Still, the system relies on a laptop computer, which has to be carried in a backpack when used on the move.
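At the heart of the visual gesture detection described above is tracking coloured markers worn on the user’s fingertips in the webcam image. The following is only an illustrative sketch of that colour-segmentation idea, not the project’s actual code: the `find_marker` helper and the threshold values are assumptions, and plain NumPy stands in for a real computer-vision library.

```python
import numpy as np

def find_marker(frame, lo=(200, 0, 0), hi=(255, 80, 80)):
    """Return the (x, y) centroid of pixels whose RGB values fall inside
    the range [lo, hi] (a red fingertip marker, in this sketch),
    or None when no marker-coloured pixels are present."""
    lo, hi = np.array(lo), np.array(hi)
    # Boolean mask: True where all three channels lie within the range.
    mask = np.all((frame >= lo) & (frame <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    # The centroid of the marker blob acts as the gesture "cursor".
    return int(xs.mean()), int(ys.mean())

# Synthetic 100x100 frame with a small red blob centred at (30, 40).
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[38:43, 28:33] = (230, 20, 20)
print(find_marker(frame))  # -> (30, 40)
```

Per-frame centroids like this can then be tracked over time, so that their trajectories are interpreted as gestures that trigger actions on the projected content.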
Inspired by this work, and with emerging projector phones in mind, we develop a framework that supports hand-gesture manipulation of projected content through a mobile phone. Our aim is to make the mobile phone a wearable, truly unnoticeable mediator between the real and the virtual world, changing the style of human interaction from a device-centric to a content-centric one.