Skinput is an input technology that uses bio-acoustic sensing to locate finger taps on the skin. When augmented with a pico-projector, the system can provide a direct-manipulation graphical user interface on the body. The technology was developed by Chris Harrison, Desney Tan, and Dan Morris in Microsoft Research's Computational User Experiences Group. Skinput represents one way to decouple input from electronic devices, allowing the devices to become smaller without simultaneously shrinking the surface area on which input can be performed. While other systems, such as SixthSense, have attempted this with computer vision, Skinput employs acoustics, taking advantage of the human body's natural sound-conductive properties (e.g., bone conduction). This allows the body to be appropriated as an input surface without the skin being invasively instrumented with sensors, tracking markers, or other items.
Microsoft has not commented on the future of the project, other than that it is in active development. It has been reported that the technology may not appear in commercial devices for at least two years. Skinput has been publicly demonstrated as an armband worn on the biceps. This prototype contains ten small cantilevered piezoelectric elements configured to be highly resonant, sensitive to frequencies between 25 and 78 Hz. This configuration acts like a mechanical fast Fourier transform and provides strong suppression of out-of-band noise, allowing the system to function even while the user is in motion. From the upper arm, the sensors can localize finger taps delivered to any part of the arm, all the way down to the fingertips, with accuracies exceeding 90% (as high as 96% for five input locations). Classification is driven by a support vector machine using a set of time-independent acoustic features that act as a fingerprint. Like speech recognition systems, the Skinput recognition engine must be trained on the "sound" of each input location before use. After training, locations can be bound to interactive functions such as pause/play song, increase/decrease music volume, speed dial, and menu navigation.
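The classification pipeline described above can be illustrated with a small sketch: each tap is summarized as a fixed-length vector of per-channel amplitudes (one value per resonant piezo element, standing in for the "time-independent acoustic fingerprint"), and a support vector machine is trained on labeled taps from each input location. The synthetic data, the choice of amplitude-only features, and the RBF kernel are illustrative assumptions, not details from the Skinput paper; a minimal sketch using scikit-learn:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(42)

N_CHANNELS = 10   # ten resonant piezo elements, as in the prototype
N_LOCATIONS = 5   # five tap locations, matching the reported 96% case
TAPS = 60         # synthetic training taps per location (assumption)

# Hypothetical fingerprints: each location excites the sensor channels
# with its own characteristic amplitude profile; real features would also
# include spectral and channel-ratio measurements.
profiles = rng.uniform(0.2, 1.0, size=(N_LOCATIONS, N_CHANNELS))

# One noisy observation per tap, labeled by tap location.
X = np.vstack([
    profiles[loc] + rng.normal(0.0, 0.05, size=(TAPS, N_CHANNELS))
    for loc in range(N_LOCATIONS)
])
y = np.repeat(np.arange(N_LOCATIONS), TAPS)

# The per-user "training" step: fit the SVM on example taps.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
clf = SVC(kernel="rbf")  # the paper names an SVM; kernel is an assumption
clf.fit(X_train, y_train)

# After training, a new tap's fingerprint maps to a location index,
# which an application could bind to an action (play/pause, dial, ...).
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

With well-separated synthetic profiles the classifier scores near-perfectly; the point of the sketch is only the shape of the pipeline: per-location training examples in, a location label out for each new tap.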
Although an internal Microsoft Research project, Skinput has been publicly demonstrated several times. Its first public appearance was at Microsoft's TechFest 2010, where the recognition model was trained live on stage during the presentation, followed by an interactive walkthrough of a simple mobile application with four modes: music player, email inbox, Tetris, and voicemail. A similar live demonstration was given at the ACM CHI 2010 conference, where the academic paper received the "Best Paper" award, and attendees were allowed to try the system. Numerous media outlets have covered the technology, several with live demonstrations.