I am in dire need of the circuit diagram and full details of the above-mentioned project.
Please help.
Thank you.
Small devices have inherent limitations: you cannot make buttons and screens bigger without losing the benefit of small size. The main reason for appropriating the human body as an input device is easy access by the hands. The popularity of mobile devices grows day by day thanks to advantages such as portability, mobility, and flexibility. Small size has many benefits, chiefly that we can carry the device comfortably, but it leaves very little interactive surface. In short, we want a large interactive area to use, yet we also want to carry the device easily in a pocket.
Skinput allows the user to simply touch their skin to control audio devices, play games, and make phone calls: it turns the body into a touch-screen surface, using sensors to determine where the user touches their skin.

A related device is the "imaginary interface" located in the palm of the user's hand. This UI is imaginary in the sense that there is nothing there beyond the bare hand. The photo below shows how an imaginary "mobile phone" could be laid out on the user's left palm. As each point is touched, a specific mobile function is activated and announced by a computerized voice. For this system to really work, there must be some way for users to hear the computer, such as a headset, and some way for the computer to know what part of the hand the user is touching. In Gustafson's prototype research this was done with an optical motion tracker, which seemed too clumsy for real-world applications. That's okay, though: we can easily imagine advances in gesture recognition that would be more portable and less intrusive. Although you will not be able to buy such a "phone" any time soon, it is definitely interesting to consider the potential of interfaces where users touch themselves instead of a mobile device.
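To make the sense-then-act pipeline above concrete, here is a minimal sketch of a Skinput-style dispatcher: a nearest-centroid classifier maps a sensor feature vector to a skin location, and a lookup table maps that location to the mobile function to announce. The feature values, location names, and action names are all invented for illustration; the real system uses trained bio-acoustic classifiers, not this toy.

```python
# Hypothetical Skinput-style dispatch. All centroids, locations, and
# actions below are made up for illustration.
import math

# Invented mean feature vector per tap location (a trained model would
# learn these from bio-acoustic sensor data).
CENTROIDS = {
    "palm_center": (0.9, 0.2, 0.1),
    "index_tip":   (0.2, 0.8, 0.3),
    "wrist":       (0.1, 0.3, 0.9),
}

# Invented mapping from skin location to the function it triggers.
ACTIONS = {
    "palm_center": "answer_call",
    "index_tip":   "play_music",
    "wrist":       "mute",
}

def classify(features):
    """Return the location whose centroid is nearest to `features`."""
    return min(CENTROIDS, key=lambda loc: math.dist(features, CENTROIDS[loc]))

def dispatch(features):
    """Classify a tap and return the function name to announce/execute."""
    return ACTIONS[classify(features)]

# A tap whose features resemble the palm-center centroid:
print(dispatch((0.85, 0.25, 0.15)))  # → answer_call
```

In a real device the returned action name would be spoken back to the user over the headset, closing the "touch, then hear the computerized voice" loop described above.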