Blue Eyes Technology
PRESENTED BY:
K.V.VIKRANTH KUMAR

1. INTRODUCTION
Imagine a world where humans and computers interact naturally. You are sitting in front of a personal computer that can listen, talk, or even scream aloud. It can gather information about you and interact with you through techniques such as facial recognition and speech recognition. It can even sense your emotions through the touch of the mouse. It verifies your identity, senses your presence, and begins interacting with you. You ask the computer to dial your friend at his office. It gauges the urgency of the situation through the mouse, dials your friend, and establishes the connection.
Human cognition depends primarily on the ability to perceive, interpret, and integrate audio-visual and sensory information. Adding such perceptual abilities to computers would enable them to work together with human beings as intimate partners.
Researchers are attempting to add capabilities to computers that will allow them to interact like humans: recognize human presence, talk, listen, or even guess a person's feelings.
The Blue Eyes technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings. It uses non-obtrusive sensing methods, employing modern video cameras and microphones, to identify the user's actions through these imparted sensory abilities. The machine can understand what a user wants, where he is looking, and even his physical or emotional state.
2. EMOTION AND COMPUTING
One goal of human-computer interaction (HCI) is to make an adaptive, smart computer system. Such a project could include gesture recognition, facial recognition, eye tracking, and speech recognition. Another non-invasive way to obtain information about a person is through touch. People use their computers to obtain, store, and manipulate data. For a computer to become smart, it must first gain information about its user. Our proposed method for gaining user information through touch is via a computer input device, the mouse. From the physiological data obtained from the user, an emotional state may be determined and related to the task the user is currently performing on the computer. Over time, a user model is built in order to gain a sense of the user's personality. The scope of the project is to have the computer adapt to the user in order to create a better working environment in which the user is more productive. The first steps towards realizing this goal are described here.

Rosalind Picard (1997) describes why emotions are important to the computing community. There are two aspects of affective computing: giving the computer the ability to detect emotions and giving it the ability to express emotions. Not only are emotions crucial for rational decision making, as Picard describes, but emotion detection is an important step towards an adaptive computer system, and such a system has been driving our efforts to detect a person's emotional state.

An important reason for incorporating emotion into computing is the productivity of the computer user. A study (Dryer & Horowitz, 1997) has shown that people with personalities that are similar or that complement each other collaborate well. Dryer (1999) has also shown that people view their computer as having a personality. For these reasons, it is important to develop computers that work well with their users. By matching a person's emotional state with the context of the expressed emotion over a period of time, the person's personality is revealed. Therefore, by giving the computer a longitudinal understanding of its user's emotional state, the computer could adopt a working style that fits the user's personality, and this collaboration could increase the user's productivity.

One way of gaining information from a user non-intrusively is by video: cameras have been used to detect a person's emotional state (Johnson, 1999). We have explored gaining information through touch instead. One obvious place to put sensors is on the mouse. Observation of normal computer usage (creating and editing documents and surfing the web) shows that people spend approximately one third of their total computer time touching their input device. Because of this large amount of contact time, we explore the possibility of detecting emotion through touch.
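The longitudinal user model described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the class name, the task labels, and the idea of reducing the model to per-task emotion counts are all assumptions made for the example.

```python
from collections import Counter, defaultdict

class UserEmotionModel:
    """Hypothetical sketch: accumulate (task, emotion) observations over time
    to build a longitudinal profile of a user's emotional tendencies."""

    def __init__(self):
        # Maps each task name to a frequency count of detected emotions.
        self._history = defaultdict(Counter)

    def record(self, task, emotion):
        """Log one detected emotional state while the user performs a task."""
        self._history[task][emotion] += 1

    def dominant_emotion(self, task):
        """Return the most frequently observed emotion for a task, or None."""
        counts = self._history.get(task)
        if not counts:
            return None
        return counts.most_common(1)[0][0]

model = UserEmotionModel()
model.record("editing", "happiness")
model.record("editing", "anger")
model.record("editing", "happiness")
print(model.dominant_emotion("editing"))  # -> happiness
```

A real system would weight recent observations and factor in the context of each emotion, but even this simple tally captures the paper's idea of a personality profile emerging from repeated emotional-state measurements.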
2.1 TYPES OF EMOTIONAL SENSORS
2.1.1 FOR HAND

One proposed, non-invasive method for gaining user information through touch is via a computer input device, the mouse. This allows the user's cardiac rhythm, body temperature, electrical conductivity of the skin, and other physiological attributes to be related to mood, and has led to the creation of the “Emotion Mouse”. The device can measure heart rate, temperature, galvanic skin response, and minute bodily movements, and matches them with six emotional states: happiness, surprise, anger, fear, sadness, and disgust.
The mouse includes a set of sensors, including infrared detectors and temperature-sensitive chips. These components, the researchers stress, will also be crafted into other commonly used items such as the office chair, the steering wheel, the keyboard, and the phone handle. Integrating the system into the steering wheel, for instance, could allow an alert to be sounded when a driver becomes drowsy.
Information Obtained From the Emotion Mouse:-
1. Behavior
a. Mouse movements
b. Button click frequency
c. Finger pressure when a user presses the buttons
2. Physiological information
a. Heart rate (Electrocardiogram (ECG/EKG), Photoplethysmogram (PPG))
b. Skin temperature (Thermistor)
c. Skin electricity (Galvanic skin response, GSR)
d. Electromyographic activity (Electromyogram, EMG)
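To make the mapping from these sensor readings to the six emotional states concrete, here is a minimal sketch. The reference profiles and the use of nearest-neighbour matching in normalized feature space are illustrative assumptions, not values from the Emotion Mouse work; a real system would learn the profiles from calibration data.

```python
import math

# Hypothetical reference profiles: normalized (heart_rate, skin_temp, gsr,
# somatic_activity) values for each of the six basic emotions. These numbers
# are invented for illustration only.
EMOTION_PROFILES = {
    "happiness": (0.6, 0.7, 0.5, 0.6),
    "surprise":  (0.8, 0.5, 0.7, 0.8),
    "anger":     (0.9, 0.8, 0.9, 0.7),
    "fear":      (0.9, 0.3, 0.8, 0.5),
    "sadness":   (0.4, 0.4, 0.3, 0.2),
    "disgust":   (0.5, 0.5, 0.6, 0.4),
}

def classify_emotion(heart_rate, skin_temp, gsr, activity):
    """Return the emotion whose profile lies nearest (Euclidean distance)
    to the current normalized sensor reading."""
    reading = (heart_rate, skin_temp, gsr, activity)
    return min(
        EMOTION_PROFILES,
        key=lambda e: math.dist(reading, EMOTION_PROFILES[e]),
    )

print(classify_emotion(0.9, 0.8, 0.9, 0.7))  # -> anger
```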
Prototype:-
2.1.2 FOR EYES
Fig. A wearable device which allows any viewer to visualize the confusion and interest levels of the wearer.
Other recent developments in related technology attempt to learn the needs of the user just by following the interaction between the user and the computer, in order to know what he or she is interested in at any given moment. For example, by remembering the type of websites the user visits according to mood and time of day, the computer could search related sites and suggest the results to the user.
2.1.3 FOR BODY:-
A jacket with an embedded sensor net worn by a conductor helps extend the conductor’s ability to express emotion
and intentionality.
2.1.4 FOR SPEECH:-
A personalized conversational speech interface agent designed for affective communication, which adjusts its speaking style to the user.
3. THEORIES AND TECHNOLOGIES
3.1 PAUL EKMAN’S FACIAL EXPRESSION

Based on Paul Ekman’s facial expression work, we see a correlation between a person’s emotional state and a person’s physiological measurements. Selected works from Ekman and others on measuring facial behaviors describe Ekman’s Facial Action Coding System (Ekman and Rosenberg, 1997). One of his experiments involved participants attached to devices to record certain measurements including pulse, galvanic skin response (GSR), temperature, somatic movement and blood pressure. He then recorded the measurements as the participants were instructed to mimic facial expressions which corresponded to the six basic emotions. He defined the six basic emotions as anger, fear, sadness, disgust, joy and surprise. From this work, Dryer (1993) determined how physiological measures could be used to distinguish various emotional states.
Six participants were trained to exhibit the facial expressions of the six basic emotions. While each participant exhibited these expressions, the physiological changes associated with affect were assessed. The measures taken were GSR, heart rate, skin temperature, and general somatic activity (GSA). These data were then subjected to two analyses. For the first analysis, a multidimensional scaling (MDS) procedure was used to determine the dimensionality of the data. This analysis suggested that the physiological similarities and dissimilarities of the six emotional states fit within a four-dimensional model. For the second analysis, a discriminant function analysis was used to determine the mathematical functions that would distinguish the six emotional states. This analysis suggested that all four physiological variables made significant, non-redundant contributions to the functions that distinguish the six states. Moreover, these analyses indicate that the four physiological measures are sufficient to reliably determine a person's specific emotional state. Because of our need to incorporate these measurements into a small, non-intrusive form, we explore taking them from the hand. Skin conductivity is best measured at the fingers. The other measures, however, may not be as obvious or robust. We hypothesize that changes in finger temperature are reliable predictors of emotion, and that GSA can be measured through changes in the movement of the computer mouse. Our efforts to develop a robust pulse meter are not discussed here.
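The discriminant-analysis idea above, learning functions that separate emotional states from labeled (GSR, heart rate, skin temperature, GSA) recordings, can be sketched with the simplest possible discriminant: per-emotion centroids learned from training samples. The sample values and the nearest-centroid rule are illustrative assumptions, not the statistical procedure Dryer actually used.

```python
from statistics import mean

def train_centroids(samples):
    """samples: list of (label, (gsr, heart_rate, skin_temp, gsa)) tuples.
    Returns the per-emotion mean vector, the simplest discriminant model."""
    by_label = {}
    for label, vec in samples:
        by_label.setdefault(label, []).append(vec)
    return {
        label: tuple(mean(dim) for dim in zip(*vecs))
        for label, vecs in by_label.items()
    }

def predict(centroids, vec):
    """Assign a reading to the emotion with the nearest centroid."""
    return min(
        centroids,
        key=lambda lbl: sum((a - b) ** 2 for a, b in zip(vec, centroids[lbl])),
    )

# Invented training data: two labeled recordings per emotion.
data = [
    ("joy",   (0.2, 70, 33.0, 0.6)),
    ("joy",   (0.3, 72, 33.2, 0.7)),
    ("anger", (0.8, 95, 34.5, 0.9)),
    ("anger", (0.9, 98, 34.7, 0.8)),
]
centroids = train_centroids(data)
print(predict(centroids, (0.85, 96, 34.6, 0.9)))  # -> anger
```

A proper replication would use linear discriminant functions fitted over all six emotions, but the centroid version shows how the four physiological variables jointly separate the states.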
3.2 MANUAL AND GAZE INPUT CASCADED (MAGIC) POINTING
This work explores a new direction in utilizing eye gaze for computer input. Gaze tracking has long been considered an alternative or potentially superior pointing method for computer input, but many fundamental limitations exist with traditional gaze pointing. In particular, it is unnatural to overload a perceptual channel such as vision with a motor control task. We therefore propose an alternative approach, dubbed MAGIC (Manual And Gaze Input Cascaded) pointing. With this approach, pointing appears to the user to be a manual task, used for fine manipulation and selection; however, a large portion of the cursor movement is eliminated by warping the cursor to the eye-gaze area, which encompasses the target. Two specific MAGIC pointing techniques, one conservative and one liberal, were designed, analyzed, and implemented with an eye tracker we developed, and then tested in a pilot study. This early-stage exploration showed that the MAGIC pointing techniques might offer many advantages, including reduced physical effort and fatigue compared with traditional manual pointing, greater accuracy and naturalness than traditional gaze pointing, and possibly faster speed than manual pointing. The pros and cons of the two techniques are discussed in light of both the performance data and subjective reports.
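The core of MAGIC pointing, warping the cursor to the gaze area and leaving fine positioning to the hand, can be sketched as follows. The distance threshold, the `Point` type, and the decision rule are illustrative assumptions loosely modeled on the liberal technique, not the published implementation.

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def magic_warp(cursor, gaze, threshold=120.0):
    """Liberal-style MAGIC pointing sketch: if the gaze lands far from the
    current cursor, warp the cursor into the gaze area; otherwise leave it
    alone so the hand handles fine manipulation. `threshold` (in pixels)
    is an invented value, not one from the paper."""
    dist = ((cursor.x - gaze.x) ** 2 + (cursor.y - gaze.y) ** 2) ** 0.5
    if dist > threshold:
        # Eliminate the large cursor movement: jump near the gazed target.
        return Point(gaze.x, gaze.y)
    # Close enough: fine selection stays a purely manual task.
    return cursor

warped = magic_warp(Point(100, 100), Point(800, 450))
print((warped.x, warped.y))  # -> (800, 450)
```

The conservative technique differs in that it waits for the user to begin moving the mouse before warping, which avoids over-active cursor jumps at the cost of a short delay.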

