Blue eyes (Download Full Report And Abstract)




BY:
DIKUL SINGH



INTRODUCTION

Imagine yourself in a world where humans interact with computers. You are sitting in front of your personal computer, which can listen, talk, or even scream aloud. It has the ability to gather information about you and interact with you through special techniques such as facial recognition and speech recognition. It can even understand your emotions at the touch of the mouse. It verifies your identity, feels your presence, and starts interacting with you. You ask the computer to dial your friend at his office. It realizes the urgency of the situation through the mouse, dials your friend at his office, and establishes a connection.
Human cognition depends primarily on the ability to perceive, interpret, and integrate audio-visual and sensory information. Adding extraordinary perceptual abilities to computers would enable them to work together with human beings as intimate partners. Researchers are attempting to add more capabilities to computers that will allow them to interact like humans: recognize human presence, talk, listen, and even guess the user's feelings.
The BLUE EYES technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings. It uses non-obtrusive sensing methods, employing modern video cameras and microphones, to identify the user's actions through these imparted sensory abilities. The machine can understand what a user wants, where he is looking, and even his physical or emotional state.

EMOTION MOUSE

One goal of human-computer interaction (HCI) is to make an adaptive, smart computer system. Such a project could include gesture recognition, facial recognition, eye tracking, speech recognition, etc. Another non-invasive way to obtain information about a person is through touch. People use their computers to obtain, store, and manipulate data. In order to start creating smart computers, the computer must first gain information about the user. Our proposed method for gaining user information through touch is via a computer input device, the mouse. From the physiological data obtained from the user, an emotional state may be determined, which would then be related to the task the user is currently performing on the computer. Over a period of time, a user model is built in order to gain a sense of the user's personality. The scope of the project is to have the computer adapt to the user in order to create a better working environment in which the user is more productive. The first steps towards realizing this goal are described here.
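To make the data flow concrete, the sketch below shows how a sensor-equipped mouse might be polled and its readings logged for later analysis. This is a minimal illustration in Python; the sensor interface (read_gsr, read_skin_temp, and so on) is a hypothetical stand-in for whatever driver API an actual instrumented mouse would expose.

import time
from dataclasses import dataclass

@dataclass
class MouseSample:
    timestamp: float
    gsr: float         # galvanic skin response
    skin_temp: float   # finger temperature (deg C)
    heart_rate: float  # beats per minute
    somatic: float     # general somatic activity from mouse movement

def collect_samples(sensor, duration_s, hz=10.0):
    # Poll the (hypothetical) sensor-equipped mouse at a fixed rate and
    # accumulate time-stamped physiological samples for later analysis.
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        samples.append(MouseSample(
            timestamp=time.time(),
            gsr=sensor.read_gsr(),
            skin_temp=sensor.read_skin_temp(),
            heart_rate=sensor.read_heart_rate(),
            somatic=sensor.read_somatic_activity(),
        ))
        time.sleep(1.0 / hz)
    return samples

Each batch of samples, paired with the task the user was performing at the time, would feed the longitudinal user model described above.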

EMOTION AND COMPUTING

Rosalind Picard (1997) describes why emotions are important to the computing community. There are two aspects of affective computing: giving the computer the ability to detect emotions and giving the computer the ability to express emotions. Not only are emotions crucial for rational decision making, as Picard describes, but emotion detection is an important step towards an adaptive computer system. The goal of an adaptive, smart computer system has been driving our efforts to detect a person's emotional state. An important motivation for incorporating emotion into computing is the productivity of the computer user. A study (Dryer & Horowitz, 1997) has shown that people with personalities that are similar or complementary collaborate well. Dryer (1999) has also shown that people view their computer as having a personality. For these reasons, it is important to develop computers which can work well with their users.
By matching a person's emotional state with the context in which the emotion is expressed, over a period of time the person's personality is revealed. Therefore, by giving the computer a longitudinal understanding of its user's emotional state, the computer could adopt a working style which fits its user's personality. The result of this collaboration could be increased productivity for the user. One way of gaining information from a user non-intrusively is by video: cameras have been used to detect a person's emotional state (Johnson, 1999). We have explored gaining information through touch, and one obvious place to put sensors is on the mouse. Observations of normal computer usage (creating and editing documents and surfing the web) show that people spend approximately one third of their total computer time touching their input device. Because of this large amount of contact time, we explore the possibility of detecting emotion through touch.

THEORY

Based on Paul Ekman's facial expression work, we see a correlation between a person's emotional state and a person's physiological measurements. Selected works from Ekman and others on measuring facial behaviors describe Ekman's Facial Action Coding System (Ekman and Rosenberg, 1997). In one of his experiments, participants were attached to devices that recorded certain measurements, including pulse, galvanic skin response (GSR), temperature, somatic movement, and blood pressure. He then recorded the measurements as the participants were instructed to mimic facial expressions corresponding to the six basic emotions, which he defined as anger, fear, sadness, disgust, joy, and surprise. From this work, Dryer (1993) determined how physiological measures could be used to distinguish various emotional states.
Six participants were trained to exhibit the facial expressions of the six basic emotions. While each participant exhibited these expressions, the physiological changes associated with affect were assessed. The measures taken were GSR, heart rate, skin temperature, and general somatic activity (GSA). These data were then subjected to two analyses. For the first analysis, a multidimensional scaling (MDS) procedure was used to determine the dimensionality of the data; this analysis suggested that the physiological similarities and dissimilarities of the six emotional states fit within a four-dimensional model. For the second analysis, a discriminant function analysis was used to determine the mathematical functions that would distinguish the six emotional states; this analysis suggested that all four physiological variables made significant, non-redundant contributions to the functions that distinguish the six states. Moreover, these analyses indicate that these four physiological measures are sufficient to reliably determine a person's specific emotional state.

Because of our need to incorporate these measurements into a small, non-intrusive form, we explore taking these measurements from the hand. The conductivity of the skin is best measured at the fingers. However, the other measures may not be as obvious or robust. We hypothesize that changes in the temperature of the finger are a reliable predictor of emotion. We also hypothesize that GSA can be measured from changes in the movement of the computer mouse. Our efforts to develop a robust pulse meter are not discussed here.
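As a concrete illustration of the classification step, the sketch below uses scikit-learn's linear discriminant analysis as a stand-in for the discriminant function analysis described above. The training data here are random placeholders; a real system would use labeled recordings of the four physiological measures.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

EMOTIONS = ["anger", "fear", "sadness", "disgust", "joy", "surprise"]

# X: one row per observation, columns = [gsr, heart_rate, skin_temp, gsa]
# y: emotion labels as indices into EMOTIONS (placeholder data below)
X = np.random.rand(60, 4)
y = np.repeat(np.arange(6), 10)

model = LinearDiscriminantAnalysis().fit(X, y)

def classify(gsr, heart_rate, skin_temp, gsa):
    # Map one set of physiological readings to the most likely emotion.
    idx = model.predict([[gsr, heart_rate, skin_temp, gsa]])[0]
    return EMOTIONS[idx]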

MANUAL AND GAZE INPUT CASCADED (MAGIC) POINTING

This work explores a new direction in utilizing eye gaze for computer input. Gaze tracking has long been considered an alternative or potentially superior pointing method for computer input. We believe that many fundamental limitations exist with traditional gaze pointing; in particular, it is unnatural to overload a perceptual channel such as vision with a motor control task. We therefore propose an alternative approach, dubbed MAGIC (Manual And Gaze Input Cascaded) pointing. With this approach, pointing appears to the user to be a manual task, used for fine manipulation and selection. However, a large portion of the cursor movement is eliminated by warping the cursor to the eye-gaze area, which encompasses the target. Two specific MAGIC pointing techniques, one conservative and one liberal, were designed, analyzed, and implemented with an eye tracker we developed. They were then tested in a pilot study. This early-stage exploration showed that the MAGIC pointing techniques might offer many advantages, including reduced physical effort and fatigue compared with traditional manual pointing, greater accuracy and naturalness than traditional gaze pointing, and possibly faster speed than manual pointing. The pros and cons of the two techniques are discussed in light of both performance data and subjective reports.
In our view, there are two fundamental shortcomings to existing gaze pointing techniques, regardless of the maturity of eye-tracking technology. First, given the one-degree size of the fovea and the subconscious jittery motions that the eyes constantly produce, eye gaze is not precise enough to operate UI widgets such as scrollbars, hyperlinks, and slider handles on today's GUI interfaces. At a 25-inch viewing distance to the screen, one degree of arc corresponds to 0.44 in, which is twice the size of a typical scroll bar and much greater than the size of a typical character.
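The 0.44 in figure follows from simple trigonometry; a quick check in Python:

import math

viewing_distance_in = 25.0
# Span subtended by one degree of visual angle at the given distance.
one_degree_span = 2 * viewing_distance_in * math.tan(math.radians(0.5))
print(round(one_degree_span, 2))  # 0.44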
Second, and perhaps more importantly, the eye, as one of our primary perceptual devices, has not evolved to be a control organ. Sometimes its movements are voluntarily controlled, while at other times they are driven by external events. With target selection by dwell time, considered more natural than selection by blinking [7], one has to be conscious of where one looks and how long one looks at an object. If one does not look at a target continuously for a set threshold (e.g., 200 ms), the target will not be selected. On the other hand, if one stares at an object for longer than the set threshold, the object will be selected regardless of the user's intention. In some cases a false target selection has no adverse effect; at other times it can be annoying and counter-productive (such as an unintended jump to a web page). Furthermore, dwell time can only substitute for one mouse click, while target activation often takes two steps: a single click selects the target (e.g., an application icon) and a double click (or a different physical button click) opens it (e.g., launches an application). To perform both steps with dwell time is even more difficult. In short, loading the visual perception channel with a motor control task seems fundamentally at odds with users' natural mental model, in which the eye searches for and takes in information while the hand produces output that manipulates external objects. Other than for disabled users, who have no alternative, using eye gaze for practical pointing does not appear to be very promising.
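The dwell-time mechanism itself is straightforward; the minimal sketch below, assuming a stream of (timestamp, target) gaze events from a hypothetical eye tracker, makes both the technique and its drawback visible: selection fires once the threshold elapses, whether or not the user intended it.

DWELL_THRESHOLD_S = 0.200  # the 200 ms threshold mentioned above

def dwell_select(gaze_events):
    # Yield a target each time gaze rests on it past the threshold.
    current, since, fired = None, 0.0, False
    for t, target in gaze_events:
        if target is not current:
            current, since, fired = target, t, False  # gaze moved
        elif target is not None and not fired and t - since >= DWELL_THRESHOLD_S:
            fired = True       # fire at most once per fixation
            yield target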
Are there interaction techniques that utilize eye movement to assist the control task but do not force the user to be overly conscious of his eye movement? We wanted to design a technique in which pointing and selection remain primarily a manual control task but are also aided by gaze tracking. Our key idea is to use gaze to dynamically redefine (warp) the "home" position of the pointing cursor to the vicinity of the target, which is presumably what the user is looking at, thereby effectively reducing the cursor-movement amplitude needed for target selection.
Once the cursor position has been redefined, the user need only make a small movement to, and click on, the target with a regular manual input device. In other words, we wanted to achieve Manual And Gaze Input Cascaded (MAGIC) pointing, or Manual Acquisition with Gaze-Initiated Cursor. There are many different ways of designing a MAGIC pointing technique. Critical to its effectiveness is the identification of the target the user intends to acquire. We have designed two MAGIC pointing techniques, one liberal and the other conservative in terms of target identification and cursor placement. The liberal approach is to warp the cursor to every new object the user looks at.
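A minimal sketch of the liberal policy, assuming hypothetical find_object_at and warp_cursor_to helpers standing in for a real windowing and eye-tracker API:

def liberal_magic_pointing(gaze_stream, find_object_at, warp_cursor_to):
    # Warp the cursor to every new object the user looks at; the user
    # then finishes acquisition with a short manual movement and a click.
    last_object = None
    for gaze_x, gaze_y in gaze_stream:
        obj = find_object_at(gaze_x, gaze_y)
        if obj is not None and obj is not last_object:
            warp_cursor_to(obj.center_x, obj.center_y)
            last_object = obj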



