KISMET: Robot with Facial Expression

1. INTRODUCTION
1.1 The Hardware Design

Kismet is an expressive robotic creature with perceptual and motor modalities tailored to natural human communication channels. To facilitate a natural infant-caretaker interaction, the robot is equipped with visual, auditory, and proprioceptive sensory inputs [1], as shown in figure 1.1.
Figure 1.1: An Expressive Robot
Our hardware and software control architectures have been designed to meet the challenge of real-time processing of visual signals (approaching 30 Hz) and auditory signals (8 kHz sample rate and frame windows of 10 ms) with minimal latencies (less than 500 ms). The high-level perception system, the motivation system, the behavior system, the motor skill system, and the face motor system execute on four Motorola 68332 microprocessors running L, a multi-threaded Lisp developed in our lab. Vision processing, visual attention, and eye/neck control are performed by nine networked 400 MHz PCs running QNX (a real-time Unix operating system) [2]. Expressive speech synthesis and vocal affective intent recognition run on a dual 450 MHz PC running NT, and the speech recognition system runs on a 500 MHz PC running Linux, as in figure 1.2.
Figure 1.2: Hardware Design
1.2 Vision System
Kismet has three degrees of freedom to control gaze direction and three degrees of freedom to control its neck. The degrees of freedom are driven by Maxon DC servo motors with high-resolution optical encoders for accurate position control. This gives the robot the ability to move and orient its eyes much as a human does, engaging in a variety of human visual behaviors. This is advantageous not only from a visual processing perspective; humans also attribute communicative value to these eye movements, as shown in figure 1.3 (a) and (b).
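The encoder-based position control described above can be sketched as a simple PD servo loop. This is an illustrative simulation, not Kismet's actual controller; the gains, loop rate, and encoder target are all assumptions.

```python
# A minimal PD position loop for one gaze/neck joint (illustrative values).
def pd_step(target, position, velocity, kp=100.0, kd=20.0):
    """One PD update: acceleration from position error minus velocity damping."""
    return kp * (target - position) - kd * velocity

# Simulate the joint settling toward an assumed encoder target of 1000 counts.
pos, vel = 0.0, 0.0
dt = 0.001                      # 1 kHz control loop (assumed)
for _ in range(5000):           # 5 seconds of simulated time
    accel = pd_step(1000.0, pos, vel)
    vel += accel * dt
    pos += vel * dt
print(round(pos))               # settles at the 1000-count target
```

With these gains the closed loop is critically damped, so the joint reaches the target without overshoot, which matters for lifelike eye movements.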
Figure 1.3: Vision System, panels (a) and (b)
2. THE FRAMEWORK
Kismet is an autonomous robot designed for social interactions with humans. Social robotics has generally concentrated on groups of robots performing behaviors such as flocking, foraging, or dispersion, or on paired robot-robot interactions such as imitation. Kismet's approach, by contrast, is inspired by the way infants learn to communicate with adults. Specifically, the mode of social interaction is that of a caretaker-infant dyad, where a human acts as the caretaker for the robot.
Here is a simplified view of Kismet's design.
Figure 2.1: System Architecture
The system architecture consists of six subsystems, as in figure 2.1: the low-level feature extraction system, the high-level perception system, the attention system, the motivation system, the behavior system, and the motor system. The low-level feature extraction system extracts sensor-based features from the world, and the high-level perceptual system encapsulates these features into percepts that can influence behavior, motivation, and motor processes. The attention system determines which stimulus in the environment is the most salient and relevant at any time, so that the robot can organize its behavior around it.
The robot has many behaviors in its repertoire, and several motivations to satiate, so its goals vary over time. The motor system carries out these goals by orchestrating the output modalities (actuator or vocal) to achieve them. For Kismet, these actions are realized as motor skills that accomplish the task physically, or expressive motor acts that accomplish the task via social signals.
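The flow through the subsystems described above can be sketched as a minimal pipeline. All function names, feature names, and drive names here are illustrative placeholders, not Kismet's actual interfaces.

```python
def low_level_features(raw):
    """Extract cheap sensor-based features (names assumed) from raw input."""
    return {"color": raw.get("color", 0.0), "motion": raw.get("motion", 0.0)}

def attend(features):
    """Pick the strongest feature as the locus of attention."""
    return max(features, key=features.get)

def perceive(features, locus):
    """Encapsulate the attended feature into a behaviorally relevant percept."""
    return {"stimulus": locus, "strength": features[locus]}

def behave(percept, drives):
    """Select a behavior serving the most pressing drive for this percept."""
    most_pressing = max(drives, key=drives.get)
    return f"engage-{percept['stimulus']}-to-satiate-{most_pressing}"

raw = {"color": 0.9, "motion": 0.2}
features = low_level_features(raw)
locus = attend(features)
percept = perceive(features, locus)
action = behave(percept, {"social": 0.7, "stimulation": 0.3})
print(action)  # engage-color-to-satiate-social
```

The point of the sketch is the ordering: features feed attention, attention selects what is perceived, and the motivation state biases which behavior the motor system is asked to carry out.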
2.1 Low-Level Feature Extraction System
The low-level feature extraction system is responsible for processing the raw sensory information into quantities that have behavioral significance for the robot. The routines are designed to be cheap, fast, and just adequate. Of particular interest are those perceptual cues that infants seem to rely on. For instance, visual and auditory cues such as detecting eyes and the recognition of vocal affect are important for infants.
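As an example of a cheap, fast, just-adequate routine of the kind described, frame differencing gives a crude motion cue. This is an illustrative sketch; the text does not specify Kismet's actual routines, and the threshold and frame layout are assumptions.

```python
# Frame differencing over flat lists of pixel intensities (illustrative).
def motion_energy(prev_frame, frame, threshold=10):
    """Fraction of pixels whose intensity changed by more than `threshold`."""
    changed = sum(
        1 for p, q in zip(prev_frame, frame) if abs(p - q) > threshold
    )
    return changed / len(frame)

prev = [0] * 100
curr = [0] * 75 + [50] * 25    # a bright object enters a quarter of the frame
print(motion_energy(prev, curr))  # 0.25
```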
2.2 Attention System
The low-level visual percepts are sent to the attention system. The purpose of the attention system is to pick out low-level perceptual stimuli that are particularly salient or relevant at that time, and to direct the robot's attention and gaze toward them. This provides the robot with a locus of attention that it can use to organize its behavior. A perceptual stimulus may be salient for several reasons. It may capture the robot's attention because of its sudden appearance, or perhaps because of its sudden change. It may stand out because of its inherent saliency, as a red ball stands out from the background. Or its quality may have special behavioral significance for the robot, such as being a typical indication of danger.
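One common way to realize such an attention system is to combine per-feature saliency maps with adjustable gains and attend to the winning location. This is a minimal sketch of that idea; the map names and gain values are assumptions, not Kismet's actual parameters.

```python
def salience(maps, gains):
    """Combine feature maps with gains; return the winning cell and the map."""
    size = len(next(iter(maps.values())))
    combined = [
        sum(gains[name] * maps[name][i] for name in maps)
        for i in range(size)
    ]
    return combined.index(max(combined)), combined

# Two 4-cell feature maps: color pops out at cell 1, motion at cell 3.
maps = {"color": [0.1, 0.9, 0.1, 0.2], "motion": [0.0, 0.1, 0.0, 0.8]}
# Weighting color more heavily (e.g., while seeking a toy) shifts attention.
locus, _ = salience(maps, {"color": 1.0, "motion": 0.5})
print(locus)  # 1
```

Raising the motion gain instead would move the locus to the moving stimulus at cell 3, which is how behavior can bias what the robot looks at.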
2.3 Perceptual System
The low-level features corresponding to the target stimuli of the attention system are fed into the perceptual system, where they are encapsulated into behaviorally relevant percepts. So that processes in these systems can be elicited by the environment, each behavior and emotive response has an associated releaser. As conceptualized by Tinbergen and Lorenz, a releaser can be viewed as a collection of feature detectors that are minimally necessary to identify a particular object or event of behavioral significance. The function of the releasers is to ascertain whether all environmental (perceptual) conditions are right for the response to become active.
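The releaser concept above can be sketched as a conjunction of feature detectors: the releaser fires only when every minimally necessary condition holds. The detector predicates and the "greet" response here are hypothetical examples.

```python
# A releaser as a conjunction of minimal feature detectors
# (after the Tinbergen/Lorenz framing; predicate names are illustrative).
def make_releaser(*detectors):
    """The releaser fires only when every associated detector is satisfied."""
    def releaser(percepts):
        return all(detector(percepts) for detector in detectors)
    return releaser

def face_present(p):
    return p.get("face", 0.0) > 0.5

def close_enough(p):
    return p.get("distance", 10.0) < 1.0

greet_releaser = make_releaser(face_present, close_enough)
print(greet_releaser({"face": 0.9, "distance": 0.5}))  # True
print(greet_releaser({"face": 0.9, "distance": 3.0}))  # False
```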
2.4 Motivation System
The motivation system consists of the robot's basic ``drives'' and ``emotions''. The ``drives'' represent the robot's basic ``needs'' and are modeled as simple homeostatic regulation mechanisms. When the robot's needs are being adequately met, the intensity level of each ``drive'' stays within a desired regime. As the intensity level moves farther from the homeostatic regime, the robot becomes more strongly motivated to engage in behaviors that restore that ``drive''. Hence the ``drives'' largely establish the robot's own agenda, and play a significant role in determining which behavior(s) the robot activates at any one time.

The ``emotions'' are modeled from a functional perspective. Based on simple appraisals of the benefit or detriment of a given stimulus, the robot evokes positive emotive responses that serve to bring it closer to the stimulus, or negative emotive responses in order to withdraw from it. There is a distinct emotive response for each class of eliciting conditions. Currently, six basic emotions are modeled, giving the robot synthetic analogs of anger, disgust, fear, joy, sorrow, and surprise (after Ekman). Arousal-based responses corresponding to interest, calm, and boredom are modeled in a similar way. The expression of emotive responses promotes empathy from the caregiver and plays an important role in regulating social interaction with the human.
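The homeostatic ``drive'' mechanism described above can be sketched as follows: intensity drifts away from the desired regime unless satiated by interaction, and the motivation to act grows with the distance from that regime. All constants here are illustrative assumptions.

```python
# A drive as a homeostatic regulator (regime bounds and drift rate assumed).
class Drive:
    def __init__(self, low=-0.2, high=0.2, drift=0.05):
        self.intensity = 0.0
        self.low, self.high, self.drift = low, high, drift

    def step(self, satiation=0.0):
        """Intensity drifts upward each tick unless satiated by a stimulus."""
        self.intensity += self.drift - satiation
        return self.intensity

    def urgency(self):
        """Zero inside the homeostatic regime, growing outside it."""
        if self.low <= self.intensity <= self.high:
            return 0.0
        return min(abs(self.intensity - self.low),
                   abs(self.intensity - self.high))

social = Drive()
for _ in range(10):                 # ten ticks with no interaction
    social.step()
print(round(social.intensity, 2))   # 0.5 -- outside the regime
print(round(social.urgency(), 2))   # 0.3 -- motivation to seek interaction
```

A behavior that satiates the drive (a positive `satiation` each tick) would pull the intensity back into the regime and the urgency back to zero, which is the regulation loop the text describes.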