
Speech Emotion Recognition for Affective Human Robot Interaction

Abstract
We evaluate the performance of a speech emotion recognition method for affective human-robot interaction. In the proposed method, emotion is classified into six classes: angry, bored, happy, neutral, sad, and surprised. After applying noise reduction and speech detection, we obtain a feature vector for an utterance from statistics of phonetic and prosodic information. The phonetic information includes log energy, shimmer, formant frequencies, and Teager energy; the prosodic information includes pitch, jitter, and rate of speech. A pattern classifier based on Gaussian-kernel support vector machines then decides the emotion class of the utterance. To simulate a human-robot interaction situation, we record speech commands and dialogs uttered 2 m away from a microphone. Experimental results show that the proposed method achieves a classification accuracy of 58.6%, compared with 60.4% for human listeners, when the reference labels are given by the speakers' intentions. When the reference labels are instead given by the listeners' majority decision, the proposed method shows a classification accuracy of 51.2%.
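One of the less common base features named above is the Teager energy, computed frame by frame with the Teager energy operator ψ(x[n]) = x[n]² − x[n−1]·x[n+1]. The paper does not give its implementation details, but the operator itself is standard; a minimal sketch in NumPy:

```python
import numpy as np

def teager_energy(x):
    """Teager energy operator: psi(x[n]) = x[n]^2 - x[n-1]*x[n+1].

    Returns an array two samples shorter than the input, since the
    operator needs one neighbor on each side of every sample.
    """
    x = np.asarray(x, dtype=float)
    return x[1:-1] ** 2 - x[:-2] * x[2:]

# For a pure sinusoid A*cos(omega*n), the operator yields the constant
# A^2 * sin(omega)^2, which tracks both amplitude and frequency at once.
signal = np.cos(0.1 * np.arange(100))
te = teager_energy(signal)
```

Per-frame statistics of this quantity (mean, variance, and so on) would then be pooled into the utterance-level feature vector alongside the other phonetic features.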

Presented By:
Kwang-Dong Jang and Oh-Wook Kwon Department of Control and Instrumentation Engineering Chungbuk National University, Korea {kdjang,owkwon}[at]chungbuk.ac.kr

1. Introduction
A human conveys emotion as well as linguistic information via speech signals. The emotion in speech makes verbal communication natural, emphasizes a speaker's intention, and shows one's psychological state. Recently there has been much research on affective human-robot interaction with humanoid robots that recognize emotion expressed through facial images and speech. In particular, speech emotion recognition requires less hardware and computational complexity than facial emotion recognition. A speech emotion recognizer can be used in an interactive intelligent robot that responds appropriately to a user's command according to the user's emotional state. It can also be embedded in a music player that suggests a music list suited to the user's emotional state. Emotion can be recognized by using acoustic information and/or linguistic information. Emotion recognition from linguistic information is done by spotting exclamatory words in input utterances and thus cannot be used when there are no exclamatory words. Acoustic information extracted from speech signals, however, is more flexible for emotion recognition than linguistic information because it does not require a speech recognition system to spot exclamatory words and can be extended to any other language. Among the many features suggested for speech emotion recognition, we select the following acoustic information: pitch, energy, formants, tempo, duration, jitter, shimmer, mel-frequency cepstral coefficients (MFCC), linear predictive coding (LPC) coefficients, and Teager energy. A pattern classifier based on support vector machines (SVM) classifies the emotion by using the feature vector obtained from statistics of the acoustic information. We compare the performance of automatic emotion recognition when the reference labels are given by speakers and by human listeners.
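The overall recipe described above — pool frame-level acoustic contours into fixed-length utterance statistics, then classify with a Gaussian-kernel SVM — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the statistics shown cover only a subset of the listed features, and the scikit-learn `SVC` with an RBF kernel stands in for the paper's Gaussian SVM, whose exact parameters are not given here.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["angry", "bored", "happy", "neutral", "sad", "surprised"]

def utterance_features(frame_pitch, frame_energy):
    """Collapse frame-level pitch and energy contours into one
    fixed-length statistics vector (illustrative subset of features)."""
    stats = []
    for contour in (frame_pitch, frame_energy):
        c = np.asarray(contour, dtype=float)
        stats += [c.mean(), c.std(), c.min(), c.max(),
                  np.diff(c).mean()]  # mean delta as a crude dynamics cue
    return np.array(stats)

# Gaussian (RBF-kernel) SVM over standardized utterance feature vectors.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
```

In use, `clf.fit(X, y)` would be called with one feature vector per training utterance and its emotion label from `EMOTIONS`, after which `clf.predict` assigns a class to each new utterance.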
This paper is organized as follows: Section 2 explains the base features extracted from speech and the pattern classifier. Section 3 describes the experimental results when the reference labels are supplied by human listeners and speakers. Section 4 concludes the paper.


For the full report, please see http://eurasipProceedings/Ext/SPECOM2006/papers/077.pdf