
Abstract—Emotions and facial expressions play an important role in communication and social interaction with other human beings, delivering rich information about a person's mood. The “BLUE EYES TECHNOLOGY” aims to create computational mechanisms that have perceptual and sensory abilities like those of human beings, enabling the computer to gather facts about humans and interact with them. This paper implements the detection of emotions (happy, sad, fear, surprised, disgust and anger) by taking into account human eye expressions and by using an emotion mouse. The emotion mouse obtains physiological data and the emotional state of a person through a single touch of a mouse equipped with different sensors. Emotions are also determined from human eye expressions, in which the eye region from a video sequence is analyzed. From the frames of the video stream, the human eyes are extracted using an edge operator and then classified using a Support Vector Machine (SVM) classifier. After classification, a standard learning tool, the Hidden Markov Model (HMM), is used to recognize the emotions from the human eye expressions. After an emotion is detected, a suitable audio track is played.

 

Keywords- Blue eyes, emotion mouse, emotion
recognition, eye expressions, Support
Vector Machine (SVM), Hidden
Markov Model (HMM).


                                                                                                                                                   
I. INTRODUCTION

The “BLUE EYES” technology aims at creating computational machines with perceptual abilities that help them verify a human's identity, feel their presence, and interact with them. Human recognition depends primarily on the ability to observe, understand, and integrate audio, visual, and sensory information. Blue eyes technology enables a computer to sense and understand a user's feelings and behavior, and to respond according to the detected emotional level. The chief aim of blue eyes technology is to give human abilities or power to a computer, so that the machine can interact with human beings as naturally as humans interact among themselves.

The methodologies proposed in this paper to detect human emotions are the emotion mouse and emotion recognition from human eye expressions. The emotion mouse is an input device designed so that it can track the emotions of a user through a simple touch. It is used to evaluate and identify the user's emotions (such as happy, sad, anger, fear, disgust, surprised, etc.) while the user is interacting with the computer.

 

Human emotion recognition is an important component of efficient man-machine interaction. It plays a critical role in communication by allowing people to express themselves beyond the verbal domain. Analysis of emotions from human eye expressions involves the detection and categorization of various human emotions or states of mind. For example, in security and surveillance, an offender's or criminal's behavior can be predicted by analyzing images of their face from the frames of a video sequence. The analysis of human emotions can be applied in a variety of application domains, such as video surveillance and human-computer interaction systems. In some cases, the results of such analysis can be applied to identify and categorize the various human emotions automatically from the videos.

 

                                                                                                                                                 
II. RELATED WORK

Many approaches to blue eyes technology and human emotion recognition have been proposed in the last two decades.

Mizna Rehman et al. [1] present a technique that identifies human emotions (happy, surprise, sad or excited) using image processing, by extracting only the eye portion from the captured image and comparing it with images already stored in a database. The paper reports two findings about this emotional sensory world. First, observation reveals that different eye colors and their intensities correspond to changes in emotion, without giving any information on the shape of the eye or the actual detected emotion. Second, the technique successfully recognizes four different emotions from the eyes.

S.R. Vinotha et al. [2] use a feature extraction technique to extract the eyes, a support vector machine (SVM) classifier and an HMM to build a human emotion recognition system. The proposed system analyzes the human eye region from video sequences. From the frames of the video stream, the human eyes are extracted using the well-known Canny edge operator and classified using a non-linear Support Vector Machine (SVM) classifier. Finally, a standard learning tool, the Hidden Markov Model (HMM), is used to recognize the emotions from the human eye expressions.

Mohammad Soleymani et al. [3] present a methodology for instantaneous detection of a user's emotions from facial expressions and electroencephalogram (EEG) signals. A set of videos with different emotional content was shown to a group of people, and their physiological responses and facial expressions were recorded. Five annotators annotated the valence (from negative to positive) in the videos of the users' faces. Continuous annotations of valence and arousal were also collected for the stimulus videos. Continuous Conditional Random Fields (CCRF) and long short-term memory recurrent neural networks (LSTM-RNN) were used to detect emotions continuously and automatically. Analysis of the interference of facial muscle activities on the EEG signals shows that most of the emotionally valuable content in the EEG features is a result of this interference. However, statistical analysis showed that EEG signals carry complementary information in the presence of facial expressions.

T. Moriyama et al. [4] present a system capable of giving a detailed analysis of eye region images in terms of the position of the iris, the angle of eyelid opening, and the texture, shape and complexity of the eyelids. The system uses an eye region model that parameterizes the motion and fine structure of an eye. The structural factors represent the structural individuality of the eye, including the color and size of the iris, the complexity, boldness and width of the eyelids, the width of the illumination reflection on the bulge, and the width of the bulge below the eye. The motion factors represent the movement of the eye, including the 2D position of the iris and the up-down motion and position of the upper and lower eyelids.

Renu Nagpal et al. [5] present the world's first publicly available dataset of labeled data recorded over the Internet of people naturally viewing online media. The AM-FED contains: 1) more than 200 webcam videos recorded in real-world conditions; 2) more than 150,000 frames labeled for the presence of 10 symmetrical FACS action units, 4 asymmetric (unilateral) FACS action units, 2 head movements, smile, general expressiveness, feature tracker failures and gender; 3) locations of 22 automatically detected landmark points; 4) baseline performance of detection algorithms on this dataset, along with baseline classifier outputs for smile; and 5) self-report responses of familiarity with, liking of, and desire to watch again the stimulus videos. This represents a rich and extensively coded resource for researchers working in the domains of facial expression recognition, affective computing, psychology and marketing. The videos in this dataset were recorded in real-world conditions. In particular, they exhibit non-uniform frame rates and non-uniform lighting. The camera position relative to the viewer varies from video to video, and in some cases the screen of the laptop is the only source of illumination. The videos contain viewers from a range of ages and backgrounds, some with glasses and facial hair. The dataset contains a large number of frames with agreed-upon presence of facial action units and other labels.

 

                                                                                                                                       
III. METHODOLOGY USED

A. Emotion Recognition From Human Eyes

Facial expressions play a vital role in communication and social interaction with other human beings, conveying information about their emotions. The most crucial feature of human interaction that grants naturalism to the process is our ability to infer the emotional states of others. Our goal is to classify the different human emotions from their eye expressions. The proposed system is a human emotion recognition system that analyzes the human eye region from video sequences. From the frames of the video stream, the human eyes are extracted using the well-known Canny edge operator and classified using a non-linear Support Vector Machine (SVM) classifier. Finally, a standard learning tool, the Hidden Markov Model (HMM), is used to recognize the emotions from the human eye expressions.

 

    

        
Fig. 1: Sample eye expressions: surprised, sad, happy, anger, fear, disgust

Human emotion recognition is an important component of efficient human-computer interaction. It plays a critical role in communication, allowing people to express themselves beyond the verbal domain. Analysis of emotions from human eye expressions involves the detection and categorization of various human emotions and states of mind. The analysis of human emotions can be applied in a variety of application domains, such as video surveillance and human-computer interaction systems. In some cases, the results of such analysis can be applied to identify and categorize the various human emotions automatically from the videos. The six primary types of emotions, shown in Fig. 1, are: surprised, sad, happy, anger, fear and disgust. Our method uses a feature extraction technique to extract the eyes, a support vector machine (SVM) classifier and an HMM to build a human emotion recognition system.
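Since the paper leaves the HMM unspecified, the sketch below makes its own assumptions explicit: the per-frame SVM labels serve as the HMM's observations, the hidden state is the underlying emotion, and the transition and emission probabilities are illustrative placeholders that a real system would estimate from labeled training sequences (e.g., with Baum-Welch). Viterbi decoding then smooths a noisy per-frame label sequence into a stable emotion sequence.

```python
import numpy as np

EMOTIONS = ["surprised", "sad", "happy", "anger", "fear", "disgust"]
N = len(EMOTIONS)

# Illustrative placeholder parameters, not estimated from real data.
start_p = np.full(N, 1.0 / N)                       # uniform prior over emotions
trans_p = np.full((N, N), 0.02) + np.eye(N) * 0.88  # emotions tend to persist
emit_p  = np.full((N, N), 0.04) + np.eye(N) * 0.76  # per-frame SVM label is usually correct

def viterbi(obs):
    """Decode the most likely hidden emotion sequence from a list of
    per-frame SVM class indices, using the Viterbi algorithm."""
    T = len(obs)
    logv = np.log(start_p) + np.log(emit_p[:, obs[0]])
    back = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        scores = logv[:, None] + np.log(trans_p)  # scores[i, j]: state i -> state j
        back[t] = scores.argmax(axis=0)
        logv = scores.max(axis=0) + np.log(emit_p[:, obs[t]])
    states = [int(logv.argmax())]
    for t in range(T - 1, 0, -1):
        states.append(int(back[t, states[-1]]))
    return [EMOTIONS[i] for i in reversed(states)]

# Noisy per-frame SVM outputs, mostly "happy" (index 2): the decoded
# sequence suppresses the isolated misclassifications.
print(viterbi([2, 2, 0, 2, 2, 2, 5, 2, 2]))
```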

Fig. 2: Methodology of emotion recognition from human eye expression
The methodology of emotion recognition from human eye expression is shown in Fig. 2. In this methodology, an image of the user sitting in front of the camera is captured. The image, representing a set of frames, is then preprocessed to obtain a noise-free image. The noise-free image is edge detected using the Canny edge operator. Using the feature extraction process, the eye regions are extracted from the resulting edge-detected image. The extracted eye regions are classified using the SVM classifier. Finally, the corresponding emotions are recognized.

 

B. Emotion Mouse

One proposed, non-invasive method for gaining user information through touch is via a computer input device, the mouse. This allows the user's cardiac rhythm, body temperature and other physiological attributes to be related to their mood.


Fig. 3: Block Diagram
of Emotion Mouse

 

The block diagram of the emotion mouse is shown in Fig. 3. This device measures heart rate and temperature and matches them with six emotional states: happiness, surprise, anger, fear, sadness and disgust. The mouse includes a set of sensors, including infrared detectors and temperature-sensitive chips. These components can also be crafted into other commonly used items such as an office chair, a steering wheel, a keyboard or a phone handset. Integrating the system into the steering wheel, for instance, could allow an alert to be sounded when a driver becomes drowsy.

Heart rate is measured by an infrared (IR) sensor on the thumb, and temperature is measured using a thermistor chip. These values are input into a series of discriminant function analyses and correlated to an emotional state. Specifically, for the mouse, discriminant function analysis is used in accordance with basic principles to determine a baseline relationship, that is, the relationship between each set of calibration physiological signals and the associated emotion.
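A minimal sketch of this calibration step is shown below, using scikit-learn's linear discriminant analysis as the discriminant function. The calibration readings are illustrative placeholders, not measured data; a real deployment would record them per user during a calibration phase.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

EMOTIONS = ["happiness", "surprise", "anger", "fear", "sadness", "disgust"]

# Illustrative placeholder calibration data: (heart rate in bpm,
# skin temperature in degrees C) recorded while each emotion was induced.
X_calib = np.array([
    [72, 33.1], [75, 33.4],   # happiness
    [88, 33.0], [90, 33.2],   # surprise
    [95, 34.0], [98, 34.3],   # anger
    [92, 31.8], [96, 31.5],   # fear
    [65, 32.2], [63, 32.0],   # sadness
    [70, 31.9], [68, 31.7],   # disgust
])
y_calib = np.repeat(np.arange(len(EMOTIONS)), 2)

# Fit the discriminant functions on the calibration baseline.
lda = LinearDiscriminantAnalysis().fit(X_calib, y_calib)

# Classify a new (heart rate, temperature) reading from the mouse.
reading = np.array([[93.0, 31.7]])
print(EMOTIONS[int(lda.predict(reading)[0])])
```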

 

                                                                                                                                                 
IV. SYSTEM MODEL

In this system, two methodologies are used, namely the emotion mouse and emotion recognition from eye expressions. The emotion mouse considers physiological and biological parameters such as cardiac rhythm and body temperature, whereas emotion recognition from human eye expressions considers facial expression for the detection of human emotion and mood.

 

Fig. 4: Block diagram
of the system

Fig. 4 shows the block diagram of the system. In this system, the data from the heartbeat sensor and temperature sensor of the emotion mouse is given to the microcontroller. The output of the microcontroller is then fed to the computer, where the heartbeat and temperature values are compared with the standard range of each emotion and the matching emotion is selected. In parallel, a webcam connected to the computer captures the image of the person from a video sequence and recognizes the emotion by detecting the eye region. The captured eye section is compared with the images stored in the database to detect the person's mood. After the mood is detected, music or an audio command is played according to the detected mood.
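The comparison step can be sketched as a simple range lookup. The per-emotion ranges and file paths below are illustrative placeholders, not values from the paper:

```python
from typing import Optional

# Illustrative placeholder ranges: (min bpm, max bpm, min temp C, max temp C).
EMOTION_RANGES = {
    "happiness": (68, 80, 32.8, 33.8),
    "surprise":  (85, 92, 32.8, 33.6),
    "anger":     (93, 105, 33.8, 34.8),
    "fear":      (90, 100, 31.2, 32.0),
    "sadness":   (60, 68, 31.8, 32.6),
    "disgust":   (66, 74, 31.4, 32.2),
}

AUDIO_TRACKS = {emotion: f"tracks/{emotion}.wav" for emotion in EMOTION_RANGES}

def match_emotion(bpm: float, temp_c: float) -> Optional[str]:
    """Return the first emotion whose standard range contains both readings."""
    for emotion, (lo_b, hi_b, lo_t, hi_t) in EMOTION_RANGES.items():
        if lo_b <= bpm <= hi_b and lo_t <= temp_c <= hi_t:
            return emotion
    return None  # no range matched; fall back to the eye-expression result

emotion = match_emotion(95.0, 31.6)   # example sensor reading
if emotion is not None:
    print(f"Detected {emotion}; would play {AUDIO_TRACKS[emotion]}")
```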

 

                                                                                                                                                              
V. RESULT

The proposed system yields two results from the above methodologies. First, different eye expressions of different people are taken into consideration through edge detection of the eyes. Each eye expression is then categorized into a given set of emotions (happy, sad, fear, surprised, disgust, anger) so as to obtain a single standard expression for each emotion. The emotion of a person can thus be detected by comparing the person's eye expression with the standard eye expression of each emotion. Second, the values of the heartbeat sensor and temperature sensor are compared with the standard value range of each emotion, and the emotion whose value range matches the user's data values is taken as the emotional state of the user. According to the detected emotion, the music or audio command is played.

 

                                                                                                                                                     
VI. CONCLUSION

Recent research indicates that the understanding and recognition of emotional expressions play a very important role in the maintenance and development of social relationships. This paper presents an approach to creating computational machines that have perceptual and sensory abilities like those of human beings, enabling the computer to gather information about the user through techniques such as facial expression recognition and the measurement of biological factors such as cardiac rhythm and body temperature. This makes it possible for computers and machines to detect the emotion of a human and respond to it.