Revision as of 09:11, 23 June 2015

Emotions Analysis


Introduction

This page highlights various processes that can be used to analyze the emotional state of the person with whom aHuman is interacting. This will help aHuman interact with the user in a way that does not feel unnatural, since aHuman can respond in the same emotional state as the end user.

Based on Facial Features

Facial features can reveal a great deal about the user, as the face is a vital indicator of a person's emotional state. Research has already been done on inferring a person's emotional state from facial features.

[http://www.naun.org/journals/circuitssystemssignal/cssp-23.pdf Research on this]

We can make aHuman capable of recognizing the user's emotional state by analyzing facial features; aHuman can then (using some intelligent mechanism, not yet decided) switch itself to the same emotional state so as to better interact with the user.
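As a minimal sketch of this idea, the emotion-matching step could be framed as classifying numeric facial measurements against per-emotion reference points. All feature names, emotions, and centroid values below are illustrative assumptions, not taken from the cited paper:

```python
import math

# Illustrative (hypothetical) centroids: (mouth_curvature, eyebrow_raise)
# per emotion, with values normalized roughly to [-1, 1].
EMOTION_CENTROIDS = {
    "happy": (0.8, 0.2),
    "sad": (-0.6, -0.3),
    "surprised": (0.1, 0.9),
    "neutral": (0.0, 0.0),
}

def classify_emotion(mouth_curvature, eyebrow_raise):
    """Return the emotion whose centroid is closest to the measured features."""
    point = (mouth_curvature, eyebrow_raise)
    return min(
        EMOTION_CENTROIDS,
        key=lambda emotion: math.dist(point, EMOTION_CENTROIDS[emotion]),
    )

print(classify_emotion(0.7, 0.1))  # a strong smile -> "happy"
```

In a real system, the two input numbers would come from a face-landmark detector rather than being supplied by hand, and aHuman would use the classified label to select its own response state.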

Based on the Spoken Words

TBD

Based on Activity

TBD