Revision as of 01:26, 16 June 2015

Emotions Analysis



Introduction

This page highlights various processes that can be used to analyze the emotional state of the person with whom aHuman is interacting. This will help aHuman interact with the user in a way that does not feel strange, by responding in the same emotional state as the user.

Based on Facial Features

Facial features can tell a lot about the user, as the face is a vital indicator of a person's emotional state. Research has already been done on determining a person's emotional state from facial features.


We can make aHuman capable of recognizing the user's emotional state by analyzing facial features; aHuman can then (using some intelligent mechanism, not yet decided) switch itself into the same emotional state so as to interact with the user more naturally.
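The idea above could be sketched as a simple rule-based classifier. This is only an illustration: the feature names, thresholds, and emotion labels below are invented assumptions, not part of aHuman; a real system would learn such a mapping from a trained model over detected facial landmarks.

```python
# Hypothetical sketch: rule-based emotion classification from normalized
# facial measurements (values in 0.0-1.0). All names and thresholds here
# are illustrative assumptions, not an actual aHuman component.

def classify_emotion(features):
    """Map normalized facial measurements to a coarse emotion label."""
    mouth_curve = features.get("mouth_curvature", 0.5)  # > 0.5 means upturned
    brow_raise = features.get("brow_raise", 0.5)        # > 0.5 means raised brows
    eye_open = features.get("eye_openness", 0.5)        # > 0.5 means wide open

    if mouth_curve > 0.7:
        return "happy"
    if brow_raise > 0.7 and eye_open > 0.7:
        return "surprised"
    if mouth_curve < 0.3 and brow_raise < 0.3:
        return "sad"
    return "neutral"

# aHuman could then mirror the detected state when generating a response.
user_state = classify_emotion({"mouth_curvature": 0.8, "brow_raise": 0.4})
print(user_state)  # happy
```

Mirroring would then amount to feeding `user_state` into whatever response-generation component aHuman uses, so that replies are produced in the same emotional register as the user.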

Based on Spoken Words

TBD

Based on Activity

TBD