From aHuman Wiki
Revision as of 19:06, 28 November 2018 by Admin (Talk | contribs) (Automated page entry using MWPush.pl)

Emotions Analysis



This page describes processes that can be used to analyze the emotional state of the person with whom aHuman is interacting. Recognizing the user's emotional state will help aHuman interact in a way that does not feel jarring to the user, and lets aHuman respond in the same emotional state as the end user.

Based on Facial Features

Facial features can tell a lot about the user, as the face is a vital indicator of a person's emotional state. Research has already been done on inferring a person's emotional state from facial features.

Research on this

We can make aHuman capable of recognizing the user's emotional state by analyzing facial features, and then aHuman can (using some intelligent mechanism, not yet decided) switch itself to the same emotional state so as to interact with the user more naturally.
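Since the actual mechanism is not yet decided, the idea can be sketched as a simple rule-based mapping from pre-extracted facial measurements to a coarse emotion label. Everything here (the feature names, thresholds, and labels) is a hypothetical illustration, not part of aHuman:

```python
def classify_emotion(mouth_curvature: float, brow_raise: float) -> str:
    """Map two simple facial measurements to a coarse emotion label.

    mouth_curvature: positive when the mouth corners turn up (smile),
                     negative when they turn down (frown).
    brow_raise:      positive when the eyebrows are raised (surprise),
                     negative when they are drawn down (anger).

    Thresholds are illustrative assumptions; a real system would learn
    them from labeled face data.
    """
    if mouth_curvature > 0.3:
        return "happy"
    if mouth_curvature < -0.3 and brow_raise < -0.2:
        return "angry"
    if mouth_curvature < -0.3:
        return "sad"
    if brow_raise > 0.4:
        return "surprised"
    return "neutral"


# Example: a raised mouth corner reads as "happy".
print(classify_emotion(0.5, 0.0))   # → happy
print(classify_emotion(-0.5, -0.5)) # → angry
```

In practice the two input measurements would come from a facial-landmark detector, and the hand-written thresholds would be replaced by a trained classifier; the sketch only shows the shape of the mapping from face measurements to an emotional state that aHuman could then mirror.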

Based on the Spoken Words


Based on Activity