Latest revision as of 19:06, 28 November 2018
Emotions Analysis
Introduction
This page outlines processes that can be used to analyze the emotional state of the person with whom aHuman is interacting. This will help aHuman respond in a way that does not feel strange to the user, interacting in the same emotional state as the end user.
Based on Facial Features
Facial features can tell a lot about the user, as the face is a vital indicator of a person's emotional state. Research has already been done on inferring a person's emotional state from facial features (see http://www.naun.org/journals/circuitssystemssignal/cssp-23.pdf).
We can make aHuman capable of recognizing the user's emotional state by analyzing facial features; aHuman can then (using some intelligent feature, not yet decided) switch itself into the same emotional state so as to interact better with the user.
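A minimal sketch of the idea above: reduce the face to a small feature vector and map it to the nearest known emotion. Everything here is hypothetical — the feature layout (mouth curvature, eye openness, brow raise) and the prototype values are illustrative placeholders; a real system would obtain these vectors from a facial-landmark detector rather than hard-coded numbers.

```python
import math

# Hypothetical prototype feature vectors, one per emotion.
# Layout (assumed, not from the page): [mouth_curvature, eye_openness, brow_raise]
PROTOTYPES = {
    "happy":     [0.9, 0.6, 0.5],   # upturned mouth, relaxed eyes
    "sad":       [-0.8, 0.4, 0.2],  # downturned mouth, lowered brows
    "surprised": [0.1, 0.9, 0.9],   # wide eyes, raised brows
    "neutral":   [0.0, 0.5, 0.1],   # neutral mouth and brows
}

def detect_emotion(features):
    """Return the emotion whose prototype is nearest (Euclidean) to the input."""
    return min(PROTOTYPES, key=lambda emotion: math.dist(PROTOTYPES[emotion], features))

print(detect_emotion([0.85, 0.55, 0.45]))  # closest to the "happy" prototype
```

Once the emotion is detected, aHuman's "switch to the same emotional state" step could simply select a matching response style keyed by the returned label; the mechanism for that is still undecided in the page above.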
Based on the Spoken Words
TBD
Based on Activity
TBD