Human-like Emotions and Augmented Dynamics

H.E.A.D is a project to develop an anthropomorphic robot head based on a Bayesian control architecture.

H.E.A.D has five sub-projects:
  1. Development of keyboard-based NLP interaction -- CHEE
  2. Emotion recognition from image processing
  3. Emotion recognition from voice: content and intonation
  4. Development of a module that tracks the history of a human's interaction with HEAD -- Mood
  5. Development of a Bayesian meta-architecture that ties all of the above together -- Emotion Cloud
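As a rough illustration of what the Bayesian meta-architecture could do, the sketch below fuses emotion estimates from independent modules with a naive Bayes update. The module names, emotion labels, and probability values are purely hypothetical, not taken from the project.

```python
# Hypothetical sketch: fusing per-modality emotion estimates with a
# naive Bayes update, assuming the modules report independent likelihoods.
EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse(prior, likelihoods):
    """Combine a prior over emotions with per-module likelihoods.

    prior: dict emotion -> P(emotion)
    likelihoods: list of dicts, each emotion -> P(observation | emotion)
    Returns the normalised posterior as a dict.
    """
    posterior = dict(prior)
    for lik in likelihoods:
        for e in EMOTIONS:
            posterior[e] *= lik[e]
    total = sum(posterior.values())
    return {e: p / total for e, p in posterior.items()}

# Illustrative readings from an image module and a voice module
prior = {e: 0.25 for e in EMOTIONS}
image = {"happy": 0.6, "sad": 0.1, "angry": 0.1, "neutral": 0.2}
voice = {"happy": 0.5, "sad": 0.2, "angry": 0.1, "neutral": 0.2}

posterior = fuse(prior, [image, voice])
```

Each module only has to report a likelihood over the shared emotion set, so new modalities can be added without changing the fusion step.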
HEAD's recognition tasks will be routed through classifiers, and we plan to use support vector machines for this.
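For a flavour of the classifier stage, here is a minimal, self-contained linear SVM trained with the Pegasos subgradient method on toy 2-D data. The features and labels are placeholders; in the project the inputs would be features extracted from images or voice.

```python
import random

def train_linear_svm(data, labels, lam=0.01, epochs=200, seed=0):
    """Train a linear SVM (no bias term) with Pegasos subgradient descent.

    data: list of feature vectors, labels: +1 or -1 per vector.
    """
    rng = random.Random(seed)
    dim = len(data[0])
    w = [0.0] * dim
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(data)), len(data)):
            t += 1
            eta = 1.0 / (lam * t)  # decaying step size
            x, y = data[i], labels[i]
            margin = y * sum(wj * xj for wj, xj in zip(w, x))
            if margin < 1:
                # Hinge loss active: shrink w and step toward y * x
                w = [(1 - eta * lam) * wj + eta * y * xj
                     for wj, xj in zip(w, x)]
            else:
                # Only the regulariser contributes
                w = [(1 - eta * lam) * wj for wj in w]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# Toy separable clusters standing in for extracted feature vectors
pos = [[2.0, 2.0], [3.0, 2.5], [2.5, 3.0]]
neg = [[-2.0, -2.0], [-3.0, -2.5], [-2.5, -3.0]]
data = pos + neg
labels = [1, 1, 1, -1, -1, -1]
w = train_linear_svm(data, labels)
```

In practice a library implementation with kernels and a bias term would replace this sketch; the point is only the shape of the training loop.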


The HEAD team,

