This paper is an extended version of our paper published in the 8th Computer Science and Electronic Engineering Conference (CEEC).
Version 1
: Received: 31 May 2017 / Approved: 1 June 2017 / Online: 1 June 2017 (06:14:53 CEST)
Version 2
: Received: 7 July 2017 / Approved: 10 July 2017 / Online: 10 July 2017 (08:26:31 CEST)
Kalliatakis, G.; Stergiou, A.; Vidakis, N. Conceiving Human Interaction by Visualising Depth Data of Head Pose Changes and Emotion Recognition via Facial Expressions. Computers 2017, 6, 25.
Abstract
Affective computing in general, and human activity and intention analysis in particular, is a rapidly growing field of research. Head pose and emotion changes present serious challenges when applied to a player’s training and ludology experience in serious games, to the analysis of customer satisfaction with broadcast and web services, or to monitoring a driver’s attention. Given the increasing prominence and utility of depth sensors, it is now feasible to perform large-scale collection of three-dimensional (3D) data for subsequent analysis. Discriminative random regression forests were selected in order to rapidly and accurately estimate head pose changes in unconstrained environments. To complete the secondary process of recognising four universal dominant facial expressions (happiness, anger, sadness and surprise), emotion recognition via facial expressions (ERFE) was adopted. A lightweight data exchange format (JavaScript Object Notation, JSON) is then employed to manipulate the data extracted from the two aforementioned settings. Motivated by the need to generate comprehensible visual representations from different sets of data, in this paper we introduce a system capable of monitoring human activity through head pose and emotion changes, utilising an affordable 3D sensing technology (the Microsoft Kinect sensor).
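As a minimal sketch of the JSON-based data exchange the abstract describes, the snippet below serialises a single per-frame record combining head pose angles and a recognised expression. The field names (`frame`, `head_pose`, `emotion`) and value conventions are illustrative assumptions, not the authors' actual schema.

```python
import json

# Hypothetical per-frame record: head pose angles (in degrees) from the
# regression-forest estimator, plus the expression label from ERFE.
# All field names here are assumed for illustration only.
frame_record = {
    "frame": 1024,
    "head_pose": {"yaw": 12.4, "pitch": -3.1, "roll": 0.8},
    "emotion": "happiness",  # one of: happiness, anger, sadness, surprise
}

# Serialise to a lightweight JSON string for exchange between components.
payload = json.dumps(frame_record)

# A visualisation component can then parse it back into native structures.
decoded = json.loads(payload)
print(decoded["head_pose"]["yaw"])  # → 12.4
```

A format like this keeps the capture and visualisation stages loosely coupled: either side only needs to agree on the record layout, not on a shared binary representation.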
Keywords
human activity analysis; human intention understanding; affective computing; data visualisation; depth data; head pose estimation; emotion recognition
Subject
Computer Science and Mathematics, Data Structures, Algorithms and Complexity
Copyright:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.