Acquisition and indexing of RGB-D recordings for facial expressions and emotion recognition

Mariusz Szwoch

Abstract


This paper describes KinectRecorder, a comprehensive tool that provides convenient and fast acquisition, indexing, and storage of RGB-D video streams from the Microsoft Kinect sensor. The application is especially useful as a supporting tool for creating fully indexed databases of facial expressions and emotions, which can then be used for training and testing emotion recognition algorithms in affect-aware applications. KinectRecorder was successfully used to create the Facial Expression and Emotion Database (FEEDB), significantly reducing the time of the whole project of data acquisition, indexing, and validation. FEEDB has already served as a training and testing dataset for several emotion recognition algorithms, which confirmed the utility of both the database and the KinectRecorder tool.
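
For illustration only, the short sketch below shows how synchronized color and depth frames can be grabbed from a Kinect-class sensor and written to disk for later indexing. It is not the KinectRecorder implementation described in the paper; it assumes Python with an OpenCV build that includes OpenNI2 support, and the function name record_rgbd and its parameters are hypothetical.

# Illustrative RGB-D capture sketch (not the KinectRecorder tool itself).
# Assumes OpenCV compiled with OpenNI2 support and a connected Kinect-class sensor.
import cv2

def record_rgbd(num_frames=100, prefix="rgbd_sample"):
    # Open the first OpenNI2-compatible RGB-D sensor.
    cap = cv2.VideoCapture(cv2.CAP_OPENNI2)
    if not cap.isOpened():
        raise RuntimeError("No OpenNI2-compatible RGB-D sensor found")
    for i in range(num_frames):
        if not cap.grab():  # grab one synchronized RGB-D frame
            break
        ok_d, depth = cap.retrieve(None, cv2.CAP_OPENNI_DEPTH_MAP)  # 16-bit depth map (mm)
        ok_c, color = cap.retrieve(None, cv2.CAP_OPENNI_BGR_IMAGE)  # 8-bit BGR color image
        if ok_d and ok_c:
            cv2.imwrite("%s_%04d_color.png" % (prefix, i), color)
            cv2.imwrite("%s_%04d_depth.png" % (prefix, i), depth)  # PNG preserves 16-bit depth
    cap.release()

if __name__ == "__main__":
    record_rgbd()

Storing the depth frames as 16-bit PNG keeps the sensor's full millimetre precision, which matters when the recordings are later indexed and reused as training data.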

Keywords


RGB-D data acquisition and indexing; facial expression recognition; emotion recognition






DOI: http://dx.doi.org/10.21936/si2015_v36.n1.718