Robotic Emotion Detection from Gesture Recognition: A Framework for Real-Time Analysis
Robotic emotion detection has been a prevalent focus in robotics for decades and is in many ways a critical junction for the future of human-robot interaction. The ability of a robot to recognize the nuances inherent in human emotion will largely determine how well it can respond to requests, tasks, and environmental stimuli at large. By leveraging advanced imaging technologies and biological signal processing, the field of robotics is closer than ever to mapping the human emotional spectrum into a construct amenable to robotic understanding. The present study investigates upper-body motion analysis as a source of real-time emotion recognition. Eleven participants were read twelve scenarios and asked to move in a manner that portrayed their emotional state in each scenario; they were then asked to select the one of twelve emotions that most closely matched the state they were trying to depict. All motions and upper-body skeletal data were recorded with a Microsoft Kinect 2. Based on Laban Movement Analysis methods, equations were derived to interpret movement from these data and the participant-selected emotions. After dividing the data into 5-second segments to develop a framework for real-time analysis, principal component analysis was performed to identify the critical elements, and its outcome guided the choice of machine learning technique. Using a complex tree model, 69.9% accuracy was observed on the 5-second segment data.
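The pipeline the abstract describes can be sketched in outline: summarize each 5-second segment of skeletal data as a feature vector, reduce dimensionality with principal component analysis, and classify with a decision tree. The sketch below uses synthetic stand-in data and scikit-learn; the feature count, variance threshold, and split are illustrative assumptions, not the study's actual parameters.

```python
# Hypothetical sketch of the described pipeline: per-segment movement
# features -> PCA -> decision tree. All data here are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Stand-in data: 240 five-second segments, each summarized by 30
# Laban-inspired movement features (assumed count, for illustration).
n_segments, n_features, n_emotions = 240, 30, 12
X = rng.normal(size=(n_segments, n_features))
y = rng.integers(0, n_emotions, size=n_segments)
# Inject a weak class-dependent signal so the tree has something to learn.
X[:, 0] += 0.5 * y

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# PCA retains components explaining 95% of variance (assumed threshold);
# an unconstrained ("complex") decision tree classifies each segment.
model = make_pipeline(PCA(n_components=0.95),
                      DecisionTreeClassifier(random_state=0))
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
print(f"segment-level accuracy: {accuracy:.3f}")
```

On real per-segment features, accuracy would be evaluated per 5-second window exactly as in this sketch, which is what makes the approach compatible with real-time use.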