Person Walking (Image: Pixabay, https://pixabay.com/photos/person-walking-silhouette-away-man-3556090/)

A method developed by researchers from the University of North Carolina at Chapel Hill and the University of Maryland at College Park uses people's walking styles to identify their emotions. The method extracts a person's gait from an RGB video of them walking, and the gait is then analyzed and classified as one of four emotions: happy, sad, angry, or neutral.

Lead author Tanmay Randhavane, a graduate student at UNC, emphasizes the significance of emotions in defining our experiences and in how people interact with one another. People tend to communicate differently depending on another individual's emotions.

Conventional tools for recognizing and identifying emotions rely on analyzing facial expressions or voice recordings. Previous studies, however, have suggested that body language is also an indicator of a person's feelings. This finding became one of the bases for the researchers' tool, which identifies an individual's perceived emotion from their walking style.

"The main advantage of our perceived emotion recognition approach is that it combines two different techniques," Randhavane said. "In addition to using deep learning, our approach also leverages the findings of psychological studies. A combination of both these techniques gives us an advantage over the other methods."

A person's walk is captured in an RGB video, from which a series of 3-D poses is extracted. These 3-D poses are analyzed using a long short-term memory (LSTM) recurrent neural network together with a random forest (RF) classifier, and the person's perceived emotion is identified.
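To make this stage concrete, here is a minimal sketch, assuming a PyTorch LSTM and placeholder shapes (16 joints with x, y, z coordinates per frame); it is not the authors' implementation, only an illustration of how a sequence of 3-D poses can be summarized into a fixed-length feature vector.

```python
# Minimal sketch (not the authors' code): an LSTM that summarizes a
# sequence of 3-D poses into a fixed-length "deep feature" vector.
# Shapes are illustrative assumptions: 16 joints x 3 coordinates per frame.
import torch
import torch.nn as nn


class GaitLSTM(nn.Module):
    def __init__(self, num_joints=16, hidden_size=128):
        super().__init__()
        # Each frame is flattened into a vector of joint coordinates.
        self.lstm = nn.LSTM(input_size=num_joints * 3,
                            hidden_size=hidden_size,
                            batch_first=True)

    def forward(self, poses):
        # poses: (batch, frames, num_joints * 3)
        _, (h_n, _) = self.lstm(poses)
        # Use the final hidden state as the deep feature for the gait.
        return h_n[-1]  # (batch, hidden_size)


if __name__ == "__main__":
    # A synthetic gait: 1 video, 75 frames, 16 joints with (x, y, z) each.
    gait = torch.randn(1, 75, 16 * 3)
    deep_features = GaitLSTM()(gait)
    print(deep_features.shape)  # torch.Size([1, 128])
```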

"The LSTM is initially trained on a series of deep features, but these are later combined with affective features computed from the gaits using posture and movement cues. All of these features are ultimately classified using the RF classifier," according to TechExplore.

The team of researchers obtained 80 percent accuracy in identifying perceived emotions on a dataset of videos of people walking. Moreover, their model improved the accuracy of identifying perceived emotions by 14 percent compared to existing recognition methods.

The researchers do not claim that their model can predict a person's actual emotions; it only approximates the emotion an observer would perceive from that walking style. The approach has a range of potential applications, including better human perception for robots and autonomous vehicles, as well as improved surveillance.