
Our head movements convey emotions

Head movements play an important role in conveying emotions through speech and music. Let your head do the talking.

Published: 27 October 2015

When people talk or sing, they often nod, tilt or bow their heads to reinforce verbal messages. But how effective are these head gestures at conveying emotions?

Very effective, according to researchers from McGill University in Montreal. Steven R. Livingstone and Caroline Palmer, from McGill's Department of Psychology, found that people were highly accurate at judging emotions based on head movements alone, even in the absence of sound or facial expressions.

This finding suggests that visual information about emotional states available in head movements could aid in the development of automated emotion recognition systems or human-interaction robots, the researchers say. Expressive robots could potentially serve a range of functions, particularly where face-to-face communication is important, such as at hotel reception desks and as interactive care robots for the elderly.

Tracking movement, not sound

Using motion-capture equipment to track people's head movements in three dimensions, Livingstone and Palmer recorded vocalists while they spoke or sang with a variety of emotions. The researchers then presented these video clips to viewers without any sound, with the facial expressions of vocalists hidden so that only their head movements were visible. Viewers were then asked to identify the emotions that the vocalists intended to convey.
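The article does not describe the authors' analysis pipeline, but as a rough illustration of how head-movement recordings of this kind could feed the automated emotion-recognition systems mentioned below, here is a minimal Python sketch. The data, the feature choices and the classifier are all hypothetical assumptions for illustration, not the study's method.

# Illustrative sketch only: assumes hypothetical motion-capture data in which
# each trial is a time series of 3D head positions (frames x 3) labeled with
# the emotion the vocalist intended to convey.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def head_motion_features(trial):
    """Summarize one trial (frames x 3 array of x, y, z head positions)
    with simple statistics of position and frame-to-frame velocity."""
    velocity = np.diff(trial, axis=0)
    return np.concatenate([
        trial.mean(axis=0), trial.std(axis=0),   # average posture and its spread
        np.abs(velocity).mean(axis=0),           # how much the head moves overall
        np.abs(velocity).max(axis=0),            # peak movement speed
    ])

# Placeholder dataset: 200 random trials of 300 frames each with random labels,
# standing in for real motion-capture recordings.
rng = np.random.default_rng(0)
trials = [rng.normal(size=(300, 3)) for _ in range(200)]
labels = rng.choice(["happy", "sad", "neutral"], size=200)

X = np.vstack([head_motion_features(t) for t in trials])
scores = cross_val_score(RandomForestClassifier(random_state=0), X, labels, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")

With real labeled recordings in place of the random placeholders, the cross-validated accuracy would indicate how much emotion information simple head-movement features carry.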

"We found that when people talk, the ways in which they move their head reveal the emotions that they're expressing. We also found that people are remarkably accurate at identifying a speaker's emotion, just by seeing their head movements," says Palmer, who holds the Canada Research Chair in Cognitive Neuroscience of Performance.

Research idea emerged from a noisy pub

"While the head movements for happy and sad emotions differed, they were highly similar across speech and song, despite differences in vocal acoustics," says Livingstone, a former postdoctoral fellow in the Palmer lab and now a postdoctoral fellow at McMaster University. "Although the research was based on North American English speakers, the focus on head movements creates the possibility for studying emotional communication in contexts where different languages are spoken."

The idea for the study emerged from a noisy pub. "One night in Montreal I was in a bar with my lab mates," explains Livingstone. "It was a lively evening, with lots of people, dim lights, and some very loud music. At one point my friend started to talk to me; I knew he was excited, though I couldn't make out what he was saying or see his face clearly. Suddenly I realized it was the animated way that he was bobbing his head that told me what he was trying to say."

Adds Palmer, "Our discovery may lead to new applications in situations where sound is not available, such as automated recognition of emotional states in crowd behavior or in hearing impairments, by making use of head movements when watching someone talk. It also has applications in computing and robotics, where the addition of expressive head movements may help make humanoid robots more lifelike and approachable."

This research was funded in part by an NSERC-CREATE Postdoctoral Fellowship to Dr Steven R. Livingstone and by a Canada Research Chair and NSERC Grant to Dr Caroline Palmer.

"Head Movements Encode Emotions During Speech and Song", Steven R. Livingstone and Caroline Palmer, Emotion, DOI:
