
This article was published on November 5, 2019

This music venue can track brain responses during performances

All evidence indicates that music plays a significant role in every human society, both past and present. When we gather to celebrate, rejoice or mourn, music moves us in powerful ways. Caregivers around the world sing to infants to soothe, play with, and teach them. And yet we are just starting to uncover the profound impact music has on our brain, our emotions and our health.

Laurel Trainor is the director of the McMaster Institute for Music and the Mind and plays principal flute in the Burlington Symphony Orchestra. Her research examines how music is processed in the brain, how musicians co-ordinate non-verbally and the role music plays in early development from multiple perspectives including perceptual, cognitive, social and emotional development.

Dan Bosnyak’s research focuses on neural plasticity in the human auditory system, in particular the neural correlates of tinnitus and peripheral hearing loss.

Concert laboratory

The Large Interactive Virtual Environment Laboratory (LIVELab) at McMaster University is a research concert hall. It functions as both a high-tech laboratory and theatre, opening up tremendous opportunities for research and investigation.

As the only facility of its kind in the world, the LIVELab is a 106-seat concert hall equipped with dozens of microphones, speakers and sensors that measure brain responses, physiological responses such as heart rate, breathing and perspiration, and the movements of multiple musicians and audience members at the same time.

Engineers, psychologists, and clinician-researchers from many disciplines work alongside musicians, media artists, and industry to study performance, perception, neural processing, and human interaction.

In the LIVELab, acoustics are digitally controlled, so the experience can change instantly from nearly silent, with almost no reverberation, to a noisy restaurant, a subway platform or the acoustics of Carnegie Hall.
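The article doesn’t describe the LIVELab’s system in technical detail, but the basic idea behind digitally imposing a room’s character on a dry signal can be sketched as a convolution reverb: a dry recording is convolved with a room impulse response. The snippet below is a hypothetical offline illustration; the test tone, sample rate and decay times are assumed values, not LIVELab parameters.

```python
# Hypothetical sketch: simulate different room acoustics by convolving a dry
# (reverb-free) signal with a room impulse response (IR). This illustrates the
# general principle only; a real-time active-acoustics system over a speaker
# array is far more sophisticated.
import numpy as np
from scipy.signal import fftconvolve

def apply_room(dry: np.ndarray, impulse_response: np.ndarray) -> np.ndarray:
    """Convolve a dry signal with a room IR and normalise to avoid clipping."""
    wet = fftconvolve(dry, impulse_response, mode="full")
    return wet / np.max(np.abs(wet))

sr = 44_100                                    # assumed sample rate (Hz)
t = np.arange(sr) / sr                         # one second of audio
dry = np.sin(2 * np.pi * 440 * t)              # dry 440 Hz test tone
ir_dry_room = np.exp(-t / 0.05) * np.random.randn(sr)  # ~0.05 s decay: almost no reverb
ir_hall = np.exp(-t / 0.8) * np.random.randn(sr)       # ~0.8 s decay: hall-like reverberation

almost_anechoic = apply_room(dry, ir_dry_room)
hall_sound = apply_room(dry, ir_hall)
```

Swapping the impulse response swaps the simulated room; an active system does the equivalent continuously, picking up the performance with microphones and re-injecting computed reflections through the speakers.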

Studying musical experiences

At the LIVELab, researchers study how we make music together, how musicians coordinate their movements and synchronize their brains in order to perform, and why people enjoy attending live performances when they could have better sound fidelity at home. By better understanding what is happening in the brain during music listening and playing, we can explore wider health benefits.

Real-time physiological data such as heart rate can be synchronized with data from other systems such as motion capture, and monitored and recorded from both performers and audience members at once. Data that used to take weeks or months to collect in a traditional lab can now be gathered in the LIVELab in a few hours, and measuring multiple people simultaneously is pushing forward our understanding of real-time human interaction.
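The article doesn’t specify how these streams are synchronized, but the core task, putting signals recorded at different rates onto one shared timeline, can be sketched simply. The stream names and sampling rates below are illustrative assumptions, not the LIVELab’s actual acquisition setup.

```python
# Hypothetical sketch: align a slowly sampled physiological signal (heart rate)
# with a fast motion-capture stream by interpolating onto a common clock.
import numpy as np

def resample_to(timestamps, values, target_timestamps):
    """Linearly interpolate a recorded signal onto a target set of timestamps."""
    return np.interp(target_timestamps, timestamps, values)

# Assumed recording: 60 s of motion capture at 120 Hz, heart rate at 4 Hz
mocap_t = np.arange(0, 60, 1 / 120)            # motion-capture frame times (s)
heart_t = np.arange(0, 60, 1 / 4)              # heart-rate sample times (s)
heart_rate = 70 + 5 * np.sin(2 * np.pi * heart_t / 30)   # synthetic heart-rate data

heart_on_mocap_clock = resample_to(heart_t, heart_rate, mocap_t)
# Every motion-capture frame now has a matching heart-rate value, so the two
# streams can be analysed together frame by frame.
```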

Consider the implications of how music might help people with Parkinson’s disease to walk more smoothly or children with dyslexia to read better.

Behind the music

In order to examine how musicians communicate non-verbally, researchers at the LIVELab measured the body sway of each musician in a string quartet as they played together. Body sway is not necessary to play the instrument, but it reflects thought processes involved in planning what the musician is going to play next.

For musicians to stay together and create a cohesive performance, each needs to predict what the others will do next in terms of micro-timing, phrasing, dynamics and articulation. If they wait to hear what the other musicians do, it will be too late.

In our recent paper, published in the Proceedings of the National Academy of Sciences, we used mathematical models to show that we could predict the body sway of one musician from how another musician had just moved, indicating that body sway reflects communication between the musicians. In addition, musicians assigned to be leaders during particular pieces influenced followers more than vice versa.
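The paper’s models aren’t reproduced here, but the underlying idea, testing whether one musician’s recent sway improves the prediction of another’s beyond that musician’s own past, is the logic of a Granger-style lagged regression. The sketch below applies that logic to synthetic sway signals; the lag count and signals are assumptions for illustration, not the authors’ analysis code.

```python
# Hypothetical sketch: does the "leader's" past body sway help predict the
# "follower's" sway beyond the follower's own past? (Granger-style comparison.)
import numpy as np

def lagged_design(series, lags):
    """Stack the previous `lags` samples of a series as regression columns."""
    return np.column_stack([series[lags - k - 1 : len(series) - k - 1]
                            for k in range(lags)])

def prediction_error(follower, leader=None, lags=8):
    """Mean squared error of predicting `follower` from its own past,
    optionally adding the `leader`'s past as extra predictors."""
    y = follower[lags:]
    X = lagged_design(follower, lags)
    if leader is not None:
        X = np.hstack([X, lagged_design(leader, lags)])
    X = np.hstack([np.ones((len(y), 1)), X])          # intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.mean((y - X @ beta) ** 2)

# Synthetic sway signals: the follower partly echoes the leader with a short lag
rng = np.random.default_rng(0)
leader = rng.standard_normal(2000).cumsum()
follower = 0.6 * np.roll(leader, 5) + 0.5 * rng.standard_normal(2000).cumsum()

err_alone = prediction_error(follower)
err_with_leader = prediction_error(follower, leader)
# A clear drop in error when the leader's past is included suggests the
# leader's movements carry predictive information about the follower's sway.
print(f"error alone: {err_alone:.3f}, with leader: {err_with_leader:.3f}")
```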

Another study shows that people experience live performances differently from pre-recorded ones, moving more often and more in sync with one another during live music.

Capturing the motions of a string quartet performance. Laurel Trainor, Author provided

In another paper, published in the journal Scientific Reports, we examined how musicians intuitively coordinate with one another during a performance to achieve a common emotional expression. Each performer of a piano trio was fitted with motion capture markers to track their movements while they played happy or sad musical excerpts, once with musical expression and once without.

We found that musicians predicted each other’s movements across both happy and sad excerpts to a greater degree when they played expressively than when they played without emotion. Importantly, across the pieces, the greater the communication among members of the trio, the higher other musicians rated the quality of the performance.

This technique for measuring communication between musicians has much wider implications. It could be applied to other situations, such as communication between non-verbal elderly patients with dementia or between autistic children and their caregivers. We can even predict, from body sway communication between people engaged in speed dating, who will match and want to see each other again.

Addressing hearing loss

Another important area of ongoing research is the effectiveness of hearing aids. By the age of 60, nearly 49 percent of people will suffer from some hearing loss. People who wear hearing aids are often frustrated when listening to music because the hearing aids distort the sound and cannot deal with the dynamic range of the music.

The LIVELab is working with the Hamilton Philharmonic Orchestra to solve this problem. During a recent concert, researchers evaluated new ways of delivering sound directly to participants’ hearing aids to enhance sounds.

Researchers hope new technologies can not only increase live musical enjoyment but also alleviate the social isolation caused by hearing loss.

Imagine the possibilities for understanding music and sound: how it might help to slow cognitive decline, manage social performance anxiety, help children with developmental disorders, aid in the treatment of depression or keep the mind focused. Every time we conceive and design a study, we think of new possibilities.

This article is republished from The Conversation by Laurel Trainor, Professor, McMaster University and Dan J. Bosnyak, Research Scientist, Technical Director, McMaster University under a Creative Commons license. Read the original article.
