TDLC: Music and the Brain

Music is the one incorporeal entrance into the higher world of knowledge which comprehends mankind but which mankind cannot comprehend. - Ludwig van Beethoven

There is growing interest among TDLC scientists in the effects of music on the brain. Music is intrinsically temporal – it integrates sensory, motor and affective systems in the brain. Because of its temporal nature, it is an ideal focus for TDLC research, as well as a way to integrate across TDLC networks and initiatives.

Music can affect the brain in many ways, many of which are just now being studied. Music therapy — the clinical application of music to treat a wide range of diagnoses using physiological and medical approaches — has advanced dramatically over the past decade. It is proving to be an effective clinical tool across a wide range of conditions and applications, including Alzheimer's disease, autism, post-traumatic stress disorder, dementia, stroke recovery, care of NICU infants, language acquisition, dyslexia, pain management, stress and anxiety, and coma.

Recently, TDLC members were involved in organizing two conferences about music and the brain. The first, held on March 24, 2011 -- the Newark Workshop on Music, Brain and Education at Rutgers University -- was sponsored by TDLC and organized by TDLC co-Director Paula Tallal. The second -- the New York Academy of Science multidisciplinary conference on "Music, Science and Medicine" -- occurred the next day, on March 25, 2011. TDLC PIs Paula Tallal and Gyorgy Buzsáki were involved in organizing the NYAS conference, with the main organizer being Dr. Dorita Berger, Editor-in-Chief of the online Journal of BioMusical Engineering. This landmark meeting explored the connection between recent scientific findings and their possible application to clinical music and physiological function. The ultimate goal of the conference was to bring together experts studying music in human adaptive function, physiological sciences, neuroscience, neurology, medical research, psychology, music education, and other related disciplines, and to promote collaborative research, communication, and translation of scientific research into music-based clinical treatments of disease. (Please click here for conference abstracts. To view conference talks on The Science Network please click here).


Auditory Processing and Language, Memory and Attention

Dr. Paula Tallal, who helped to organize the NYAS music conference, explains how music -- more specifically timing in the auditory system -- might affect language development: "Understanding the importance of auditory processing speed is really important for understanding how language works in the brain ... Children with language learning problems (or weak language development) can't sequence two simple tones that differ in frequency when they are presented rapidly in succession. They do absolutely fine when you present two tones separated further apart in time." She continues, "So the actual precision of timing in the auditory system determines what words we actually hear. In order to become a proficient reader and to learn how to spell, we need to hear these small acoustic differences in words and learn that it's those acoustic differences that actually go with the letters."
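The two-tone sequencing task Dr. Tallal describes can be illustrated with a toy simulation. This is only a sketch of the paradigm's logic, not her actual experimental protocol: the interstimulus intervals and the "temporal resolution" threshold below are illustrative values, and the simulated listener simply guesses whenever the gap between tones falls below that threshold.

```python
import random

TONE_ORDERS = [("high", "low"), ("low", "high")]

def make_trial(isi_ms):
    """Build a two-tone trial: a random tone order and a gap (ISI) in ms."""
    return {"order": random.choice(TONE_ORDERS), "isi_ms": isi_ms}

def simulated_listener(trial, resolution_ms):
    """Report the tone order. The listener resolves the order only when the
    gap exceeds their temporal resolution; otherwise they guess at chance."""
    if trial["isi_ms"] >= resolution_ms:
        return trial["order"]                 # correctly perceived
    return random.choice(TONE_ORDERS)         # too fast to resolve: guess

def accuracy(resolution_ms, isi_ms, n_trials=2000):
    """Proportion of trials where the reported order matches the true order."""
    hits = 0
    for _ in range(n_trials):
        trial = make_trial(isi_ms)
        hits += simulated_listener(trial, resolution_ms) == trial["order"]
    return hits / n_trials

# A listener with fine temporal resolution (50 ms) is perfect at a 150 ms gap;
# a listener needing 300 ms between tones performs near chance at the same gap.
typical = accuracy(resolution_ms=50, isi_ms=150)
impaired = accuracy(resolution_ms=300, isi_ms=150)
```

The point of the sketch is the interaction Tallal describes: the same pair of tones is trivial or impossible to sequence depending on how the gap between them compares with the listener's temporal resolution.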

Another TDLC member, April Benasich, and her colleagues at Rutgers University are studying how infants only a few months old process sound. Using electroencephalographic recording, they have found that the way these infants process sound in their brains may provide a way to predict later language difficulties. The researchers hope to develop interventions that might correct any early deficiencies (please click here for more).

Additional studies have revealed that musical training may help language processing. Studies by Nadine Gaab, Assistant Professor of Pediatrics at Children's Hospital Boston and Harvard Medical School, have demonstrated that people with musical experience find it easier than non-musicians to detect small differences in word syllables. Musical experience improved the way people's brains process split-second changes in sounds and tones used in speech, and consequently, may affect the acoustic and phonetic skills needed for learning language and reading. "The brain becomes more efficient and can process more subtle auditory cues that occur simultaneously," she said. (Please click here to listen to Nadine Gaab present her recent research at the NYAS 2011 Conference, posted on The Science Network). Another key investigator in the field of music and the brain, Nina Kraus from Northwestern University, is studying the neurobiology underlying speech and music perception. She has found that "musical experience strengthens neural, perceptual and cognitive skills that undergird hearing speech in noise throughout the lifespan."

Gamelan Project

TDLC researchers Alexander Khalil, Victor Minces, and Andrea Chiba have observed a correlation between musical synchrony and attentional performance. Their pilot project, conducted at the Museum School (a San Diego City Schools charter school), demonstrated a significant correlation between the ability of 150 children to synchronize in an ensemble setting—regardless of other musical abilities—and their ability to "pay attention" or maintain focus not only in music class but in other areas as well. This increase in overall attentional performance was measured by standard psychometric tests and teacher questionnaires. Now that a relationship between the ability to synchronize musically and attentional performance has been established, and because musical synchrony can be learned, the research team seeks to determine whether a period of musical practice might translate to overall improvement in attentional performance. Please see the Gamelan Project website for more information about this study.

Another TDLC PI, Dr. Isabel Gauthier, and her graduate student Yetta Wong are interested in the holistic processing of musical notation. They studied brain activity in people with various degrees of musical experience, and were surprised by exactly how much of the brain becomes engaged in the simple act of perceiving a single note, especially in advanced musicians. (For more, please see "The Musical Brain Sees Faster").

The Brain Computer Interface Converts Emotions into Music

Three teams of researchers at UC San Diego's Swartz Center for Computational Neuroscience (SCCN) and Institute for Neural Computation (INC) are pioneering a new field called Brain Computer Interface (BCI). Scott Makeig, Tzyy-Ping Jung, and colleagues are developing technology that links thoughts, commands and emotions from the brain to computers, using EEG. "In addition to gadgets like mind-dialed cell phones, devices to assist the severely disabled and a cap to alert nodding-off air traffic controllers, new technology could reshape medicine."

Scott Makeig, Director of SCCN, has integrated music into his BCI research. His studies use the Brain Computer Interface to read emotions and convert those emotions into musical tones. In his "Quartet for Brain and Trio" Project ("Just: A Suite for Flute, Violin, Cello, and Brain"), Dr. Makeig composed the music and performed on violin, accompanied by a flutist, a cellist, and a so-called "brainist". The "brainist," cognitive science graduate student Tim Mullen, focused on one of five distinct emotional states, feeling it fully inside his body. When he entered into a certain feeling, sensors relayed his brain signals to the Mobile Brain/Body Imaging (MoBI) laboratory, which converted the emotion into a similar-feeling tone complex. The musicians then played the passage that corresponded to that ground tone complex. This research demonstrates that a computer can decode primal emotions and the brain can communicate these feelings through music, without even lifting a hand.

In addition to the BCI studies, SCCN is involved in several other music projects. In one, PhD student Grace Leslie is working with Scott Makeig to study how a person expresses the feeling of music they are hearing through expressive 'conducting' gestures, using the new Mobile Brain/Body Imaging (MoBI) laboratory. In a related pilot project, Dr. Makeig and his team are using an instrumented violin bow to collect EEG, motion capture, and bow dynamics from violinists attempting to express various feelings via simple open-string bowed violin tones.

Music-Based Interventions

Because studies show correlations between auditory processing ability and cognitive functions, investigators have begun to develop and test music-based interventions that might help improve children's cognitive abilities (e.g. The Gamelan Project).

Researchers are finding that music training correlates with cognitive and language improvements. Laurel Trainor, director of the Institute for Music and the Mind at McMaster University in West Hamilton, Ontario, and colleagues compared preschool children who had taken music lessons with those who did not. Those with some training showed larger brain responses on a number of sound recognition tests given to the children. Her research indicated that musical training appears to modify the brain's auditory cortex. Even a year or two of music training led to enhanced levels of memory and attention (when measured by the same type of tests that monitor electrical and magnetic impulses in the brain). Harvard University researcher Gottfried Schlaug found a correlation between early-childhood training in music and enhanced motor and auditory skills as well as improvements in verbal ability and nonverbal reasoning.

TDLC researcher Terry Jernigan is involved in a new study to explore the impact of musical/symphonic training on cognitive and brain development in children in Chula Vista elementary schools. The study involves a new partnership between The Neurosciences Institute (represented by Aniruddh Patel and John Iversen), The San Diego Youth Symphony (led by Dalouge Smith), and UC San Diego (TDLC's Terry Jernigan at the Center for Human Development). The team is especially interested in how musical training impacts the development of language, attention, and executive function, and the brain networks that support these abilities. The project builds on the strengths of the three participating organizations: NSI (~15 years of research on music neuroscience), the San Diego Youth Symphony (extensive experience in music education) and Dr. Jernigan's lab (leading experts in cognitive and brain development). The researchers, who are looking into possible funding sources, are currently running pilot studies in two Chula Vista elementary schools, with children primarily learning string instruments (e.g., violin). The team plans to use behavioral cognitive tests and structural brain imaging.

TDLC's Paula Tallal explains that auditory language training, as well as musical training, has been shown to alter the functional anatomy of the brain that is traditionally associated with speech and language processing. She explains, "behavioral data shows that musical training, as well as neuroplasticity-based acoustic training, significantly improves language and reading skills. Thus, one route by which music therapy may most significantly impact clinical populations is by improving dynamic auditory attention, sequencing and memory processes."

Now that a correlation has been found between music training and cognitive and language improvements, the next step is to create and test different interventions that might help improve cognitive and language skills in children at risk for, or struggling with, reading or other attention-demanding tasks. Paula Tallal, as part of the Scientific Learning Corporation, has helped develop the Fast ForWord® Language and Reading products. The program consists of a series of computer-delivered brain fitness exercises to help educators improve children's academic achievement. "After auditory language training of children with dyslexia," Dr. Tallal explains, "metabolic brain activity more closely resembles 'normal' readers, and reading improved enormously after intervention." In fact, improving neural capacities has been shown to improve student performance, independent of content (language, math, science) or curriculum used (Tallal, 2004). So, Dr. Tallal explains, "even children's math scores improve tremendously in large clinical trials in the schools, even though we don't train math. We train the brain's precision to process auditory and language information. Our focus is not just auditory, it is not just music, but what that does for language and how important language is for ALL academic achievement."


Additional Information - NYAS Conference "Music and the Brain" (Lectures by TDLC PIs Dr. Tallal and Dr. Buzsáki):

The Role of Auditory Processing in Language Development and Disorders
Paula Tallal, PhD, Rutgers University

Neural syntax: what does music offer to neuroscience (and vice versa)
Gyorgy Buzsáki, MD, PhD, Rutgers University


> Additional lectures from the NYAS Music, Science and Medicine Conference (on The Science Network)