TDLC: Music and the Brain
There is growing interest among TDLC scientists in the effects of music on the brain. Music is intrinsically temporal – it integrates sensory, motor and affective systems in the brain. Because of its temporal nature, it is an ideal focus for TDLC research, as well as a way to integrate across TDLC networks and initiatives.
On March 24, 2011, TDLC Co-Director Paula Tallal organized a TDLC-sponsored conference -- the Newark Workshop on Music, Brain and Education -- at Rutgers University. TDLC members Paula Tallal and Gyorgy Buzsáki were also involved in organizing a New York Academy of Sciences multidisciplinary conference, "Music, Science and Medicine," on March 25, 2011.
Auditory Processing and Language, Memory and Attention
Dr. Paula Tallal, who helped to organize the NYAS music conference, explains how music -- more specifically timing in the auditory system -- might affect language development: "Understanding the importance of auditory processing speed is really important for understanding how language works in the brain ... Children with language learning problems (or weak language development) can't sequence two simple tones that differ in frequency when they are presented rapidly in succession. They do absolutely fine when you present two tones separated further apart in time." She continues, "So the actual precision of timing in the auditory system determines what words we actually hear. In order to become a proficient reader and to learn how to spell, we need to hear these small acoustic differences in words and learn that it's those acoustic differences that actually go with the letters."
TDLC member April Benasich and her colleagues at Rutgers University are studying how infants only a few months old process sound. Using electroencephalographic (EEG) recordings, they have found that the way these infants' brains process sound may provide a way to predict later language difficulties. The researchers hope to develop interventions that could correct such early deficits.
Additional studies have revealed that musical training may aid language processing. Studies by Nadine Gaab, Assistant Professor of Pediatrics at Children's Hospital Boston and Harvard Medical School, have demonstrated that people with musical experience find it easier than non-musicians to detect small differences in word syllables. Musical experience improves the way the brain processes split-second changes in the sounds and tones used in speech, and consequently may strengthen the acoustic and phonetic skills needed for learning language and reading. "The brain becomes more efficient and can process more subtle auditory cues that occur simultaneously," she said. (Her presentation of this research at the NYAS 2011 conference is posted on The Science Network.) Another key investigator in the field of music and the brain, Nina Kraus of Northwestern University, studies the neurobiology underlying speech and music perception. She has found that "musical experience strengthens neural, perceptual and cognitive skills that undergird hearing speech in noise throughout the lifespan."
TDLC researchers Alexander Khalil, Victor Minces, and Andrea Chiba have observed a correlation between musical synchrony and attentional performance. Their pilot project, conducted at the Museum School (a San Diego City Schools charter school), demonstrated a significant correlation between the ability of 150 children to synchronize in an ensemble setting -- regardless of other musical abilities -- and their ability to "pay attention," or maintain focus, not only in music class but in other areas as well. Overall attentional performance was measured by standard psychometric tests and teacher questionnaires. Now that a relationship between the ability to synchronize musically and attentional performance has been established, and because musical synchrony can be learned, the research team seeks to determine whether a period of musical practice might translate into overall improvement in attentional performance. Please see the Gamelan Project website for more information about this study.
Three teams of researchers at UC San Diego's Swartz Center for Computational Neuroscience (SCCN) and Institute for Neural Computation (INC) are pioneering the new field of brain-computer interfaces (BCI). Scott Makeig, Tzyy-Ping Jung, and colleagues are developing EEG-based technology that links thoughts, commands and emotions from the brain to computers. "In addition to gadgets like mind-dialed cell phones, devices to assist the severely disabled and a cap to alert nodding-off air traffic controllers, new technology could reshape medicine."
In addition to the BCI studies, SCCN is involved in several other music projects. In one, PhD student Grace Leslie is working with Scott Makeig in their new Mobile Brain/Body Imaging (MoBI) laboratory to study how a person expresses the feeling of the music they are hearing through expressive 'conducting' gestures. In a related pilot project, Dr. Makeig and his team are using an instrumented violin bow to collect EEG, motion-capture, and bow-dynamics data from violinists attempting to express various feelings via simple open-string bowed violin tones.
Because studies show correlations between auditory processing ability and cognitive functions, investigators have begun to develop and test music-based interventions that might help improve children's cognitive abilities (e.g., the Gamelan Project).
TDLC's Paula Tallal explains that auditory language training, as well as musical training, has been shown to alter the functional anatomy of brain regions traditionally associated with speech and language processing. She explains, "behavioral data shows that musical training, as well as neuroplasticity-based acoustic training, significantly improves language and reading skills. Thus, one route by which music therapy may most significantly impact clinical populations is by improving dynamic auditory attention, sequencing and memory processes."
Now that a correlation has been found between music training and cognitive and language improvements, the next step is to create and test interventions that might improve cognitive and language skills in children at risk for, or struggling with, reading or other attentional tasks. Paula Tallal, as part of the Scientific Learning Corporation, has helped develop the Fast ForWord® Language and Reading products. The program consists of a series of computer-delivered brain-fitness exercises designed to help educators improve children's academic achievement. "After auditory language training of children with dyslexia," Dr. Tallal explains, "metabolic brain activity more closely resembles that of 'normal' readers, and reading improved enormously after intervention." In fact, improving neural capacities has been shown to improve student performance, independent of the content (language, math, science) or curriculum used (Tallal, 2004). So, Dr. Tallal explains, "even children's math scores improve tremendously in large clinical trials in the schools, even though we don't train math. We train the brain's precision to process auditory and language information. Our focus is not just auditory, it is not just music, but what that does for language and how important language is for ALL academic achievement."
The Role of Auditory Processing in Language Development and Disorders