Language Magazine is a monthly print and online publication that provides cutting-edge information for language learners, educators, and professionals around the world.

Study Finds Visual Rhythms in Sign Language

A new study has found that patterns exist within the visual structure of sign language. The study, published in the Proceedings of the National Academy of Sciences (PNAS) by Geoffrey Brookshire, Jenny Lu, Howard C. Nusbaum, Susan Goldin-Meadow, and Daniel Casasanto, sought to determine whether sign language exhibits temporal patterns similar to those of auditory languages.

When people listen to a language, they “entrain,” or lock in, with the speaker’s speech patterns. The study states, “By showing similar entrainment of visual cortex to sign language, we establish that this phase-locking is not due to specific properties of auditory cortex or of oral speech perception. Rather, low-frequency entrainment is a generalized cortical strategy for boosting perceptual sensitivity to informational peaks in time-varying signals.” Essentially, this means that language production forms rhythmic patterns whether the language is spoken or visual.

These patterns serve a purpose: when people listen to conversations, their brains “phase lock” with the rhythm of the speaker’s volume. This means that even if listeners are distracted by outside stimuli, they can track the speaker’s volume and anticipate when important information will be said.

The study looked at users of American Sign Language, German Sign Language, British Sign Language, and Australian Sign Language, as all of these languages are genetically unrelated. The researchers created a metric to measure visual change over time and used it to quantify periodic fluctuations in sign language.
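The core idea of such a metric is to quantify how much the visual signal changes from moment to moment and then to find the dominant rhythm in that change signal. As a rough illustration only (not the authors' actual method or code), here is a minimal sketch in Python with NumPy; the frame array and the 2 Hz synthetic stimulus are invented for demonstration:

```python
import numpy as np

def visual_change(frames):
    """Mean absolute pixel difference between consecutive frames,
    a simple stand-in for a 'visual change over time' metric."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.reshape(len(diffs), -1).mean(axis=1)

def dominant_frequency(signal, fps):
    """Frequency (in Hz) carrying the most spectral power,
    ignoring the DC component."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[1:][np.argmax(spectrum[1:])]

# Synthetic "video": an 8x8 patch whose brightness oscillates at
# 2 Hz, recorded at 30 frames per second for 10 seconds.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
brightness = 128 + 50 * np.sin(2 * np.pi * 2.0 * t)
frames = np.tile(brightness[:, None, None], (1, 8, 8))

change = visual_change(frames)
# The rectified change of a 2 Hz oscillation fluctuates at 4 Hz,
# so the dominant rhythm of the change signal lands near 4 Hz.
peak = dominant_frequency(change, fps)
```

A real analysis would of course operate on actual sign-language video and relate the extracted rhythm to brain activity; this sketch only shows the frequency-extraction step.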

Prior results suggest that auditory and visual perception may be modulated differently by rhythms at different frequencies. In speech, listeners begin to phase lock at frequencies below 8 Hz, or eight pulses per second. The researchers found that visual perception of sign also has a frequency range in which people begin to lock in, at around 10 Hz.

“By looking at sign, we’re learning something about how the brain processes language more generally. We’re solving a mystery we couldn’t crack by studying speech alone,” Casasanto told Science Daily.

“This is an exciting finding because scientists have been theorizing for years about how adaptable or flexible entrainment may be, but we were never sure if it was specific to auditory processing or if it was more general purpose,” Brookshire added. “This study suggests that humans have the ability to follow perceptual rhythms and make temporal predictions in any of our senses.”
