School of Engineering Research Seminar with Dr. Xiu Zhai

Date: Tuesday, January 12, 2021

Time: 1:00–2:00 PM



Neural Coding and Perception of Natural Sounds 

Dr. Xiu Zhai 

University of Connecticut 

1-2 pm 

Tuesday, January 12th, 2021 

Zoom meeting ID: 95219877084 

Being able to recognize and discriminate natural sounds, such as a running stream, a clapping crowd, or rustling leaves, is a critical task of the normally functioning auditory system. However, people with hearing loss show degraded auditory processing and reduced perceptual abilities. This difficulty is attributed to the complex physical structure of natural sounds and to the fact that they are not unique: they vary randomly, in a statistically defined manner, from one excerpt to another. Much of the previous work on the neural basis of sound coding has used relatively simple acoustic stimuli, such as tones and noise. Less is known about the neural representation and perception of the high-order structure of real-world sounds, and about how they are affected by hearing impairment. This presentation will introduce a recent study demonstrating that the spectrum and high-order summary statistics of natural sounds are reflected in the response statistics of auditory midbrain ensembles. Results from neural decoders are consistent with, and can predict, perceptual patterns in human listeners, suggesting that these neural response statistics may play dissociable roles in sound recognition and discrimination. Future directions in the investigation of sound coding and perception in normal and impaired auditory systems will be discussed. The findings can potentially contribute to the development of new sound-processing strategies for assistive listening devices (e.g., hearing aids, cochlear implants) and clinical diagnostic tools for individuals with hearing loss.

Biosketch: Dr. Xiu Zhai is currently a postdoctoral researcher in the Department of Electrical and Computer Engineering at the University of Connecticut. She received her Ph.D. in Biomedical Engineering from UConn in 2017, focusing on saccadic eye movements triggered by visual and auditory stimuli. Her postdoctoral work centers on the role of sound statistics in sound recognition and discrimination. Dr. Zhai’s current research interests include using neurophysiological, psychophysical, and computational approaches to study the neural coding and perception of everyday acoustic stimuli and to investigate how they are affected by hearing impairment, with the aim of helping people with hearing loss.

For more information, contact School of Engineering / 4147 /