Sensory Neuroengineering

Tobias Reichenbach
Department of Bioengineering, Imperial College London

EEG setup

My group works on the biophysics of hearing and on auditory neuroscience, at the interface of science, technology and medicine.

We use ideas from theoretical physics, mathematics, and computer science in combination with ear and brain imaging to investigate principles of human auditory signal detection and processing. Together with clinical collaborators we also investigate auditory and language impairments.

We aim to apply our findings both in novel bio-inspired technology and in technology for diagnosing and rehabilitating hearing and communication impairments.

The group is part of the Department of Bioengineering at Imperial College London. We are funded by EPSRC, the Wellcome Trust, and the Royal Society.

News

Neural responses to speech can help to diagnose brain injury

Brain imaging

Brain injury, such as from traffic or sports accidents, can lead to severe disorders, including disorders of consciousness. These disorders are currently diagnosed through behavioural assessment, but this approach fails when patients are unable to respond overtly. We investigated whether neural responses to speech, measured with clinically applicable EEG, can aid the diagnosis of disorders of consciousness. We focussed on the neural tracking of the speech envelope, which can index attention to speech as well as speech comprehension. We found that the latency of the neural envelope tracking was related to the severity of the disorder of consciousness: patients in a vegetative state without signs of consciousness showed neural responses to the speech envelope that were significantly delayed compared to patients who exhibited consciousness.
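
The sketch below illustrates what such envelope tracking can look like computationally. It is only an illustration under assumed choices: the sampling rates, filter settings, function names and the simple cross-correlation latency estimate are not the analysis pipeline used in the paper.

# Illustrative Python sketch (assumptions, not the published pipeline):
# compute a speech envelope and estimate the latency at which EEG tracks it.
import numpy as np
from scipy.signal import hilbert, butter, filtfilt, correlate

fs = 100.0  # assumed common sampling rate (Hz) for the envelope and the EEG

def speech_envelope(audio, fs_audio, fs_out=fs):
    # Broadband envelope: magnitude of the analytic signal, low-pass
    # filtered to keep the slow (< 8 Hz) fluctuations, then downsampled.
    env = np.abs(hilbert(audio))
    b, a = butter(2, 8.0 / (fs_audio / 2), btype="low")
    env = filtfilt(b, a, env)
    return env[::int(round(fs_audio / fs_out))]

def envelope_tracking_latency(eeg, envelope, max_lag_s=0.5):
    # Latency (s) at which the EEG best follows the envelope, taken as the
    # peak of the normalised cross-correlation at positive (causal) lags.
    eeg = (eeg - eeg.mean()) / eeg.std()
    envelope = (envelope - envelope.mean()) / envelope.std()
    n = min(len(eeg), len(envelope))
    xcorr = correlate(eeg[:n], envelope[:n], mode="full") / n
    lags = np.arange(-n + 1, n) / fs
    causal = (lags >= 0) & (lags <= max_lag_s)
    return lags[causal][np.argmax(xcorr[causal])]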

C. Braiman, E. A. Fridman, M. M. Conte, C. S. Reichenbach, T. Reichenbach, N. D. Schiff
Cortical Response to the Natural Speech Envelope Correlates with Neuroimaging Evidence of Cognition in Severe Brain Injury,
Curr. Biol. 28:1-7 (2018). [pdf]


How we can tune in to a voice in background noise

The investigators

To follow a particular conversation, listeners need to attend to the voice of the speaker they wish to hear. This process is called selective attention and has been studied extensively in the auditory cortex. However, because the cortex provides neural feedback to lower auditory areas, including the auditory brainstem and the inner ear, these structures may already participate actively in attending to a particular voice.

We have devised a mathematical method to measure the response of the auditory brainstem to the pitch of natural speech. In a controlled experiment on selective attention, we then showed that the brainstem responds more strongly to the pitch of the voice that a person attends to than to that of the ignored voice. Our findings demonstrate that the brainstem already contributes actively to selective attention. They also show that the pitch of a voice can be a powerful cue for focusing on that voice, which may inspire future speech-recognition technology.
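
A minimal sketch of relating EEG to the pitch of running speech is shown below. It is hedged throughout: band-pass filtering the speech around a typical fundamental-frequency range merely stands in for the fundamental-waveform extraction devised in the paper, and the function names, frequency band and lag range are assumptions for illustration.

# Illustrative Python sketch (assumptions, not the method of the paper):
# relate EEG to the pitch of running speech by band-pass filtering both
# signals around a typical fundamental-frequency range and cross-correlating
# them at the short latencies expected of the brainstem.
import numpy as np
from scipy.signal import butter, filtfilt, correlate

def bandpass(x, lo, hi, fs, order=2):
    # fs must exceed twice the upper band edge for the filter to be valid.
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def pitch_tracking(eeg, speech, fs, f0_band=(100.0, 300.0), max_lag_s=0.02):
    # Proxy for the fundamental waveform of the speech, and the EEG in the same band.
    fundamental = bandpass(speech, f0_band[0], f0_band[1], fs)
    response = bandpass(eeg, f0_band[0], f0_band[1], fs)
    fundamental = (fundamental - fundamental.mean()) / fundamental.std()
    response = (response - response.mean()) / response.std()
    n = min(len(response), len(fundamental))
    xcorr = correlate(response[:n], fundamental[:n], mode="full") / n
    lags = np.arange(-n + 1, n) / fs
    keep = (lags >= 0) & (lags <= max_lag_s)
    # Latency and strength of the strongest pitch-related response.
    return lags[keep][np.argmax(xcorr[keep])], float(np.max(xcorr[keep]))

Comparing the response strength obtained with such a measure for an attended versus an ignored voice is one simple way to quantify a brainstem contribution to selective attention.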

A. E. Forte, O. Etard and T. Reichenbach,
The human auditory brainstem response to running speech reveals a subcortical mechanism for selective attention,
eLife 6:e27203 (2017). [pdf] [bioRxiv]


Upcoming workshop on Speech and Hearing

I am excited to announce an upcoming workshop, Physics of Hearing: From Neurobiology to Information Theory and Back, at the Kavli Institute for Theoretical Physics (KITP), University of California, Santa Barbara (UCSB), USA. The workshop will run from May 30 to July 21, 2017. Coordinated by Hervé Bourlard, Maria Neimark Geffen, Jim Hudspeth, and myself, it will bring together researchers on the biophysics and neurobiology of hearing with those investigating the information theory of complex auditory signals. We expect that combining these two perspectives will foster novel and exciting collaborations between program participants and yield significant progress in the neurobiology of hearing and oral communication as well as in speech-recognition technology.