About the Lab

The Hamilton Lab is part of the NeuroComm Labs at UT Austin's Department of Communication Sciences and Disorders in the Moody College of Communication. Our research aims to determine how natural sounds, including speech, are represented in the human brain, and how these representations change during development. We study how the human brain processes speech sounds using intracranial electrocorticography (ECoG) recordings from patients with intractable epilepsy who are undergoing surgical treatment. We use a combination of electrophysiology, behavior, neuroimaging, and computational modeling to address these questions.

We would not be able to do this work without the generous help of clinicians and our patient volunteers, who participate in listening tasks during their hospital stay. You can read more about our research on the Research page.

Picture of brain with electrodes localized and labeled using our software.

October 31, 2017

Our new paper with colleagues from the University of California, San Francisco on localization, labeling, and warping of electrodes in electrocorticography (ECoG) is now out! The paper includes open-source Python software for electrode localization, brain plotting, and more.

Intracranial EEG waveforms showing seizure activity detected using the methods described in Baud et al., published in the journal Neurosurgery

October 10, 2017

A paper with our collaborators from the University of California, San Francisco Department of Neurological Surgery and Department of Neurology is now out in the journal Neurosurgery! Read how Maxime Baud and colleagues applied unsupervised machine learning techniques to detect seizure activity in intracranial EEG.

Brain image from Auditory Cortex meeting poster showing regions sensitive to onsets and sustained portions of speech sounds

September 14, 2017

Dr. Hamilton presents her research on the functional organization of human speech cortex using intracranial recordings at the 6th International Conference on Auditory Cortex in Banff, Canada.

Sound waveforms from English sentences used in our studies

August 24, 2017

Write-up from NPR on Tang, Hamilton, & Chang Science 2017. "Really? Really. How Our Brains Figure Out What Words Mean Based On How They're Said"

Picture of one electrode on the brain showing a response to a pitch contour in speech.

August 24, 2017

Write-up from Wired on Tang, Hamilton, & Chang Science 2017. "Scientists found the neurons that respond to uptalk"