Date of Completion

Embargo Period

Keywords

Major Advisor

Co-Major Advisor

Heather L. Read

Associate Advisor

Ian Stevenson

Associate Advisor

Sabato Santaniello

Field of Study

Biomedical Engineering

Degree

Doctor of Philosophy

Open Access

Open Access

Abstract

The mammalian brain is able to recognize natural sounds in the presence of acoustic uncertainties such as background noise. A prevailing theory of neural coding suggests that neural systems are optimized for biologically relevant signals from the natural environment. The optimal coding hypothesis thus suggests that neural populations should encode sensory information so as to make maximally efficient use of environmental inputs. In the first part of my thesis, I will explore the origins of scale invariance, a phenomenon that has been previously described for natural sounds and has been observed in a variety of natural sensory signals, including natural scenes. In the second part, I will explore the ability of the brain to utilize high-level statistical regularities in natural sounds to perform sound identification tasks. Using a catalog of natural sounds, texture synthesis procedures to manipulate sound statistics from various sound categories, and neural recordings from the auditory midbrain of awake rabbits, I will show that neural population response statistics can be used to identify discrete sound categories. In the last part of the thesis, I will explore the role of hierarchical organization in the auditory pathway for sound recognition and optimal coding in the presence of challenging background noise. Using neural responses from the auditory nerve, midbrain, and auditory cortex, I developed an optimal computational neural network model for word recognition in the presence of speech babble noise. I demonstrate that the optimal computational strategy for word recognition in noise predicts various transformations performed by the ascending auditory pathway, including a sequential loss of temporal and spectral resolution and increasing sparseness and selectivity.