What a ferret sees could be a window into understanding how the human brain enables vision, or so Adwiteeya Misra ’14 learned as a Mazilu Engineering Research Fellow. Since the areas of a ferret’s brain that process early visual signals—including the retina, visual thalamus, and primary visual cortex—are structurally and functionally similar to those of humans, her advisors, Assistant Professor of Engineering Solomon Diamond ’97 Th’98 and Assistant Professor of Physiology at the Geisel School of Medicine Farran Briggs, chose ferrets to help them discern how external factors affect the physiology of neurons in early visual brain areas.

Until now, experiments have gauged the brain’s responses to visual stimuli only by monitoring anesthetized animals as they passively view those stimuli. The team, which also includes Geisel PhD student Michael Hasse, is among the first to contrast the brain’s response in an animal freely roaming a natural environment with its response when the same animal is anesthetized and passively viewing natural scenes. They will record what the ferret sees while it explores the naturalistic environment, then replay that 3D digital movie to the same animal under anesthesia—all while recording its neural signals for later analysis.

“Using a combination of computer-aided design, 3D printing, and animation programs, we are developing a physical environment identical to the virtual environment to create a movie of the animal's visual scenes using the recorded movements of the animal,” says Misra, who received the fellowship from Jaime Mazilu '05 Th'06 via Saguaro Technology, Inc., founded and run by her mother, Doina Mazilu, to fund student research and encourage academic engineering careers in the U.S. “This movie will eventually be played to the anesthetized animal.”

Since neurons communicate through electrical signals, the team is implanting a wireless electrode array in the ferret’s primary visual cortex. The array will track the ferret’s neural activity while it roams free in the specially designed environment, which displays images of luminescent trees, branches, and leaves that Misra, Hasse, and Diamond carefully compiled from a widely studied database of visual scenes.

“There is also an external camera system that will track a ferret’s movement with precise measurements of direction as well as orientation,” says Misra, adding that Briggs’ team will oversee the visual-processing component of the project. “The movement data will be extracted to determine the exact position of the ferret's head, which provides knowledge of its eye position.”

The team hopes to walk away with new understanding of how neurons in an individual’s primary visual system respond to visual scenes not just in the research lab but in the real world.

“For now, we are in the process of assembling the entire environment, and are excited to run the experiments soon,” says Misra. “Although we cannot accurately predict the neural signals we will receive, we look forward to the challenge of analyzing the signal patterns we observe.”