LA JOLLA, CA – The bird enthusiast who chronicled the adventures of a flock of red-headed conures in his book "The Wild Parrots of Telegraph Hill" knows most of the parrots by name, yet most of us would be hard-pressed to tell one bird from another. While it has long been known that we can become acutely attuned to our day-to-day environment, the underlying neural mechanism has been less clear.
Now, a collaboration between researchers at the Salk Institute for Biological Studies and Weill Cornell Medical College has revealed that brain cells processing visual information adjust their filtering properties to make the most sense of incoming information.
"We are best at discriminating the facial features that are typical of our neighbors, and if they happen to be parrots, we become very good at recognizing individual birds," explains Tatyana Sharpee, Ph.D., an assistant professor in the Laboratory for Computational Biology and the lead author on the current study, which has been published in the August 5 online edition of the Journal of Computational Neuroscience.
Neurobiologists are on a perennial quest to understand how the brain codes and processes information. In the past, they had to rely on simplified objects on a computer screen or random stimuli to garner information on how the brain's visual circuitry works. "Ultimately we are interested in what happens in a natural environment," explains Sharpee, "but some questions require more control over the properties of visual stimuli than a picture of a natural scene would allow."
Neurons in the primary visual cortex respond only when a stimulus appears within a window covering a small part of the visual field. This window is known as the neuron's "receptive field." Whenever a stimulus enters the neuron's receptive field, the cell produces a volley of electrical spikes, known as "action potentials," that can be recorded.
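To make the idea of a receptive field concrete, here is a minimal sketch of a standard linear-nonlinear-Poisson neuron model, a textbook abstraction and not the analysis used in the study itself. The patch size, the oriented filter, the gain value, and the function name spike_count are all illustrative assumptions: the stimulus falling inside the receptive field is filtered, rectified, and converted into a random number of action potentials.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) setup: a 10x10-pixel patch of the visual
# field stands in for the neuron's small receptive-field window.
patch_size = 10
x = np.linspace(-1, 1, patch_size)
xx, yy = np.meshgrid(x, x)
# A simple oriented filter: this model neuron "prefers" vertical edges.
receptive_field = np.exp(-(xx**2 + yy**2) / 0.5) * np.cos(6 * xx)

def spike_count(stimulus_patch, gain=5.0):
    """Linear-nonlinear-Poisson sketch of a visual cortex neuron."""
    drive = np.sum(receptive_field * stimulus_patch)   # linear filtering
    rate = gain * max(drive, 0.0)                      # rectifying nonlinearity
    return rng.poisson(rate)                           # spikes (action potentials)

# A stimulus matching the preferred pattern evokes more spikes than noise.
preferred = receptive_field / np.abs(receptive_field).max()
noise = rng.standard_normal((patch_size, patch_size))
print("spikes to preferred stimulus:", spike_count(preferred))
print("spikes to random noise:     ", spike_count(noise))
```

In this toy picture, "adjusting filtering properties" would correspond to reshaping the receptive_field filter so that it best discriminates the stimuli the neuron actually encounters.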
But these ne
Contact: Gina Kirchweger