You might think you have an honest relationship with your brain, but studies from UA researchers indicate otherwise.
Laura Cacciamani, a postdoctoral fellow at the Smith-Kettlewell Eye Research Institute in San Francisco who recently completed her doctorate in psychology at the UA, is lead author of a study showing how the brain assesses the meaning of objects you don’t consciously perceive. The behavioral study was published online in the journal Attention, Perception, & Psychophysics earlier this year.
“The idea is that while you’re walking around in the world, there are things that you’re seeing that might seem meaningless while you do another task,” Cacciamani said, “but your brain is still processing what’s going on in those unseen displays.”
In the experiment, participants indicated whether a noun shown on a computer screen named a natural or an artificial object; a leaf, for example, would be classified as natural, while an axe would be artificial. Before each noun appeared, a black silhouette was flashed for 50 milliseconds. The ground region, the white portion of the image, suggested a familiar natural or artificial object that participants were unaware of.
Cacciamani found that, although subjects were not consciously aware of the natural or artificial object in the ground region, they categorized the noun 50 to 100 milliseconds faster when the two objects belonged to the same category.
“Your brain picked up on the groundside even though you weren’t attending to it,” Cacciamani said. “The brain picks up on that naturalness and then, when another word that is natural comes up, subjects were faster to respond to that word versus when the word was of another category. The brain was primed.”
Jay Sanguinetti, a recent doctoral graduate in psychology at the UA, pursued the same questions about visual perception using electroencephalography, focusing on a brain response known as the N400 potential.
Sanguinetti explained that the N400 wave is a neural signature indicating the brain has processed something familiar.
“The N400 peaks 400 milliseconds after the stimulus,” Sanguinetti said. “This brain potential is coming from parts of the brain that deal with memory. There [have] been 30 years of studies showing this. This brain potential is reflecting the part of the brain that is trying to integrate meaning into the context. Your brain is doing this all day, every day.”
Sanguinetti’s study was set up slightly differently from Cacciamani’s. Participants wore EEG caps whose electrodes recorded the brain’s electrical activity while black-and-white silhouettes flashed on a computer screen. Subjects were asked to decide whether the object, the black portion of the image, was novel or familiar. However, the groundside also contained either familiar or novel objects.
Sanguinetti explained that the N400 potential appeared even when participants judged the object to be novel, because their brains had processed the familiar object in the groundside. When both the figure and the ground were unfamiliar, the N400 wave was very small, essentially absent.
“If there is mere repetition of that familiarity regardless of conscious awareness, then we should get the reduction, and we did,” said Mary Peterson, a professor of psychology and cognitive science who advised both Cacciamani and Sanguinetti. Peterson is optimistic about how these converging studies will deepen our understanding of visual perception.
“The brain is a dynamic, interactive machine,” Peterson said. “We can use this understanding to help us understand disease and what’s going wrong in various sorts of things like developmental disorders or late-onset blindness. If we understand how the brain works to give us visual perception, then eventually we could imagine coming up with much better prostheses than we have right now.”
_______________
Follow Kimberlie Wang on Twitter.