How do the eyes and brain work together when it comes to categorizing and memory? For example, you go to the supermarket and you see some carrots. Do you think of the carrots as an ingredient for a salad or as part of a snack tray? Or imagine seeing two books on different shelves in a crowded bookcase that wasn’t arranged in any particular order. How do you remember where those books were if you came back to them after a few moments? Two separate research projects, one at Columbia University and another at Ohio State University, studied these aspects of vision and memory, and here is what they discovered.
Getting back to the example of finding carrots in the supermarket, thinking of carrots as an ingredient for salad or as part of a snack tray depends on a few things, such as the time of year or your plans when you get home. If you are planning a quiet evening at home, you might want to have a salad for dinner, and the salad would have some carrots. If you are planning a watch party for Super Bowl Sunday, then carrots would be part of a snack tray.
Categorizing carrots as either an ingredient for salad or as part of a snack tray is the responsibility of the prefrontal cortex, the brain region in charge of reasoning and other high-level functions. In the traditional view, the eyes and visual regions of the brain act like a security camera, collecting data and processing it in a standardized fashion before passing it along for analysis. However, research conducted at Columbia University shows that the brain’s visual regions play an active role in interpreting information, and that how they interpret it depends on what the rest of the brain is working on at the moment.
Scientists used functional magnetic resonance imaging (fMRI) to observe people’s brain activity while they categorized shapes. Over the course of the activity, the “rules” for categorizing the shapes kept changing. This helped the researchers to ascertain whether the visual cortex was changing how it represented the shapes depending on how the categories were defined.
They then analyzed the data to examine the patterns of brain activation in response to the different shape images and measured how the brain represents shapes in different categories. What they saw was that the brain responds differently depending on which category participants were sorting the shapes into.
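The idea of reading a category out of an activation pattern can be illustrated with a toy decoding exercise. The sketch below is not the study's actual analysis pipeline; it uses entirely synthetic "voxel" data and a simple nearest-centroid classifier, just to show how distinct activation patterns make category decoding possible.

```python
import numpy as np

# Synthetic illustration of pattern decoding. All numbers here are
# made-up assumptions, not data from the Columbia study.
rng = np.random.default_rng(0)
n_voxels = 50

# A hypothetical mean activation pattern for each category rule.
pattern_a = rng.normal(0.0, 1.0, n_voxels)
pattern_b = rng.normal(0.0, 1.0, n_voxels)

def simulate_trial(pattern, noise=1.0):
    """One noisy measurement of a pattern (like a single trial)."""
    return pattern + rng.normal(0.0, noise, n_voxels)

def nearest_centroid(trial, centroids):
    """Classify a trial by whichever category centroid it is closest to."""
    dists = [np.linalg.norm(trial - c) for c in centroids]
    return int(np.argmin(dists))

# Decode 200 simulated trials, half from each category.
trials = [(simulate_trial(pattern_a), 0) for _ in range(100)]
trials += [(simulate_trial(pattern_b), 1) for _ in range(100)]
correct = sum(nearest_centroid(t, [pattern_a, pattern_b]) == label
              for t, label in trials)
accuracy = correct / len(trials)
# When the two patterns are distinct relative to the noise, decoding
# accuracy rises well above the 0.5 chance level.
```

In the same spirit, the more distinctive two activation patterns are, the more reliably a classifier (or the brain itself) can tell the categories apart.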
Researchers found that activity in the primary and secondary visual cortices, which handle information directly from the eyes, changed with every task. The visual system reorganized its activity depending on the rules people were using. This was shown on the fMRI by the brain activation patterns becoming more distinctive when a shape was near the gray area between categories. These shapes were the most difficult to tell apart, and that is where extra processing proved useful.
In fact, scientists could see clearer neural patterns in the fMRI data when subjects performed better on the tasks. This suggests that the visual cortex may help to solve flexible categorization tasks. Because flexible categorization, such as deciding whether carrots belong in a salad or on a snack tray, is a hallmark of human cognition, the results of this study could contribute to the development of AI systems that better adapt to changing conditions.
What if the task at hand involves remembering things rather than just picking up carrots at the supermarket? For example, suppose you have to find two books on different shelves of a crowded bookcase after setting them aside for a moment. Part two deals with this aspect of vision and memory.
Source:
https://www.engineering.columbia.edu/about/news/how-thoughts-influence-what-eyes-see