Washington: Neuroscientists have identified two key regions of the brain that merge fleeting views of our surroundings into a seamless, 360-degree panorama.
“Our understanding of our environment is largely shaped by our memory for what’s currently out of sight,” said Caroline Robertson, a postdoc at the Massachusetts Institute of Technology (MIT) in the US. “What we were looking for are hubs in the brain where your memories for the panoramic environment are integrated with your current field of view,” she added.
As we look at a scene, visual information flows from our retinas into the brain, which has regions that are responsible for processing different elements of what we see, such as faces or objects. The team suspected that areas involved in processing scenes – the occipital place area (OPA), the retrosplenial complex (RSC), and parahippocampal place area (PPA) – might also be involved in generating panoramic memories of a place such as a street corner.
If this were true, then two images of houses that you knew stood across the street from each other would evoke similar patterns of activity in these specialised brain regions, while two houses from different streets would not. “Our hypothesis was that as we begin to build memory of the environment around us, there would be certain regions of the brain where the representation of a single image would start to overlap with representations of other views from the same scene,” Robertson said.
The researchers explored this hypothesis using immersive virtual reality headsets, which allowed them to show people many different panoramic scenes.
In this study, the researchers showed participants images from 40 street corners in Boston’s Beacon Hill neighborhood. The images were presented in two ways: half the time, participants saw a 100-degree stretch of a 360-degree scene; the other half of the time, they saw two noncontinuous stretches of a 360-degree scene.
After showing participants these panoramic environments, the researchers presented them with 40 pairs of images and asked whether each pair came from the same street corner. Participants were much better at judging that a pair came from the same corner when they had seen the two views linked in a single 100-degree image than when they had seen them unlinked.
Brain scans showed that when participants saw two images that they knew were linked, the response patterns in the RSC and OPA regions were similar. However, this was not the case for image pairs that the participants had not seen as linked. This suggests that the RSC and OPA, but not the PPA, are involved in building panoramic memories of our surroundings, the researchers said. The study appears in the journal Current Biology.