Researchers propose a new computational framework that uses artificial intelligence to unravel the relationship between perception and memory in the human brain

The medial temporal lobe (MTL) supports a constellation of memory-related behaviors. Its involvement in perceptual processing, however, has been subject to enduring debate. This debate centers on the perirhinal cortex (PRC), an MTL structure at the apex of the ventral visual stream (VVS).

The researchers took advantage of a deep learning framework that approximates visual behaviors supported by the VVS alone (i.e., lacking the PRC).
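As a rough illustration of this kind of approach (not the authors' exact pipeline), a pretrained convolutional network can stand in for the VVS, with a simple linear classifier fit on its high-level features to estimate performance on a visual discrimination task. The model choice, stimulus files, and labels below are placeholders.

```python
# Hedged sketch: a pretrained CNN as a VVS proxy, with a linear classifier
# on its penultimate-layer features. Backbone, images, and labels are
# illustrative assumptions, not the study's actual pipeline.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from sklearn.linear_model import LogisticRegression

# Pretrained ImageNet backbone serving as a stand-in for the ventral stream.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # expose penultimate-layer features
backbone.eval()

preprocess = T.Compose([
    T.Resize(224), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def vvs_features(image_paths):
    """Extract model features for a list of stimulus images."""
    feats = []
    with torch.no_grad():
        for path in image_paths:
            img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
            feats.append(backbone(img).squeeze(0).numpy())
    return feats

# Hypothetical stimulus files and binary discrimination labels.
train_paths, train_labels = ["stim_a.png", "stim_b.png"], [0, 1]
clf = LogisticRegression(max_iter=1000).fit(vvs_features(train_paths), train_labels)
# Scoring clf on held-out stimuli approximates "VVS-supported" performance.
```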

They first applied this approach retrospectively, modeling 30 published visual discrimination experiments: after excluding non-diagnostic stimulus sets, there is a striking correspondence between VVS-modeled and PRC-lesioned behavior, while both are outperformed by PRC-intact participants.

They then extended these results with a novel experiment, directly comparing PRC-intact human performance to electrophysiological recordings from the macaque VVS: PRC-intact participants outperform a linear readout of high-level visual cortex.
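A "linear readout" here means fitting a linear classifier directly to neural population responses and scoring its choices on discrimination trials like those given to participants. The sketch below illustrates the idea; the data shapes, random placeholder responses, and cross-validation scheme are assumptions, not the study's recordings or analysis code.

```python
# Hedged sketch of a linear readout from high-level visual cortex:
# a cross-validated linear classifier on neural population responses.
# Population data here are random placeholders for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_units = 200, 150                        # trials x recorded sites (placeholder sizes)
responses = rng.normal(size=(n_trials, n_units))    # firing-rate features per trial
choices = rng.integers(0, 2, size=n_trials)         # item identity on each trial

readout = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(readout, responses, choices, cv=5).mean()
print(f"Linear-readout accuracy: {accuracy:.2f}")   # compared against human performance
```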

The full report on this research is available on the Neuroscience News website.

The original research paper can be accessed via DOI: 10.1016/j.neuron.2021.06.018