Paper Detail

Paper: PS-2A.17
Session: Poster Session 2A
Location: Symphony/Overture
Session Time: Friday, September 7, 17:15 - 19:15
Presentation Time: Friday, September 7, 17:15 - 19:15
Presentation: Poster
Publication: 2018 Conference on Cognitive Computational Neuroscience, 5-8 September 2018, Philadelphia, Pennsylvania
Paper Title: High-Level Features Organize Perceived Action Similarities
DOI: https://doi.org/10.32470/CCN.2018.1120-0
Authors: Leyla Tarhan and Talia Konkle (Harvard University, United States)
Abstract: Other people’s actions fill our visual worlds – we watch others run, dance, cook, and laugh on a daily basis. Together, these make up a repertoire of visual actions that we can recognize and reason about. How is this repertoire organized in our minds, so that some actions appear more similar than others? To answer this, we measured the perceived similarity among a large set of everyday actions. We then used a modeling framework to explore which kinds of features predict that similarity. We found that the mental action similarity space is organized primarily by relatively high-level features relating to semantic category and body part involvement. Further, neural similarity within regions that tile the visuo-motor cortex does not predict these judgments well, suggesting that they do not directly support this higher-level space. These results echo recent findings that human similarity judgments in the object and scene domains are best predicted by high-level feature spaces not grounded in the ventral visual stream (Groen et al., 2017; Jozwik et al., 2017), a pattern that has now been observed across three domains of vision and may reflect a broader principle of the perceptual system.
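To make the abstract's modeling framework concrete, below is a minimal sketch of the general approach of testing how well a candidate feature space predicts pairwise similarity judgments (a standard representational-similarity-style comparison). This is not the authors' actual pipeline; the array sizes and the randomly generated data are placeholders, and "correlation distance" and Spearman correlation are assumed choices for illustration only.

```python
# Minimal sketch (not the authors' pipeline): compare a candidate feature
# space against behavioral similarity judgments. All data are random
# placeholders standing in for real action videos and judgments.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_actions = 60    # hypothetical number of everyday actions
n_features = 20   # hypothetical dimensionality of one feature space

# Placeholder behavioral data: pairwise dissimilarity judgments,
# stored in condensed (upper-triangle) form.
behavioral_rdm = rng.random(n_actions * (n_actions - 1) // 2)

# Placeholder candidate feature space (e.g., semantic-category or
# body-part-involvement features, as named in the abstract).
features = rng.random((n_actions, n_features))

# Model RDM: pairwise distances between actions in the feature space.
model_rdm = pdist(features, metric="correlation")

# Rank-correlate the model RDM with the behavioral RDM; a higher rho
# means the feature space better predicts perceived action similarity.
rho, p = spearmanr(model_rdm, behavioral_rdm)
print(f"Spearman rho = {rho:.3f} (p = {p:.3g})")
```

Under this framing, each candidate feature space (semantic category, body parts, or neural similarity from a visuo-motor region) yields its own model RDM, and the spaces can be compared by how strongly each one correlates with the behavioral judgments.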