Technical Program

Paper Detail

Paper: GS-3.1
Session: Contributed Talks III
Location: Ormandy
Session Time: Thursday, September 6, 13:50 - 14:30
Presentation Time: Thursday, September 6, 13:50 - 14:10
Presentation: Oral
Publication: 2018 Conference on Cognitive Computational Neuroscience, 5-8 September 2018, Philadelphia, Pennsylvania
Paper Title: Understanding Action Prediction with Machine Learning and Psychophysics
DOI: https://doi.org/10.32470/CCN.2018.1174-0
Authors: Emalie McMahon, National Institute of Mental Health, United States; Ray Gonzalez, Ken Nakayama, Harvard University, United States; Leslie G. Ungerleider, Maryam Vaziri-Pashkam, National Institute of Mental Health, United States
Abstract: Predicting the actions of others is relevant in many social situations, from extending a handshake to dancing an elaborate waltz. To study the preparatory information in movements and how people interpret these preparatory cues, we designed a partnered reaching task. In the competitive condition, one partner (the Blocker) had to beat the other (the Attacker) to the target (see Vaziri-Pashkam et al., 2017); in the cooperative condition, both participants were asked to tap the same target at the same time. In a psychophysical paradigm, a separate group of subjects viewed short clips of the Attacker’s movements and were asked to predict whether the Attacker was going to point to the left or the right target. Subjects were able to predict the direction of movement with between 80% and 90% accuracy before finger liftoff. A follow-up searchlight analysis revealed that all body parts contained informative predictive cues, with the head showing predictive information earlier in the movement in both conditions, but especially in the cooperative condition. These results reveal that subjects can use preparatory cues in the movements of others to predict action goals before the start of the movement, and that these cues are exaggerated in the cooperative context to communicate the goal of actions.
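
To give a rough idea of the kind of time-resolved, body-part-wise decoding the abstract describes, the sketch below trains a classifier to predict reach direction from the position of each body part at each timepoint. It is only an illustration: the data are synthetic, and the body-part list, features, classifier, and accuracy threshold are assumptions, not the authors' actual searchlight pipeline.

```python
# Illustrative sketch only: time-resolved, per-body-part decoding of reach
# direction, loosely analogous to the searchlight analysis in the abstract.
# All data are synthetic; names and parameters are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_timepoints = 200, 50
body_parts = ["head", "shoulder", "elbow", "wrist", "finger"]

# Synthetic (x, y) positions per body part and timepoint, with a
# direction-dependent drift that grows over time to mimic preparatory cues.
labels = rng.integers(0, 2, size=n_trials)           # 0 = left target, 1 = right target
data = {}
for part in body_parts:
    drift = (labels[:, None] * 2 - 1) * np.linspace(0.0, 0.5, n_timepoints)
    noise = rng.normal(scale=1.0, size=(n_trials, n_timepoints, 2))
    data[part] = noise + drift[..., None]             # shape: (trials, time, xy)

# Cross-validated decoding accuracy of reach direction from each body part's
# position alone, separately at every timepoint.
clf = LogisticRegression(max_iter=1000)
accuracy = np.zeros((len(body_parts), n_timepoints))
for p, part in enumerate(body_parts):
    for t in range(n_timepoints):
        X = data[part][:, t, :]                       # (trials, 2) features
        accuracy[p, t] = cross_val_score(clf, X, labels, cv=5).mean()

# Report the first timepoint at which each body part becomes informative
# (threshold of 75% is arbitrary, for illustration only).
for p, part in enumerate(body_parts):
    above = accuracy[p] > 0.75
    first = int(np.argmax(above)) if above.any() else None
    print(f"{part:10s} first timepoint above 75% accuracy: {first}")
```

Comparing the resulting accuracy-versus-time curves across body parts is one simple way to ask which parts carry predictive information, and how early, in the spirit of the analysis summarized above.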