Paper Detail

Paper: PS-1A.10
Session: Poster Session 1A
Location: Symphony/Overture
Session Time: Thursday, September 6, 16:30 - 18:30
Presentation Time: Thursday, September 6, 16:30 - 18:30
Presentation: Poster
Publication: 2018 Conference on Cognitive Computational Neuroscience, 5-8 September 2018, Philadelphia, Pennsylvania
Paper Title: Approximate inference explains paradoxical data in two-event causal inference task
DOI: https://doi.org/10.32470/CCN.2018.1260-0
Authors: Sabyasachi Shivkumar, Madeline Cappelloni, Ross Maddox, Ralf M. Haefner, University of Rochester, United States
Abstract: The brain combines noisy and incomplete signals from multiple sources according to their reliability to infer the state of the outside world. The brain's implementation of this process of "probabilistic inference" is necessarily approximate. Here, we present theoretical insights and experimental results from a new causal integration task involving two auditory and two visual cues. While the auditory cues contain information about the correct choice, the visual cues do not. Despite the fact that the performance of an ideal observer does not depend on the location of the visual cues, human subjects' performance does. We show that this dependence can be explained by a model based on approximate inference (in our case, sampling). Furthermore, we are able to quantify the "accuracy" of a subject's approximation using psychophysical data, something that is hard to do in simpler tasks in which sensor noise and inference noise affect behavior similarly. More generally, our task and model allow us to dissociate the three principal sources of suboptimality in perceptual decision-making tasks: sensor noise (e.g. in photoreceptors), model mismatch (mistaken assumptions about the structure of the world), and approximate inference. Depending on this partitioning, our model makes subject-specific predictions for how behavioral performance should scale with stimulus duration.
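
For context, the reliability-weighted cue combination and causal inference referred to in the abstract are commonly formalized as follows. This is a generic textbook sketch, not notation taken from the manuscript; the symbols (noisy measurements x_1, x_2 with variances \sigma_1^2, \sigma_2^2, and a common-cause variable C) are assumptions introduced here for illustration.

\[ \hat{s} \;=\; w_1 x_1 + w_2 x_2, \qquad w_i \;=\; \frac{1/\sigma_i^2}{1/\sigma_1^2 + 1/\sigma_2^2}, \]

i.e. each cue is weighted by its reliability (inverse variance). The ideal observer's causal inference over whether two cues share a common cause (C = 1) or not (C = 0) is given by the posterior

\[ p(C{=}1 \mid x_1, x_2) \;=\; \frac{p(x_1, x_2 \mid C{=}1)\, p(C{=}1)}{\sum_{c \in \{0,1\}} p(x_1, x_2 \mid C{=}c)\, p(C{=}c)}. \]

A sampling-based approximate observer, of the kind the abstract mentions, would evaluate such posteriors from a finite number of samples rather than exactly, introducing inference noise that is distinct from the sensor noise \sigma_i^2; this distinction is what allows the sources of suboptimality described above to be dissociated.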