Paper Detail

Paper: PS-2B.26
Session: Poster Session 2B
Location: Symphony/Overture
Session Time: Friday, September 7, 19:30 - 21:30
Presentation Time: Friday, September 7, 19:30 - 21:30
Presentation: Poster
Publication: 2018 Conference on Cognitive Computational Neuroscience, 5-8 September 2018, Philadelphia, Pennsylvania
Paper Title: Modelling Human Visual Uncertainty using Bayesian Deep Neural Networks
DOI: https://doi.org/10.32470/CCN.2018.1040-0
Authors: Patrick McClure, University of Cambridge, United Kingdom; Tim Kietzmann, Johannes Mehrer, University of Cambridge, United Kingdom; Nikolaus Kriegeskorte, Columbia University, United States
Abstract: Dealing with sensory uncertainty is necessary for humans to operate in the world. Often, multiple interpretations of an event are possible given the sensory evidence, even if one interpretation is most likely. The exact neurobiological mechanism used for representing uncertainty is unknown, but there is increasing evidence that the human brain could use stochasticity to encode uncertainty. However, the convolutional neural networks (CNNs) currently used to model human vision implement deterministic mappings from input to output. We seek to use stochasticity to improve CNNs as both computer vision models and models of human visual perception. We used Gaussian unit noise and sampling to approximate Bayesian CNNs on ecoset, a large-scale object recognition dataset. We found that sampling during both training and testing improved a CNN's accuracy and ability to represent its own uncertainty for large-scale object recognition. We also found that sampling during both training and testing improved the ability of linear classifiers trained on internal CNN representations to predict human confidence scores for image classification. These results add to the evidence that Bayesian models predict key aspects of human object categorisation behaviour and that sampling in biological neural networks could be a means of representing uncertainty for visual perception in the human brain.
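
The following PyTorch sketch is a rough illustration of the general idea described in the abstract, not the authors' code: multiplicative Gaussian unit noise is injected into a CNN's activations during both training and testing, and several stochastic forward passes are averaged (Monte Carlo sampling) so that the spread across samples can serve as an uncertainty estimate. The layer sizes, noise scale, and number of samples are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GaussianNoise(nn.Module):
    """Multiplies activations by N(1, sigma^2) noise at train AND test time."""

    def __init__(self, sigma=0.5):
        super().__init__()
        self.sigma = sigma

    def forward(self, x):
        noise = 1.0 + self.sigma * torch.randn_like(x)
        return x * noise


class NoisyCNN(nn.Module):
    """Small CNN with Gaussian noise after each hidden layer (illustrative)."""

    def __init__(self, num_classes=10, sigma=0.5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), GaussianNoise(sigma),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), GaussianNoise(sigma),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        h = self.features(x).flatten(1)
        return self.classifier(h)


def mc_predict(model, x, n_samples=20):
    """Average softmax outputs over stochastic forward passes.

    The per-class standard deviation across samples gives a simple
    measure of the network's uncertainty about its prediction.
    """
    probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    return probs.mean(0), probs.std(0)
```

Because the noise layers stay active at test time, repeated calls to mc_predict on the same image yield a distribution of outputs rather than a single deterministic mapping, which is the property the abstract contrasts with standard CNNs.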