Inconsistent illusory motion in predictive coding deep neural networks

dc.contributor.author KIRUBESWARAN, O. R. en_US
dc.contributor.author Storrs, Katherine R. en_US
dc.date.accessioned 2023-06-26T03:56:03Z
dc.date.available 2023-06-26T03:56:03Z
dc.date.issued 2023-05 en_US
dc.identifier.citation Vision Research, 206, 108195. en_US
dc.identifier.issn 0042-6989 en_US
dc.identifier.issn 1878-5646 en_US
dc.identifier.uri https://doi.org/10.1016/j.visres.2023.108195 en_US
dc.identifier.uri http://dr.iiserpune.ac.in:8080/xmlui/handle/123456789/8036
dc.description.abstract Why do we perceive illusory motion in some static images? Several accounts point to eye movements, response latencies to different image elements, or interactions between image patterns and motion energy detectors. Recently, PredNet, a recurrent deep neural network (DNN) based on predictive coding principles, was reported to reproduce the “Rotating Snakes” illusion, suggesting a role for predictive coding. We begin by replicating this finding, then use a series of “in silico” psychophysics and electrophysiology experiments to examine whether PredNet behaves consistently with human observers and non-human primate neural data. A pretrained PredNet predicted illusory motion for all subcomponents of the Rotating Snakes pattern, consistent with human observers. However, unlike evidence from electrophysiological data, we found no simple response delays in its internal units. PredNet’s detection of motion in gradients appeared to depend on contrast, whereas in humans it depends predominantly on luminance. Finally, we examined the robustness of the illusion across ten PredNets of identical architecture, retrained on the same video data. Network instances varied widely in whether they reproduced the Rotating Snakes illusion, and in what motion, if any, they predicted for simplified variants. Unlike human observers, no network predicted motion for greyscale variants of the Rotating Snakes pattern. Our results sound a cautionary note: even when a DNN successfully reproduces some idiosyncrasy of human vision, more detailed investigation can reveal inconsistencies between humans and the network, and between different instances of the same network. These inconsistencies suggest that predictive coding does not reliably give rise to human-like illusory motion. en_US
dc.language.iso en en_US
dc.publisher Elsevier B.V. en_US
dc.subject Visual illusions en_US
dc.subject Motion perception en_US
dc.subject Deep neural networks en_US
dc.subject Predictive coding en_US
dc.subject Peripheral drift illusion en_US
dc.subject 2023-JUN-WEEK1 en_US
dc.subject TOC-JUN-2023 en_US
dc.subject 2023 en_US
dc.title Inconsistent illusory motion in predictive coding deep neural networks en_US
dc.type Article en_US
dc.contributor.department Dept. of Biology en_US
dc.identifier.sourcetitle Vision Research en_US
dc.publication.originofpublisher Foreign en_US

