Modelling the dynamics of motion integration with a new luminance-gated diffusion mechanism

The dynamics of motion integration show striking similarities when observed at neuronal, psychophysical, and oculomotor levels. Based on the inter-relation and complementary insights given by those dynamics, our goal was to test how basic mechanisms of dynamical cortical processing can be incorporated in a dynamical model to solve several aspects of 2D motion integration and segmentation. Our model is inspired by the hierarchical processing stages of the primate visual cortex: we describe the interactions between several layers processing local motion and form information through feedforward, feedback, and inhibitive lateral connections. Also, following perceptual studies concerning contour integration and physiological studies of receptive fields, we postulate that motion estimation takes advantage of another low-level cue, which is luminance smoothness along edges or surfaces, in order to gate recurrent motion diffusion. With such a model, we successfully reproduced the temporal dynamics of motion integration on a wide range of simple motion stimuli: line segments, rotating ellipses, plaids, and barber poles. Furthermore, we showed that the proposed computational rule of luminance-gated diffusion of motion information is sufficient to explain a large set of contextual modulations of motion integration and segmentation in more elaborated stimuli such as chopstick illusions, simulated aperture problems, or rotating diamonds. As a whole, in this paper we proposed a new basal luminance-driven motion integration mechanism as an alternative to less parsimonious models, we carefully investigated the dynamics of motion integration, and we established a distinction between simple and complex stimuli according to the kind of information required to solve their ambiguities.
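The luminance-gated diffusion rule described in the abstract can be illustrated with a small numerical sketch. The code below is not the authors' model; it only shows the general idea of recurrently diffusing a motion field while weighting exchanges between neighbouring positions by luminance similarity, so that motion information spreads along smooth surfaces and contours but not across luminance edges. All names, the Gaussian form of the gate, and the parameter values (sigma, rate, n_iter) are illustrative assumptions.

```python
import numpy as np

def luminance_gates(I, sigma=0.1):
    """Gating weights between each pixel and its four neighbours.

    Weights are near 1 where luminance is smooth and near 0 across
    luminance edges, so diffusion follows surfaces and contours
    rather than crossing object boundaries. (Assumed Gaussian gate.)
    """
    shifts = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}
    gates = {}
    for name, shift in shifts.items():
        dI = np.roll(I, shift, axis=(0, 1)) - I        # luminance difference to neighbour
        gates[name] = np.exp(-(dI ** 2) / (2.0 * sigma ** 2))
    return gates

def diffuse_motion(v, I, n_iter=50, rate=0.2, sigma=0.1):
    """Recurrently diffuse a 2-channel motion field v (H x W x 2),
    gated by the luminance image I (H x W). Illustrative sketch only."""
    gates = luminance_gates(I, sigma)
    shifts = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}
    for _ in range(n_iter):
        update = np.zeros_like(v)
        norm = np.zeros(I.shape)
        for name, shift in shifts.items():
            g = gates[name]
            # exchange motion information with the neighbour, weighted by the gate
            update += g[..., None] * (np.roll(v, shift, axis=(0, 1)) - v)
            norm += g
        v = v + rate * update / (norm[..., None] + 1e-8)
    return v
```

Under these assumptions, diffusing a sparse motion estimate over a barber-pole image, for instance, would propagate the unambiguous motion measured at line endings along the luminance-defined pole while being blocked at its edges, in the spirit of the gating rule proposed in the paper.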

Bibliographic Details
Published in: Vision research (Oxford) 2010-08, Vol.50 (17), p.1676-1692
Main authors: Tlapale, Émilien, Masson, Guillaume S., Kornprobst, Pierre
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page 1692
container_issue 17
container_start_page 1676
container_title Vision research (Oxford)
container_volume 50
creator Tlapale, Émilien; Masson, Guillaume S.; Kornprobst, Pierre
description The dynamics of motion integration show striking similarities when observed at neuronal, psychophysical, and oculomotor levels. Based on the inter-relation and complementary insights given by those dynamics, our goal was to test how basic mechanisms of dynamical cortical processing can be incorporated in a dynamical model to solve several aspects of 2D motion integration and segmentation. Our model is inspired by the hierarchical processing stages of the primate visual cortex: we describe the interactions between several layers processing local motion and form information through feedforward, feedback, and inhibitive lateral connections. Also, following perceptual studies concerning contour integration and physiological studies of receptive fields, we postulate that motion estimation takes advantage of another low-level cue, which is luminance smoothness along edges or surfaces, in order to gate recurrent motion diffusion. With such a model, we successfully reproduced the temporal dynamics of motion integration on a wide range of simple motion stimuli: line segments, rotating ellipses, plaids, and barber poles. Furthermore, we showed that the proposed computational rule of luminance-gated diffusion of motion information is sufficient to explain a large set of contextual modulations of motion integration and segmentation in more elaborated stimuli such as chopstick illusions, simulated aperture problems, or rotating diamonds. As a whole, in this paper we proposed a new basal luminance-driven motion integration mechanism as an alternative to less parsimonious models, we carefully investigated the dynamics of motion integration, and we established a distinction between simple and complex stimuli according to the kind of information required to solve their ambiguities.
doi_str_mv 10.1016/j.visres.2010.05.022
format Article
fullrecord <record><control><sourceid>proquest_hal_p</sourceid><recordid>TN_cdi_hal_primary_oai_HAL_hal_00847434v1</recordid><sourceformat>XML</sourceformat><sourcesystem>PC</sourcesystem><els_id>S0042698910002555</els_id><sourcerecordid>749028305</sourcerecordid><originalsourceid>FETCH-LOGICAL-c473t-6bf9087d732e9d7ceff99cc79bde54bff8d1bf6390e650e8d54f345cbfc07d773</originalsourceid><addsrcrecordid>eNqFkUGP1CAYhonRuOPqPzCGm_HQ8WuBAheTzUZdkzFe9OSBUPiYYdKWtbSz2X8vY9c96gny5Xn5yPsQ8rqGbQ11-_64PcU8Yd42UEYgttA0T8imVlJVouXtU7IB4E3VaqUvyIucjwAgRaOfk4sGhGC6FRvy82vy2Pdx3NP5gNTfj3aILtMU6JDmmEYaxxn3k_1zv4vzgVo64h3tlyGOdnRY7e2MnvoYwpLP0IDuYMeYh5fkWbB9xlcP5yX58enj9-ubavft85frq13luGRz1XZBg5Jesga1lw5D0No5qTuPgnchKF93oWUasBWAygseGBeuCw5KSrJL8m5992B7czvFwU73Jtlobq525jwDUFxyxk91Yd-u7O2Ufi2YZzPE7EoDdsS0ZCMFV5qzmv-f5BoaxUAUkq-km1IuSsLjJ2owZ1fmaFZX5uzKgDDFVYm9eViwdAP6x9BfOQX4sAJYyjtFnEx2EUvlPk7oZuNT_PeG31wRqCk</addsrcrecordid><sourcetype>Open Access Repository</sourcetype><iscdi>true</iscdi><recordtype>article</recordtype><pqid>749028305</pqid></control><display><type>article</type><title>Modelling the dynamics of motion integration with a new luminance-gated diffusion mechanism</title><source>MEDLINE</source><source>Elsevier ScienceDirect Journals</source><source>Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals</source><creator>Tlapale, Émilien ; Masson, Guillaume S. ; Kornprobst, Pierre</creator><creatorcontrib>Tlapale, Émilien ; Masson, Guillaume S. ; Kornprobst, Pierre</creatorcontrib><description>The dynamics of motion integration show striking similarities when observed at neuronal, psychophysical, and oculomotor levels. Based on the inter-relation and complementary insights given by those dynamics, our goal was to test how basic mechanisms of dynamical cortical processing can be incorporated in a dynamical model to solve several aspects of 2D motion integration and segmentation. Our model is inspired by the hierarchical processing stages of the primate visual cortex: we describe the interactions between several layers processing local motion and form information through feedforward, feedback, and inhibitive lateral connections. Also, following perceptual studies concerning contour integration and physiological studies of receptive fields, we postulate that motion estimation takes advantage of another low-level cue, which is luminance smoothness along edges or surfaces, in order to gate recurrent motion diffusion. With such a model, we successfully reproduced the temporal dynamics of motion integration on a wide range of simple motion stimuli: line segments, rotating ellipses, plaids, and barber poles. Furthermore, we showed that the proposed computational rule of luminance-gated diffusion of motion information is sufficient to explain a large set of contextual modulations of motion integration and segmentation in more elaborated stimuli such as chopstick illusions, simulated aperture problems, or rotating diamonds. 
As a whole, in this paper we proposed a new basal luminance-driven motion integration mechanism as an alternative to less parsimonious models, we carefully investigated the dynamics of motion integration, and we established a distinction between simple and complex stimuli according to the kind of information required to solve their ambiguities.</description><identifier>ISSN: 0042-6989</identifier><identifier>EISSN: 1878-5646</identifier><identifier>EISSN: 0042-6989</identifier><identifier>DOI: 10.1016/j.visres.2010.05.022</identifier><identifier>PMID: 20553965</identifier><language>eng</language><publisher>England: Elsevier Ltd</publisher><subject>2D motion integration ; Eye Movements - physiology ; General Mathematics ; Humans ; Lighting ; Luminance ; Mathematics ; Models, Biological ; Motion perception ; Motion Perception - physiology ; Pattern Recognition, Visual - physiology ; Primates ; Recurrent cortical model ; Temporal dynamics</subject><ispartof>Vision research (Oxford), 2010-08, Vol.50 (17), p.1676-1692</ispartof><rights>2010 Elsevier Ltd</rights><rights>Copyright 2010 Elsevier Ltd. All rights reserved.</rights><rights>Distributed under a Creative Commons Attribution 4.0 International License</rights><lds50>peer_reviewed</lds50><oa>free_for_read</oa><woscitedreferencessubscribed>false</woscitedreferencessubscribed><citedby>FETCH-LOGICAL-c473t-6bf9087d732e9d7ceff99cc79bde54bff8d1bf6390e650e8d54f345cbfc07d773</citedby><cites>FETCH-LOGICAL-c473t-6bf9087d732e9d7ceff99cc79bde54bff8d1bf6390e650e8d54f345cbfc07d773</cites><orcidid>0000-0003-4906-1368 ; 0000-0001-9227-0777</orcidid></display><links><openurl>$$Topenurl_article</openurl><openurlfulltext>$$Topenurlfull_article</openurlfulltext><thumbnail>$$Tsyndetics_thumb_exl</thumbnail><linktohtml>$$Uhttps://www.sciencedirect.com/science/article/pii/S0042698910002555$$EHTML$$P50$$Gelsevier$$Hfree_for_read</linktohtml><link.rule.ids>230,314,776,780,881,3537,27903,27904,65309</link.rule.ids><backlink>$$Uhttps://www.ncbi.nlm.nih.gov/pubmed/20553965$$D View this record in MEDLINE/PubMed$$Hfree_for_read</backlink><backlink>$$Uhttps://inria.hal.science/hal-00847434$$DView record in HAL$$Hfree_for_read</backlink></links><search><creatorcontrib>Tlapale, Émilien</creatorcontrib><creatorcontrib>Masson, Guillaume S.</creatorcontrib><creatorcontrib>Kornprobst, Pierre</creatorcontrib><title>Modelling the dynamics of motion integration with a new luminance-gated diffusion mechanism</title><title>Vision research (Oxford)</title><addtitle>Vision Res</addtitle><description>The dynamics of motion integration show striking similarities when observed at neuronal, psychophysical, and oculomotor levels. Based on the inter-relation and complementary insights given by those dynamics, our goal was to test how basic mechanisms of dynamical cortical processing can be incorporated in a dynamical model to solve several aspects of 2D motion integration and segmentation. Our model is inspired by the hierarchical processing stages of the primate visual cortex: we describe the interactions between several layers processing local motion and form information through feedforward, feedback, and inhibitive lateral connections. Also, following perceptual studies concerning contour integration and physiological studies of receptive fields, we postulate that motion estimation takes advantage of another low-level cue, which is luminance smoothness along edges or surfaces, in order to gate recurrent motion diffusion. 
With such a model, we successfully reproduced the temporal dynamics of motion integration on a wide range of simple motion stimuli: line segments, rotating ellipses, plaids, and barber poles. Furthermore, we showed that the proposed computational rule of luminance-gated diffusion of motion information is sufficient to explain a large set of contextual modulations of motion integration and segmentation in more elaborated stimuli such as chopstick illusions, simulated aperture problems, or rotating diamonds. As a whole, in this paper we proposed a new basal luminance-driven motion integration mechanism as an alternative to less parsimonious models, we carefully investigated the dynamics of motion integration, and we established a distinction between simple and complex stimuli according to the kind of information required to solve their ambiguities.</description><subject>2D motion integration</subject><subject>Eye Movements - physiology</subject><subject>General Mathematics</subject><subject>Humans</subject><subject>Lighting</subject><subject>Luminance</subject><subject>Mathematics</subject><subject>Models, Biological</subject><subject>Motion perception</subject><subject>Motion Perception - physiology</subject><subject>Pattern Recognition, Visual - physiology</subject><subject>Primates</subject><subject>Recurrent cortical model</subject><subject>Temporal dynamics</subject><issn>0042-6989</issn><issn>1878-5646</issn><issn>0042-6989</issn><fulltext>true</fulltext><rsrctype>article</rsrctype><creationdate>2010</creationdate><recordtype>article</recordtype><sourceid>EIF</sourceid><recordid>eNqFkUGP1CAYhonRuOPqPzCGm_HQ8WuBAheTzUZdkzFe9OSBUPiYYdKWtbSz2X8vY9c96gny5Xn5yPsQ8rqGbQ11-_64PcU8Yd42UEYgttA0T8imVlJVouXtU7IB4E3VaqUvyIucjwAgRaOfk4sGhGC6FRvy82vy2Pdx3NP5gNTfj3aILtMU6JDmmEYaxxn3k_1zv4vzgVo64h3tlyGOdnRY7e2MnvoYwpLP0IDuYMeYh5fkWbB9xlcP5yX58enj9-ubavft85frq13luGRz1XZBg5Jesga1lw5D0No5qTuPgnchKF93oWUasBWAygseGBeuCw5KSrJL8m5992B7czvFwU73Jtlobq525jwDUFxyxk91Yd-u7O2Ufi2YZzPE7EoDdsS0ZCMFV5qzmv-f5BoaxUAUkq-km1IuSsLjJ2owZ1fmaFZX5uzKgDDFVYm9eViwdAP6x9BfOQX4sAJYyjtFnEx2EUvlPk7oZuNT_PeG31wRqCk</recordid><startdate>20100806</startdate><enddate>20100806</enddate><creator>Tlapale, Émilien</creator><creator>Masson, Guillaume S.</creator><creator>Kornprobst, Pierre</creator><general>Elsevier Ltd</general><general>Elsevier</general><scope>6I.</scope><scope>AAFTH</scope><scope>CGR</scope><scope>CUY</scope><scope>CVF</scope><scope>ECM</scope><scope>EIF</scope><scope>NPM</scope><scope>AAYXX</scope><scope>CITATION</scope><scope>7X8</scope><scope>7TK</scope><scope>1XC</scope><orcidid>https://orcid.org/0000-0003-4906-1368</orcidid><orcidid>https://orcid.org/0000-0001-9227-0777</orcidid></search><sort><creationdate>20100806</creationdate><title>Modelling the dynamics of motion integration with a new luminance-gated diffusion mechanism</title><author>Tlapale, Émilien ; Masson, Guillaume S. 
; Kornprobst, Pierre</author></sort><facets><frbrtype>5</frbrtype><frbrgroupid>cdi_FETCH-LOGICAL-c473t-6bf9087d732e9d7ceff99cc79bde54bff8d1bf6390e650e8d54f345cbfc07d773</frbrgroupid><rsrctype>articles</rsrctype><prefilter>articles</prefilter><language>eng</language><creationdate>2010</creationdate><topic>2D motion integration</topic><topic>Eye Movements - physiology</topic><topic>General Mathematics</topic><topic>Humans</topic><topic>Lighting</topic><topic>Luminance</topic><topic>Mathematics</topic><topic>Models, Biological</topic><topic>Motion perception</topic><topic>Motion Perception - physiology</topic><topic>Pattern Recognition, Visual - physiology</topic><topic>Primates</topic><topic>Recurrent cortical model</topic><topic>Temporal dynamics</topic><toplevel>peer_reviewed</toplevel><toplevel>online_resources</toplevel><creatorcontrib>Tlapale, Émilien</creatorcontrib><creatorcontrib>Masson, Guillaume S.</creatorcontrib><creatorcontrib>Kornprobst, Pierre</creatorcontrib><collection>ScienceDirect Open Access Titles</collection><collection>Elsevier:ScienceDirect:Open Access</collection><collection>Medline</collection><collection>MEDLINE</collection><collection>MEDLINE (Ovid)</collection><collection>MEDLINE</collection><collection>MEDLINE</collection><collection>PubMed</collection><collection>CrossRef</collection><collection>MEDLINE - Academic</collection><collection>Neurosciences Abstracts</collection><collection>Hyper Article en Ligne (HAL)</collection><jtitle>Vision research (Oxford)</jtitle></facets><delivery><delcategory>Remote Search Resource</delcategory><fulltext>fulltext</fulltext></delivery><addata><au>Tlapale, Émilien</au><au>Masson, Guillaume S.</au><au>Kornprobst, Pierre</au><format>journal</format><genre>article</genre><ristype>JOUR</ristype><atitle>Modelling the dynamics of motion integration with a new luminance-gated diffusion mechanism</atitle><jtitle>Vision research (Oxford)</jtitle><addtitle>Vision Res</addtitle><date>2010-08-06</date><risdate>2010</risdate><volume>50</volume><issue>17</issue><spage>1676</spage><epage>1692</epage><pages>1676-1692</pages><issn>0042-6989</issn><eissn>1878-5646</eissn><eissn>0042-6989</eissn><abstract>The dynamics of motion integration show striking similarities when observed at neuronal, psychophysical, and oculomotor levels. Based on the inter-relation and complementary insights given by those dynamics, our goal was to test how basic mechanisms of dynamical cortical processing can be incorporated in a dynamical model to solve several aspects of 2D motion integration and segmentation. Our model is inspired by the hierarchical processing stages of the primate visual cortex: we describe the interactions between several layers processing local motion and form information through feedforward, feedback, and inhibitive lateral connections. Also, following perceptual studies concerning contour integration and physiological studies of receptive fields, we postulate that motion estimation takes advantage of another low-level cue, which is luminance smoothness along edges or surfaces, in order to gate recurrent motion diffusion. With such a model, we successfully reproduced the temporal dynamics of motion integration on a wide range of simple motion stimuli: line segments, rotating ellipses, plaids, and barber poles. 
Furthermore, we showed that the proposed computational rule of luminance-gated diffusion of motion information is sufficient to explain a large set of contextual modulations of motion integration and segmentation in more elaborated stimuli such as chopstick illusions, simulated aperture problems, or rotating diamonds. As a whole, in this paper we proposed a new basal luminance-driven motion integration mechanism as an alternative to less parsimonious models, we carefully investigated the dynamics of motion integration, and we established a distinction between simple and complex stimuli according to the kind of information required to solve their ambiguities.</abstract><cop>England</cop><pub>Elsevier Ltd</pub><pmid>20553965</pmid><doi>10.1016/j.visres.2010.05.022</doi><tpages>17</tpages><orcidid>https://orcid.org/0000-0003-4906-1368</orcidid><orcidid>https://orcid.org/0000-0001-9227-0777</orcidid><oa>free_for_read</oa></addata></record>
fulltext fulltext
identifier ISSN: 0042-6989
ispartof Vision research (Oxford), 2010-08, Vol.50 (17), p.1676-1692
issn 0042-6989
eissn 1878-5646
language eng
recordid cdi_hal_primary_oai_HAL_hal_00847434v1
source MEDLINE; Elsevier ScienceDirect Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals
subjects 2D motion integration
Eye Movements - physiology
General Mathematics
Humans
Lighting
Luminance
Mathematics
Models, Biological
Motion perception
Motion Perception - physiology
Pattern Recognition, Visual - physiology
Primates
Recurrent cortical model
Temporal dynamics
title Modelling the dynamics of motion integration with a new luminance-gated diffusion mechanism