A perturbation analysis of depth perception from combinations of texture and motion cues

We examined how depth information from two different cue types (object motion and texture gradient) is integrated into a single estimate in human vision. Two critical assumptions of a recent model of depth cue combination (termed modified weak fusion) were tested. The first assumption is that the overall depth estimate is a weighted linear combination of the estimates derived from the individual cues, after initial processing needed to bring them to a common format. The second assumption is that the weight assigned to a cue reflects the apparent reliability of that cue in a particular scene. By this account, the depth combination rule is linear and dynamic, changing in a predictable fashion in response to the particular scene and viewing conditions. A novel procedure was used to measure the weights assigned to the texture and motion cues across experimental conditions. This procedure uses a type of perturbation analysis. The results are consistent with the weighted linear combination rule. In addition, when either cue is corrupted by added noise, the weighted linear combination rule shifts in favor of the uncontaminated cue.
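
The combination rule described in the abstract reduces to a simple computation: each cue's depth estimate is weighted in proportion to its apparent reliability, and the weights renormalize when a cue is degraded by noise. The sketch below is a minimal illustration of that rule, not the authors' implementation; the function name, the reliability values, and the reliability-proportional weighting are illustrative assumptions.

```python
import numpy as np

def combine_depth_cues(depth_estimates, reliabilities):
    """Weighted linear combination of per-cue depth estimates.

    Weights are proportional to each cue's apparent reliability and are
    normalized to sum to one, so corrupting one cue with noise (lowering
    its reliability) shifts the combined estimate toward the other cue.
    """
    d = np.asarray(depth_estimates, dtype=float)
    r = np.asarray(reliabilities, dtype=float)
    w = r / r.sum()           # normalized cue weights
    return float(w @ d)       # overall depth estimate

# Illustrative numbers only: texture cue signals 10 cm, motion cue 12 cm.
print(combine_depth_cues([10.0, 12.0], [1.0, 1.0]))   # equal weights -> 11.0
print(combine_depth_cues([10.0, 12.0], [1.0, 0.25]))  # noisy motion cue -> 10.4
```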

Bibliographic details
Published in: Vision research (Oxford), 1993-12, Vol. 33 (18), p. 2685-2696
Main authors: Young, Mark J.; Landy, Michael S.; Maloney, Laurence T.
Format: Article
Language: English
Online access: Full text
DOI: 10.1016/0042-6989(93)90228-O
ISSN: 0042-6989
EISSN: 1878-5646
PMID: 8296465
Publisher: Elsevier Ltd, Oxford
Source: MEDLINE; Access via ScienceDirect (Elsevier)
Subjects:
Biological and medical sciences
Cues
Depth
Depth Perception - physiology
Fundamental and applied biological sciences. Psychology
Humans
Mental Processes - physiology
Motion Perception - physiology
Multiple cues
Pattern Recognition, Visual
Perception
Psychology. Psychoanalysis. Psychiatry
Psychology. Psychophysiology
Psychometrics
Sensor fusion
Vision