The generic viewpoint assumption in a framework for visual perception

A visual system makes assumptions in order to interpret visual data. The assumption of 'generic view' (refs 1–4) states that the observer is not in a special position relative to the scene. Researchers commonly use a binary decision of generic or accidental view to disqualify scene interpretations that assume accidental viewpoints (refs 5–10). Here we show how to use the generic view assumption, and others like it, to quantify the likelihood of a view, adding a new term to the probability of a given image interpretation. The resulting framework better models the visual world and reduces the reliance on other prior assumptions. It may lead to computer vision algorithms of greater power and accuracy, or to better models of human vision. We show applications to the problems of inferring shape, surface reflectance properties, and motion from images.
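
The abstract sketches a probabilistic framework: rather than a binary generic/accidental ruling, each interpretation is scored by how likely the observed image is across the range of possible viewpoints. The toy example below is a minimal sketch of that kind of computation, not the paper's actual algorithm; the rendering functions, the Gaussian noise model, and every name in it are assumptions made purely for illustration.

    import numpy as np

    def marginal_likelihood(render, observed, angles, sigma=0.05):
        # Approximate P(image | scene) = integral of P(image | scene, x) p(x) dx
        # by a Riemann sum over the generic variable x (here a viewpoint angle),
        # with a Gaussian pixel-noise model and a uniform prior p(x).
        dx = angles[1] - angles[0]
        prior = 1.0 / (angles[-1] - angles[0])
        total = 0.0
        for x in angles:
            err = render(x) - observed
            total += np.exp(-err @ err / (2.0 * sigma ** 2)) * prior * dx
        return total

    angles = np.linspace(-np.pi / 2, np.pi / 2, 2001)
    observed = np.array([0.5, 0.5])  # a toy two-pixel "image"

    # Interpretation A: the rendered image varies slowly with viewpoint,
    # so it matches the observation over a broad range of angles (generic).
    render_a = lambda x: np.array([0.5 + 0.1 * np.sin(x), 0.5])

    # Interpretation B: matches only at one finely tuned angle (accidental).
    render_b = lambda x: np.array([0.5 + 5.0 * np.sin(x), 0.5])

    print(marginal_likelihood(render_a, observed, angles))  # ~0.40, favoured
    print(marginal_likelihood(render_b, observed, angles))  # ~0.008, penalized

Under such a score an accidental interpretation is penalized automatically, without any hard threshold on what counts as a 'special' viewpoint.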

Bibliographic details
Published in: Nature (London), 1994-04-07, Vol. 368 (6471), pp. 542-545
Author: Freeman, William T.
Format: Article
Language: English
Publisher: London: Nature Publishing Group UK
Source: MEDLINE; Nature Journals Online; SpringerLink Journals - AutoHoldings
Online access: Full text
ISSN: 0028-0836
EISSN: 1476-4687
DOI: 10.1038/368542a0
PMID: 8139687
CODEN: NATUAS
Subjects:
Applied sciences
Exact sciences and technology
Eyes & eyesight
Humanities and Social Sciences
Humans
Information, signal and communications theory
letter
Light
Models, Biological
multidisciplinary
Pattern recognition
Perceptions
Probability
Science
Science (multidisciplinary)
Signal processing
Space life sciences
Surface Properties
Telecommunications and information theory
Visual Perception - physiology