Combining Sensory Information: Mandatory Fusion within, but Not between, Senses
Humans use multiple sources of sensory information to estimate environmental properties. For example, the eyes and hands both provide relevant information about an object's shape. The eyes estimate shape using binocular disparity, perspective projection, etc. The hands supply haptic shape information by means of tactile and proprioceptive cues. Combining information across cues can improve estimation of object properties but may come at a cost: loss of single-cue information. We report that single-cue information is indeed lost when cues from within the same sensory modality (disparity and texture gradients in vision) are combined, but not when different modalities (vision and haptics) are combined.
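The cue combination the abstract describes is standardly modeled as maximum-likelihood fusion (one of the record's subject terms): each cue's estimate is weighted by its reliability, i.e. inverse variance, and the fused estimate is more reliable than either cue alone. A minimal sketch of that model, with purely illustrative numbers that are not taken from the paper:

```python
def combine_cues(estimates, variances):
    """Maximum-likelihood fusion of independent Gaussian cue estimates.

    Each cue is weighted by its reliability (1 / variance); the fused
    variance is the inverse of the summed reliabilities, so it is never
    larger than the best single cue's variance.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_variance = 1.0 / total
    return fused, fused_variance

# Hypothetical example: a disparity cue estimates a slant of 30 with
# variance 4; a texture cue estimates 34 with variance 8.
fused, var = combine_cues([30.0, 34.0], [4.0, 8.0])
```

In this sketch the fused estimate lands closer to the more reliable disparity cue, and its variance (8/3) is below the better single-cue variance (4), which is the benefit of combination; the paper's question is what happens to the discarded single-cue information.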
Saved in:
Published in: Science (American Association for the Advancement of Science), 2002-11, Vol. 298 (5598), p. 1627-1630
Main authors: Hillis, J. M.; Ernst, M. O.; Banks, M. S.; Landy, M. S.
Format: Article
Language: English
Online access: Full text
Description: Humans use multiple sources of sensory information to estimate environmental properties. For example, the eyes and hands both provide relevant information about an object's shape. The eyes estimate shape using binocular disparity, perspective projection, etc. The hands supply haptic shape information by means of tactile and proprioceptive cues. Combining information across cues can improve estimation of object properties but may come at a cost: loss of single-cue information. We report that single-cue information is indeed lost when cues from within the same sensory modality (disparity and texture gradients in vision) are combined, but not when different modalities (vision and haptics) are combined.
DOI: 10.1126/science.1075396
Authors: Hillis, J. M.; Ernst, M. O.; Banks, M. S.; Landy, M. S.
Identifiers: ISSN 0036-8075; EISSN 1095-9203; PMID 12446912; CODEN SCIEAS
Publisher: Washington, DC: American Association for the Advancement of Science
Publication date: 2002-11-22
Source: American Association for the Advancement of Science; Jstor Complete Legacy; MEDLINE
Subjects: Analysis; Anatomy; Biological and medical sciences; Consistent estimators; Cues; Error of Measurement; Estimators; Eyes & eyesight; Form Perception; Fundamental and applied biological sciences. Psychology; Humans; Mathematics; Maximum likelihood estimation; Multimodal perception; Nervous system; Neurology; Perception; Psychology. Psychoanalysis. Psychiatry; Psychology. Psychophysiology; Quadrants; Rectangles; Senses; Senses and sensation; Sensory discrimination; Sensory information; Sensory perception; Sensory stimulation; Stereognosis; Stimuli; Surface texture; Touch; Vision; Vision Disparity; Visual Perception