Did you see it? A Python tool for psychophysical assessment of the human blind spot

The blind spot is a region in the temporal monocular visual field in humans, which corresponds to a physiological scotoma within the nasal hemi-retina. This region has no photoreceptors, so is insensitive to visual stimulation. There is no corresponding perceptual scotoma because the visual stimulation is "filled-in" by the visual system. Investigations of visual perception in and around the blind spot allow us to investigate this filling-in process. However, because the location and size of the blind spot are individually variable, experimenters must first map the blind spot in every observer. We present an open-source tool, which runs in PsychoPy software, to estimate the location and size of the blind spot psychophysically. The tool will ideally be used with an EyeLink eye-tracker (SR Research), but it can also run in standalone mode. Here, we explain the rationale for the tool and demonstrate its validity in normally-sighted observers. We develop a detailed map of the blind spot in one observer. Then, in a group of 12 observers, we propose a more efficient, pragmatic method to define a "safe zone" within the blind spot, for which the experimenter can be fully confident that visual stimuli will not be seen. Links are provided to this open-source tool and a user manual.
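The "safe zone" idea from the abstract can be sketched in plain Python. The following is an illustrative reconstruction, not the authors' published algorithm: given yes/no detection responses at probe positions around the blind spot, it centres a circular zone on the unseen probes and shrinks its radius so it stays clear of every seen probe. The function name, the probe format, and the safety margin are all assumptions made for this sketch.

```python
# Illustrative sketch (not the published method): derive a conservative
# circular "safe zone" from detection responses at probe positions.
# Positions are (x, y) in degrees of visual angle; True means "seen".
from math import hypot

def safe_zone(probes):
    """probes: list of ((x, y), seen) tuples.
    Returns (center, radius) of a circle lying inside the unseen region."""
    unseen = [p for p, seen in probes if not seen]
    seen = [p for p, s in probes if s]
    # Centre the zone on the unseen probes.
    cx = sum(x for x, _ in unseen) / len(unseen)
    cy = sum(y for _, y in unseen) / len(unseen)
    # The zone must stay clear of every seen probe: its radius is the
    # distance to the nearest seen probe, minus a safety margin.
    margin = 0.5  # degrees; arbitrary illustrative value
    radius = min(hypot(x - cx, y - cy) for x, y in seen) - margin
    return (cx, cy), max(radius, 0.0)
```

For example, with unseen probes at 13.5, 15.0, and 16.5 degrees eccentricity and seen probes at 10.0 and 19.0 degrees, the zone is centred at 15.0 degrees with a radius of 3.5 degrees. The actual tool runs in PsychoPy and refines estimates psychophysically trial by trial; this sketch only shows the geometric idea of a conservative zone.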

Detailed Description

Saved in:
Bibliographic Details
Published in: PLoS ONE, 2021-11, Vol. 16 (11), p. e0254195
Main authors: Ling, Xiao; Silson, Edward H; McIntosh, Robert D
Format: Article
Language: English
Online access: Full text
DOI: 10.1371/journal.pone.0254195
ISSN: 1932-6203
Subjects:
Adult
Analysis
Biology and Life Sciences
Estimates
Evaluation
Experiments
Female
Humans
Male
Medical examination
Medicine and Health Sciences
Observers
Optic disc
Optic Disk
Philosophy
Photoreceptors
Physical Sciences
Physiology
Programming Languages
Psychophysics
Psychophysiology
Quantitative psychology
Retina
Social Sciences
Source code
Stimulation
Vision, Monocular
Visual field
Visual fields
Visual Perception
Visual stimuli
Visual system