Decoding Information for Grasping from the Macaque Dorsomedial Visual Stream

Bibliographic Details
Published in: The Journal of neuroscience, 2017-04, Vol. 37 (16), p. 4311-4322
Main authors: Filippini, Matteo; Breveglieri, Rossella; Akhras, M Ali; Bosco, Annalisa; Chinellato, Eris; Fattori, Patrizia
Format: Article
Language: English
Online access: Full text
Abstract: Neurodecoders have been developed by researchers mostly to control neuroprosthetic devices, but also to shed new light on neural functions. In this study, we show that signals representing grip configurations can be reliably decoded from neural data acquired from area V6A of the monkey medial posterior parietal cortex. Two monkeys were trained to perform an instructed-delay reach-to-grasp task, in the dark and in the light, toward objects of different shapes. Population neural activity was extracted at various time intervals: on vision of the object, during the delay before movement, and during grasp execution. This activity was used to train and validate a Bayes classifier for decoding object and grip type. Recognition rates were well above chance level for all epochs analyzed in this study. Furthermore, we detected slightly different decoding accuracies depending on the task's visual condition. A generalization analysis, in which the classifier was trained and tested on different time intervals, demonstrated that a change of neural code occurred over the course of the task. Our classifier was able to discriminate grasp type well in advance of grasping onset, a feature that matters when signals must be sent to external devices before movement start. Our results suggest that neural signals from the dorsomedial visual pathway can be a good substrate to feed neural prostheses for prehensile actions. Recordings of neural activity from nonhuman primate frontal and parietal cortex have led to the development of methods for decoding movement information to restore coordinated arm actions in paralyzed human beings. Our results show that the signals measured from the monkey medial posterior parietal cortex are valid for correctly decoding information relevant for grasping.
Together with previous studies on decoding reach trajectories from the medial posterior parietal cortex, this highlights the medial parietal cortex as a target site for transforming neural activity into control signals to command prostheses, allowing human patients to dexterously perform grasping actions.
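The decoding pipeline the abstract describes — a Bayes classifier trained on population firing rates, evaluated both within one task epoch and across epochs (the generalization analysis) — can be sketched on synthetic data. Everything below is an illustrative assumption, not the authors' implementation: the population size, the Gaussian noise model, and the hand-rolled `GaussianNaiveBayes` helper are stand-ins chosen only to make the train-on-one-epoch, test-on-another logic concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for V6A population activity: trials x neurons firing
# rates, one class per grip type. Sizes are illustrative, not from the paper.
n_neurons, n_trials_per_grip, n_grips = 50, 40, 5

def simulate_epoch(tuning):
    """Firing rates for one task epoch given per-grip tuning (grips x neurons)."""
    X = np.vstack([t + rng.normal(0.0, 1.0, (n_trials_per_grip, n_neurons))
                   for t in tuning])
    y = np.repeat(np.arange(n_grips), n_trials_per_grip)
    return X, y

tuning = rng.normal(0.0, 2.0, (n_grips, n_neurons))
X_delay, y_delay = simulate_epoch(tuning)
# Here the execution epoch reuses the same tuning, so the neural "code" is
# stable across epochs; the change of code reported in the paper would show
# up as a drop in the cross-epoch accuracy computed below.
X_exec, y_exec = simulate_epoch(tuning)

class GaussianNaiveBayes:
    """Minimal Gaussian naive Bayes: independent neurons, per-class mean/variance."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(0) for c in self.classes])
        self.var = np.array([X[y == c].var(0) + 1e-6 for c in self.classes])
        return self
    def predict(self, X):
        # Log-likelihood of each trial under each class's Gaussian model,
        # summed over neurons; pick the most likely class.
        ll = -0.5 * (((X[:, None, :] - self.mu) ** 2) / self.var
                     + np.log(2 * np.pi * self.var)).sum(-1)
        return self.classes[ll.argmax(1)]

clf = GaussianNaiveBayes().fit(X_delay, y_delay)
within = (clf.predict(X_delay) == y_delay).mean()  # same-epoch (optimistic: train set)
across = (clf.predict(X_exec) == y_exec).mean()    # train on delay, test on execution

print(f"within-epoch accuracy: {within:.2f}")
print(f"cross-epoch accuracy:  {across:.2f}")
```

With a stable code, both accuracies sit far above the 1/5 chance level; running the same cross-epoch test on real data from different epochs is what reveals whether the population code changes over the course of the task.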
DOI: 10.1523/JNEUROSCI.3077-16.2017
Publisher: Society for Neuroscience (United States)
Publication date: 2017-04-19
PMID: 28320845
ISSN: 0270-6474
EISSN: 1529-2401
Sources: MEDLINE; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; PubMed Central
Subjects: Animals
Hand - physiology
Hand Strength
Macaca fascicularis
Male
Movement
Psychomotor Performance
Visual Cortex - physiology
Visual Pathways - physiology
Visual Perception