Single-Grasp Object Classification and Feature Extraction with Simple Robot Hands and Tactile Sensors


Detailed Description

Bibliographic Details
Published in: IEEE transactions on haptics, 2016-04, Vol.9 (2), p.207-220
Main authors: Spiers, Adam J., Liarokapis, Minas V., Calli, Berk, Dollar, Aaron M.
Format: Article
Language: English
Online access: Order full text
Description: Classical robotic approaches to tactile object identification often involve rigid mechanical grippers, dense sensor arrays, and exploratory procedures (EPs). Though EPs are a natural method for humans to acquire object information, evidence also exists for meaningful tactile property inference from brief, non-exploratory motions (a 'haptic glance'). In this work, we implement tactile object identification and feature extraction techniques on data acquired during a single, unplanned grasp with a simple, underactuated robot hand equipped with inexpensive barometric pressure sensors. Our methodology utilizes two cooperating schemes based on an advanced machine learning technique (random forests) and parametric methods that estimate object properties. The available data are limited to actuator positions (one per two-link finger) and force sensor values (eight per finger). The schemes are able to work both independently and collaboratively, depending on the task scenario. When collaborating, the results of each method contribute to the other, improving the overall result in a synergistic fashion. Unlike prior work, the proposed approach does not require object exploration, re-grasping, grasp-release, or force modulation, and works for arbitrary object start positions and orientations. Due to these factors, the technique may be integrated into practical robotic grasping scenarios without adding time or manipulation overheads.
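The classification half of the scheme described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the feature layout (two actuator positions plus sixteen pressure readings) follows the sensor counts stated in the abstract, while the data, the four object classes, and the hyperparameters are placeholders.

```python
# Illustrative sketch only: a random-forest classifier over a single-grasp
# feature vector. Assumed layout per the abstract: one actuator position per
# two-link finger (2 fingers) plus eight barometric pressure readings per
# finger -> 2 + 2*8 = 18 features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

N_FINGERS = 2
SENSORS_PER_FINGER = 8
N_FEATURES = N_FINGERS + N_FINGERS * SENSORS_PER_FINGER  # 18

# Synthetic stand-in data: 120 grasps of 4 hypothetical object classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, N_FEATURES))   # actuator positions + pressures
y = rng.integers(0, 4, size=120)         # object-class labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
pred = clf.predict(X[:5])                # one predicted class per grasp
```

Random forests are a natural fit here because the feature vector mixes heterogeneous quantities (joint positions and pressure readings) and tree ensembles need no feature scaling.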
DOI: 10.1109/TOH.2016.2521378
Publisher: IEEE (United States)
PMID: 26829804
CODEN: ITHEBX
ISSN: 1939-1412
EISSN: 2329-4051
Source: IEEE Electronic Library (IEL)
Subjects:
Adaptive Grasping
Animals
Biomechanical Phenomena - physiology
EPS
Equipment Design
Feature extraction
Fingers
Fingers - anatomy & histology
Fingers - physiology
Grasping
Hand - anatomy & histology
Hand - physiology
Hand Strength - physiology
Haptics Applications
Humans
Machine Learning
Object Classification
Object Feature Extraction
Object recognition
Robot sensing systems
Robotics
Robotics - methods
Robots
Sensors
Tactile
Tactile Sensing
Thumb
Touch - physiology
Underactuated Robot Hands
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-31T04%3A32%3A52IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Single-Grasp%20Object%20Classification%20and%20Feature%20Extraction%20with%20Simple%20Robot%20Hands%20and%20Tactile%20Sensors&rft.jtitle=IEEE%20transactions%20on%20haptics&rft.au=Spiers,%20Adam%20J.&rft.date=2016-04-01&rft.volume=9&rft.issue=2&rft.spage=207&rft.epage=220&rft.pages=207-220&rft.issn=1939-1412&rft.eissn=2329-4051&rft.coden=ITHEBX&rft_id=info:doi/10.1109/TOH.2016.2521378&rft_dat=%3Cproquest_RIE%3E1825523163%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1798894721&rft_id=info:pmid/26829804&rft_ieee_id=7390277&rfr_iscdi=true