High-efficiency automated triaxial robot grasping system for motor rotors using 3D structured light sensor


Bibliographic details
Published in: Machine vision and applications, 2024-11, Vol. 35 (6), p. 132, Article 132
Main authors: Liang, Jixin; Ye, Yuping; Wu, Di; Chen, Siyuan; Song, Zhan
Format: Article
Language: English
Abstract: With the rapid development of artificial intelligence and computer vision, numerous technologies have been introduced to automate manufacturing in the industry. Typical metal workpieces in the industry often have highly reflective surfaces, come in various sizes, and are positioned irregularly. The motor rotor presented in this paper is one such representative workpiece. Traditional grasping methods for workpiece loading and unloading are pre-programmed and often struggle to cope with complex and disordered situations. In this paper, we introduce a structured light (SL) sensor as the visual guide for the triaxial robot. Furthermore, we propose a high-precision hand-eye calibration method for the non-orthogonal coordinate system of the triaxial robot. Additionally, a motor rotor center localization method based on U-Net image segmentation is proposed. By combining the high-precision hand-eye calibration and localization, we can accurately and automatically locate and grasp the rotor. We conducted extensive experiments to verify the effectiveness and accuracy of our system.
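The abstract only names the calibration problem; the paper's own procedure is not reproduced in this record. As a rough illustration of why a non-orthogonal axis system calls for more than a rigid rotation-plus-translation, the sketch below fits a general affine map from sensor coordinates to robot coordinates by least squares over point correspondences. The function names and the affine model are illustrative assumptions, not the authors' published method.

```python
import numpy as np

def fit_affine_hand_eye(cam_pts, robot_pts):
    """Fit robot = A @ cam + t by least squares over N >= 4 correspondences.

    A general affine A (rather than a rotation matrix) can absorb
    non-orthogonality and per-axis scale differences between the robot's
    axes -- the situation a non-orthogonal triaxial robot presents.
    This is an illustrative model, not the paper's calibration method.
    """
    cam_pts = np.asarray(cam_pts, dtype=float)
    robot_pts = np.asarray(robot_pts, dtype=float)
    n = cam_pts.shape[0]
    # Homogeneous design matrix: one row [x, y, z, 1] per correspondence.
    X = np.hstack([cam_pts, np.ones((n, 1))])
    # Solve X @ M = robot_pts for the 4x3 parameter matrix M.
    M, *_ = np.linalg.lstsq(X, robot_pts, rcond=None)
    A, t = M[:3].T, M[3]
    return A, t

def cam_to_robot(p, A, t):
    """Map a single sensor-frame point into the robot frame."""
    return A @ np.asarray(p, dtype=float) + t
```

In a noise-free synthetic test, feeding correspondences generated from a sheared (non-orthogonal) ground-truth transform recovers that transform exactly, which a rigid-only model could not do.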
DOI: 10.1007/s00138-024-01610-7
ISSN: 0932-8092
EISSN: 1432-1769
Source: Springer Nature - Complete Springer Journals
Subjects:
Artificial intelligence
Automation
Calibration
Communications Engineering
Computer Science
Computer vision
Coordinates
Eye (anatomy)
Grasping (robotics)
Image Processing and Computer Vision
Image segmentation
Industrial development
Localization
Localization method
Motor rotors
Networks
Pattern Recognition
Robots
Workpieces