Integrated remote control of the process capability and the accuracy of vision calibration

This study aims at jointly controlling two critical process parameters from a remote site, namely the process capability of robotic assembly operations and the accuracy of vision calibration. The process capability is regarded as an indication of robot positioning accuracy. When the robot is driven by the vision camera, the process capability depends mainly on the calibration accuracy of the vision-guided robot system. Even though newly commissioned, high-precision assembly robots typically display excellent positioning accuracy under normal working conditions, the imperfect mathematical conversion of vision coordinates into robot coordinates introduces accuracy problems. In this study, a novel vision calibration method is proposed that effectively rectifies the inherent complications associated with lens distortion. The analysis shows that the degree of lens distortion varies considerably across the vision field of view. Because of this non-uniform distortion, a single mathematical equation for vision calibration is deemed ineffective. The proposed methodology significantly improves the positioning accuracy, and the calibration can be performed over the network from a remote site. This is better suited for today's global manufacturing companies, where fast product cycles and geographically distributed production lines dictate more efficient and effective quality control strategies.

Highlights:
• New vision calibration methods are proposed that improve the robot positioning accuracy.
• Compared to previous studies, a much improved result is obtained with the new approach.
• Joint monitoring and control of two critical process parameters is presented, which can be conducted from a remote site.
• By integrating the two critical process parameters in terms of process control, better product quality can be ascertained.
• The proposed method is best suited for today's fast-changing and dynamic production environment, where production lines are geographically dispersed.
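The abstract's central technical point is that lens distortion is not uniform across the camera's field of view, so a single global equation converting vision (pixel) coordinates into robot coordinates calibrates poorly, and that process capability can serve as the remote indicator of positioning accuracy. The paper's actual method is not reproduced here; the sketch below is only an illustrative Python example, assuming a zone-based scheme in which the field of view is divided into a grid and a separate affine pixel-to-robot transform is fitted per zone from reference points, together with the conventional Cpk index commonly used to express process capability. All function names, parameters, and the zone layout are hypothetical.

```python
import numpy as np

def fit_affine(pixels, robot_xy):
    # Least-squares affine map so that [u, v, 1] @ A approximates [x_r, y_r].
    design = np.hstack([pixels, np.ones((len(pixels), 1))])   # N x 3
    A, *_ = np.linalg.lstsq(design, robot_xy, rcond=None)     # 3 x 2
    return A

def fit_zone_maps(pixels, robot_xy, width, height, nx=3, ny=3):
    # One affine map per cell of an nx-by-ny grid over the image,
    # instead of a single calibration equation for the whole field of view.
    ix = np.minimum((pixels[:, 0] * nx / width).astype(int), nx - 1)
    iy = np.minimum((pixels[:, 1] * ny / height).astype(int), ny - 1)
    zones = {}
    for zx in range(nx):
        for zy in range(ny):
            mask = (ix == zx) & (iy == zy)
            if mask.sum() >= 3:               # an affine fit needs >= 3 points
                zones[(zx, zy)] = fit_affine(pixels[mask], robot_xy[mask])
    return zones

def pixel_to_robot(uv, zones, width, height, nx=3, ny=3):
    # Convert one pixel coordinate with the affine map of the zone it falls in.
    zx = min(int(uv[0] * nx / width), nx - 1)
    zy = min(int(uv[1] * ny / height), ny - 1)
    return np.array([uv[0], uv[1], 1.0]) @ zones[(zx, zy)]

def cpk(position_errors, lsl, usl):
    # Conventional Cpk index: how well the error distribution fits inside the
    # specification limits; larger is better.
    mu, sigma = np.mean(position_errors), np.std(position_errors, ddof=1)
    return min(usl - mu, mu - lsl) / (3.0 * sigma)
```

In this illustrative setup, the accuracy gain comes purely from fitting each zone to its local distortion; the Cpk of positioning errors measured before and after recalibration could then be compared over the network, in the spirit of the joint remote control the abstract describes.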

Bibliographic details
Published in: Robotics and Computer-Integrated Manufacturing, 2014-10, Vol. 30 (5), p. 451-459
Main authors: Kwon, Yongjin (James); Hong, Jungwan
Format: Article
Language: English
Subjects: Accuracy; Automation; Calibration; Distortion; Mathematical analysis; Mathematical models; Networked robotics; New calibration methods; Process capability; Remote control; Robots; Vision; Vision guidance
ISSN: 0736-5845
EISSN: 1879-2537
DOI: 10.1016/j.rcim.2014.02.004
Publisher: Elsevier Ltd
Source: Elsevier ScienceDirect Journals Complete
Online access: Full text