Surface reconstruction using neural network mapping of range-sensor images to object space


Bibliographic Details
Published in: Journal of electronic imaging, 2002-04, Vol. 11 (2), p. 187-194
Main authors: Knopf, George K.; Kofman, Jonathan
Format: Article
Language: English
Online access: Full text
Description: Range sensors that employ structured-light triangulation techniques often require calibration procedures, based on the system optics and geometry, to relate the captured image data to object coordinates. A Bernstein basis function (BBF) neural network that directly maps measured image coordinates to object coordinates is described in this paper. The proposed technique eliminates the need to explicitly determine the sensor's optical and geometric parameters by creating a functional map from image coordinates to object coordinates. The training and test data used to determine the map are obtained by capturing successive images of the points of intersection between a projected light line and horizontal markings on a calibration bar, which is stepped through the object space. The surface coordinates corresponding to the illuminated pixels in the image are determined from the neural network. An experimental study that involves the calibration of a range sensor using a BBF network is presented to demonstrate the effectiveness and accuracy of this approach. The root mean squared errors for the two coordinates in the calibrated plane, 0.25 and 0.15 mm respectively, are quite low and are suitable for many reverse engineering and part inspection applications. Once the network is trained, a hand-carved wooden mask of unknown shape is placed in the work envelope and translated perpendicular to the projected light plane. The surface shape of the mask is determined using the trained network.
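
The abstract describes a Bernstein basis function (BBF) network that maps measured image coordinates directly to object coordinates. As a rough illustration only, and not the authors' implementation, the sketch below fits a tensor-product Bernstein basis mapping from normalized image coordinates (u, v) to a single object coordinate by linear least squares; the polynomial degree, the [0, 1] normalization, and the synthetic calibration data are assumptions made for this example (Python with NumPy).

# Illustrative sketch of a tensor-product Bernstein basis function (BBF)
# mapping fitted by linear least squares; degree, normalization, and the
# synthetic calibration data are assumptions, not taken from the paper.
import numpy as np
from math import comb


def bernstein_basis(t, degree):
    """Evaluate all Bernstein basis polynomials B_{i,degree}(t) for t in [0, 1]."""
    t = np.asarray(t, dtype=float)
    return np.stack(
        [comb(degree, i) * t ** i * (1.0 - t) ** (degree - i)
         for i in range(degree + 1)],
        axis=-1,
    )


def design_matrix(u, v, degree):
    """Tensor-product Bernstein features for normalized image coordinates (u, v)."""
    bu = bernstein_basis(u, degree)              # shape (N, degree + 1)
    bv = bernstein_basis(v, degree)              # shape (N, degree + 1)
    # Outer product of the two 1-D bases per sample, flattened into features.
    return np.einsum("ni,nj->nij", bu, bv).reshape(len(bu), -1)


def fit_bbf(u, v, target, degree=5):
    """Least-squares weights mapping image coordinates to one object coordinate."""
    weights, *_ = np.linalg.lstsq(design_matrix(u, v, degree), target, rcond=None)
    return weights


def predict_bbf(weights, u, v, degree=5):
    return design_matrix(u, v, degree) @ weights


if __name__ == "__main__":
    # Synthetic stand-in for calibration pairs: (u, v) play the role of
    # normalized pixel coordinates of illuminated points, and "coord" stands
    # in for one object-space coordinate measured from the stepped bar.
    rng = np.random.default_rng(0)
    u, v = rng.uniform(size=200), rng.uniform(size=200)
    coord = 50.0 * u + 2.0 * v ** 2
    weights = fit_bbf(u, v, coord, degree=5)
    rmse = np.sqrt(np.mean((predict_bbf(weights, u, v, degree=5) - coord) ** 2))
    print(f"training RMSE on synthetic data: {rmse:.4f}")

In the paper the calibration pairs come from imaging the intersections of the projected light line with the markings on the calibration bar as it is stepped through the object space; the synthetic data above merely stands in for such pairs so the fitting step can be run end to end.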
DOI: 10.1117/1.1453411
ISSN: 1017-9909
EISSN: 1560-229X
Source: SPIE Digital Library (Journals)