A Color and Depth Image Defect Segmentation Framework for EEE Reuse and Recycling

Defect inspection is a crucial step in evaluating the reuse potential of Electrical and Electronic Equipment (EEE). Manual defect inspection requires experienced operators and is subjective, which highlights the need for automatic defect detection. However, the accuracy of color image-based defect segmentation is known to be limited, as defects and backgrounds often have similar textures and colors. In addition, the reflective properties of materials commonly used in EEE complicate defect detection. Furthermore, manual segmentation labeling is highly time-consuming. Therefore, this paper presents an automatic defect segmentation pipeline that fuses color (RGB) and depth (D) images using deep learning. To simplify annotation, multiple-angle images of the same device are captured with an integrated RGBD camera, and only bounding box annotations are made. These RGB and D images are then processed by the developed framework for simultaneous defect detection and instance segmentation. The framework's modular design allows a free choice of fusion methods (addition, concatenation, cross-attention module, etc.), backbones, and prediction heads, and provides the predicted bounding boxes as prompts for the Segment Anything (SAM) model family. Finally, the proposed method is evaluated on a laptop dataset of 513 RGB images and corresponding depth images, annotated with two defect categories: missing batteries and missing pieces. Results show that the RGBD-based Faster R-CNN-FPN with SAM outperforms the RGB-based method, highlighting the potential of automated RGBD defect segmentation for EEE reuse and recycling using deep learning and data fusion techniques.

Detailed Description

Bibliographic Details
Main Authors: Wu, Yifan; Zhou, Chuangchuang; Sterkens, Wouter; Piessens, Mathijs; De Marelle, Dieter; Dewulf, Wim; Peeters, Jef
Format: Conference Proceeding
Language: English
Online Access: Order full text
container_end_page 7
container_issue
container_start_page 1
container_title
container_volume
creator Wu, Yifan
Zhou, Chuangchuang
Sterkens, Wouter
Piessens, Mathijs
De Marelle, Dieter
Dewulf, Wim
Peeters, Jef
description Defect inspection is a crucial step in evaluating the reuse potential of Electrical and Electronic Equipment (EEE). Manual defect inspection requires experienced operators and is subjective, which highlights the need for automatic defect detection. However, the accuracy of color image-based defect segmentation is known to be limited, as defects and backgrounds often have similar textures and colors. In addition, the reflective properties of materials commonly used in EEE complicate defect detection. Furthermore, manual segmentation labeling is highly time-consuming. Therefore, this paper presents an automatic defect segmentation pipeline that fuses color (RGB) and depth (D) images using deep learning. To simplify annotation, multiple-angle images of the same device are captured with an integrated RGBD camera, and only bounding box annotations are made. These RGB and D images are then processed by the developed framework for simultaneous defect detection and instance segmentation. The framework's modular design allows a free choice of fusion methods (addition, concatenation, cross-attention module, etc.), backbones, and prediction heads, and provides the predicted bounding boxes as prompts for the Segment Anything (SAM) model family. Finally, the proposed method is evaluated on a laptop dataset of 513 RGB images and corresponding depth images, annotated with two defect categories: missing batteries and missing pieces. Results show that the RGBD-based Faster R-CNN-FPN with SAM outperforms the RGB-based method, highlighting the potential of automated RGBD defect segmentation for EEE reuse and recycling using deep learning and data fusion techniques.
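The modular fusion step described in the abstract can be sketched as follows. This is an illustrative example only, not the authors' implementation: the function name `fuse_features`, the feature-map shapes, and the (channels, height, width) layout are all assumptions made for the sake of the sketch.

```python
import numpy as np

def fuse_features(rgb_feat: np.ndarray, depth_feat: np.ndarray, method: str = "add") -> np.ndarray:
    """Fuse an RGB feature map with a depth feature map (illustrative sketch)."""
    if method == "add":
        # Element-wise addition: both feature maps must have identical shapes.
        assert rgb_feat.shape == depth_feat.shape
        return rgb_feat + depth_feat
    if method == "concat":
        # Concatenation along the channel axis: spatial dimensions must match.
        assert rgb_feat.shape[1:] == depth_feat.shape[1:]
        return np.concatenate([rgb_feat, depth_feat], axis=0)
    raise ValueError(f"unknown fusion method: {method}")

# Stand-ins for backbone feature maps from the RGB and depth branches.
rgb = np.random.rand(256, 32, 32)
depth = np.random.rand(256, 32, 32)

added = fuse_features(rgb, depth, "add")      # shape stays (256, 32, 32)
stacked = fuse_features(rgb, depth, "concat") # channels double: (512, 32, 32)
```

Addition keeps the channel count unchanged, so downstream backbone and prediction-head layers need no modification, whereas concatenation doubles the channels and requires the first downstream layer to accept the wider input; a learned cross-attention module, as the abstract notes, is a third option the framework supports.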
format Conference Proceeding
fulltext fulltext_linktorsrc
identifier ISBN: 9783000793301; ISBN: 3000793305
ispartof 2024 Electronics Goes Green 2024+ (EGG), 2024, p.1-7
publisher IEEE
issn
language eng
recordid cdi_kuleuven_dspace_20_500_12942_750806
source Lirias (KU Leuven Association)
title A Color and Depth Image Defect Segmentation Framework for EEE Reuse and Recycling
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-10T09%3A14%3A54IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-kuleuven_FZOIL&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=A%20Color%20and%20Depth%20Image%20Defect%20Segmentation%20Framework%20for%20EEE%20Reuse%20and%20Recycling&rft.btitle=2024%20Electronics%20Goes%20Green%202024+%20(EGG)&rft.au=Wu,%20Yifan&rft.date=2024&rft.spage=1&rft.epage=7&rft.pages=1-7&rft.isbn=9783000793301&rft.isbn_list=3000793305&rft_id=info:doi/&rft_dat=%3Ckuleuven_FZOIL%3E20_500_12942_750806%3C/kuleuven_FZOIL%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true