Graph neural network comparison for 2D nesting efficiency estimation
Saved in:
Published in: | Journal of intelligent manufacturing 2024-02, Vol.35 (2), p.859-873 |
Main authors: | Lallier, Corentin; Blin, Guillaume; Pinaud, Bruno; Vézard, Laurent |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 873 |
container_issue | 2 |
container_start_page | 859 |
container_title | Journal of intelligent manufacturing |
container_volume | 35 |
creator | Lallier, Corentin; Blin, Guillaume; Pinaud, Bruno; Vézard, Laurent |
description | Minimizing the level of material consumption in textile production is a major concern. The cornerstone of this optimization task is the nesting problem, whose goal is to lay a set of irregular 2D parts out onto a rectangular surface, called the nesting zone, while respecting a set of constraints. Knowing the efficiency (the ratio of usable to used-up material) enables the optimization of several textile production problems. Unfortunately, knowing the efficiency requires the nesting problem to be solved, which is computationally intensive and has been proven to be NP-hard. This paper introduces a regression approach to estimate efficiency without solving the nesting problem. Our approach models the 2D nesting problem as a graph whose nodes are images derived from the parts and whose edges hold the constraints. The method then combines convolutional neural networks for the image-based aspects with graph neural networks (GNNs) for the constraint aspects. We evaluate several neural message passing approaches on our dataset and obtain results that are sufficiently accurate to enable several business use cases; our best model solves this task with a mean absolute error of 1.65. We provide open access to our dataset, whose properties differ from those of other graph datasets found in the literature. The dataset is built from the nesting data of 100,000 real customers. Along the way, we compare the performance and generalization capabilities of four GNN architectures from the literature on this dataset. |
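The abstract outlines a hybrid architecture: each part image is encoded by a CNN, the constraint graph is processed by GNN message passing, and a graph-level readout regresses the nesting efficiency. The PyTorch sketch below only illustrates that general pattern; it is not the authors' implementation, and the image size, embedding width, dense-adjacency propagation rule, and all names are assumptions made for illustration.

```python
# Hypothetical sketch of a CNN + message-passing efficiency regressor.
# Not the paper's code; sizes and the propagation rule are assumptions.
import torch
import torch.nn as nn


class PartEncoder(nn.Module):
    """CNN mapping each part image (assumed 1 x 64 x 64) to an embedding."""
    def __init__(self, emb_dim=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, emb_dim)

    def forward(self, images):            # images: (num_parts, 1, 64, 64)
        h = self.conv(images).flatten(1)   # (num_parts, 32)
        return self.fc(h)                  # (num_parts, emb_dim)


class EfficiencyGNN(nn.Module):
    """Dense message passing over the constraint graph, then graph-level regression."""
    def __init__(self, emb_dim=64, rounds=2):
        super().__init__()
        self.encoder = PartEncoder(emb_dim)
        self.msg = nn.ModuleList([nn.Linear(emb_dim, emb_dim) for _ in range(rounds)])
        self.head = nn.Sequential(nn.Linear(emb_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, images, adj):        # adj: (num_parts, num_parts) constraint matrix
        h = self.encoder(images)
        for layer in self.msg:
            # Each node aggregates messages from its constraint neighbours.
            h = torch.relu(h + adj @ layer(h))
        return self.head(h.mean(dim=0))    # mean-pool the nodes -> predicted efficiency


# Toy usage: 5 parts, a random constraint graph, one scalar efficiency prediction.
model = EfficiencyGNN()
images = torch.rand(5, 1, 64, 64)
adj = (torch.rand(5, 5) > 0.5).float()
print(model(images, adj))
```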
doi_str_mv | 10.1007/s10845-023-02084-6 |
format | Article |
publisher | Springer US, New York |
orcidid | 0000-0003-0518-8308; 0000-0002-0708-0838; 0000-0003-4814-3273 |
fulltext | fulltext |
identifier | ISSN: 0956-5515 |
ispartof | Journal of intelligent manufacturing, 2024-02, Vol.35 (2), p.859-873 |
issn | 0956-5515 1572-8145 |
language | eng |
recordid | cdi_hal_primary_oai_HAL_hal_03952756v1 |
source | SpringerLink Journals |
subjects | Advanced manufacturing technologies; Artificial Intelligence; Artificial neural networks; Business and Management; Computer Science; Control; Datasets; Efficiency; Graph neural networks; Graph theory; Machine Learning; Machines; Manufacturing; Mechatronics; Message passing; Nesting; Neural networks; Optimization; Processes; Production; Robotics |
title | Graph neural network comparison for 2D nesting efficiency estimation |