Concurrent Neural Tree and Data Preprocessing AutoML for Image Classification
Deep Neural Networks (DNNs) are a widely used solution for a variety of machine learning problems. However, it is often necessary to invest a significant amount of a data scientist's time to pre-process input data, test different neural network architectures, and tune hyper-parameters for...
Saved in:
Published in: | arXiv.org 2022-05 |
---|---|
Main authors: | Thite, Anish; Dodda, Mohan; Agarwal, Pulak; Zutty, Jason |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_title | arXiv.org |
creator | Thite, Anish; Dodda, Mohan; Agarwal, Pulak; Zutty, Jason |
description | Deep Neural Networks (DNNs) are a widely used solution for a variety of machine learning problems. However, it is often necessary to invest a significant amount of a data scientist's time to pre-process input data, test different neural network architectures, and tune hyper-parameters for optimal performance. Automated machine learning (autoML) methods automatically search the architecture and hyper-parameter space for optimal neural networks. However, current state-of-the-art (SOTA) methods do not include traditional methods for manipulating input data as part of the algorithmic search space. We adapt the Evolutionary Multi-objective Algorithm Design Engine (EMADE), a multi-objective evolutionary search framework for traditional machine learning methods, to perform neural architecture search. We also integrate EMADE's signal processing and image processing primitives. These primitives allow EMADE to manipulate input data before ingestion into the simultaneously evolved DNN. We show that including these methods as part of the search space shows potential to provide benefits to performance on the CIFAR-10 image classification benchmark dataset. |
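The search the abstract describes, evolving data-preprocessing primitives jointly with the network under multiple objectives, can be sketched in minimal form. This is a hypothetical toy, not EMADE's actual code: the primitive names, the `layers` hyper-parameter, and the stand-in `evaluate` scores are all assumptions standing in for real DNN training on preprocessed CIFAR-10.

```python
import random

# Hypothetical preprocessing primitives (stand-ins, not EMADE's actual set).
PREPROCESS_OPS = ["none", "normalize", "blur", "edge_detect"]

def random_individual():
    # An individual pairs a preprocessing choice with one model hyper-parameter.
    return {"preprocess": random.choice(PREPROCESS_OPS),
            "layers": random.randint(1, 8)}

def evaluate(ind):
    # Two objectives to minimize: (error, complexity). A real system would
    # train the evolved DNN on preprocessed CIFAR-10 here; these constants
    # are fabricated purely so the toy has a trade-off to explore.
    base_error = {"none": 0.30, "normalize": 0.22,
                  "blur": 0.28, "edge_detect": 0.26}[ind["preprocess"]]
    error = base_error - 0.01 * ind["layers"]  # deeper nets fit better in this toy
    complexity = ind["layers"]
    return (round(error, 3), complexity)

def dominates(a, b):
    # Pareto dominance for minimization: no worse everywhere, better somewhere.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(pop):
    scored = [(ind, evaluate(ind)) for ind in pop]
    return [ind for ind, f in scored
            if not any(dominates(g, f) for _, g in scored)]

def mutate(ind):
    child = dict(ind)
    if random.random() < 0.5:
        child["preprocess"] = random.choice(PREPROCESS_OPS)
    else:
        child["layers"] = max(1, min(8, child["layers"] + random.choice([-1, 1])))
    return child

def search(generations=20, pop_size=16, seed=0):
    random.seed(seed)
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        front = pareto_front(pop)
        # Keep the non-dominated set, refill with mutants of front members.
        pop = front + [mutate(random.choice(front))
                       for _ in range(pop_size - len(front))]
    return pareto_front(pop)
```

The returned front contains the surviving (preprocessing, hyper-parameter) trade-offs; a framework like EMADE evolves full program trees rather than this flat genome, but the select-the-front-then-mutate loop is the same shape.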
doi_str_mv | 10.48550/arxiv.2205.13033 |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2022-05 |
issn | 2331-8422 |
language | eng |
recordid | cdi_arxiv_primary_2205_13033 |
source | arXiv.org; Free E-Journals |
subjects | Artificial neural networks; Computer architecture; Computer Science - Learning; Computer Science - Neural and Evolutionary Computing; Evolutionary algorithms; Image classification; Image manipulation; Image processing; Ingestion; Machine learning; Multiple objective analysis; Neural networks; Parameters; Searching; Signal processing |
title | Concurrent Neural Tree and Data Preprocessing AutoML for Image Classification |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-12T13%3A34%3A11IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_arxiv&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Concurrent%20Neural%20Tree%20and%20Data%20Preprocessing%20AutoML%20for%20Image%20Classification&rft.jtitle=arXiv.org&rft.au=Thite,%20Anish&rft.date=2022-05-25&rft.eissn=2331-8422&rft_id=info:doi/10.48550/arxiv.2205.13033&rft_dat=%3Cproquest_arxiv%3E2670359882%3C/proquest_arxiv%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2670359882&rft_id=info:pmid/&rfr_iscdi=true |