A Neural Network Approach for Online Nonlinear Neyman-Pearson Classification
We propose a novel Neyman-Pearson (NP) classifier that is, for the first time in the literature, both online and nonlinear. The proposed classifier operates on a binary labeled data stream in an online manner, and maximizes the detection power at a user-specified and controllable false positive rate...
Saved in:
Published in: | IEEE Access 2020, Vol. 8, p. 210234-210250 |
---|---|
Main Authors: | Can, Basarbatu; Ozkan, Huseyin |
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Full text |
container_end_page | 210250 |
---|---|
container_issue | |
container_start_page | 210234 |
container_title | IEEE access |
container_volume | 8 |
creator | Can, Basarbatu; Ozkan, Huseyin |
description | We propose a novel Neyman-Pearson (NP) classifier that is, for the first time in the literature, both online and nonlinear. The proposed classifier operates on a binary labeled data stream in an online manner, and maximizes the detection power at a user-specified and controllable false positive rate. Our NP classifier is a single hidden layer feedforward neural network (SLFN), which is initialized with random Fourier features (RFFs) to construct the kernel space of the radial basis function at its hidden layer with sinusoidal activation. Not only does this use of RFFs provide an excellent initialization with great nonlinear modeling capability, but it also exponentially reduces the parameter complexity and compacts the network to mitigate overfitting while substantially improving the processing efficiency. We sequentially learn the SLFN with stochastic gradient descent updates based on a Lagrangian NP objective. As a result, we obtain expedited online adaptation and powerful nonlinear Neyman-Pearson modeling. Our algorithm is appropriate for large scale data applications and provides reliable false positive rate controllability with real time processing, since it has only O(N) computational and O(1) space complexity (N: number of data instances). In our extensive set of experiments on several real datasets, our algorithm is clearly superior to the competing state-of-the-art techniques, either by outperforming them in terms of the NP classification objective at comparable computational and space complexity, or by achieving comparable performance with significantly lower complexity. |
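The description above outlines the method: a single hidden layer network whose hidden layer is initialized with random Fourier features (sinusoidal activation approximating the RBF kernel) and whose parameters are learned online by stochastic gradient descent on a Lagrangian Neyman-Pearson objective. The following is a minimal Python sketch of that recipe, not the authors' implementation: it assumes a logistic surrogate loss and a fixed hidden layer, and all names (`OnlineNPClassifier`, `partial_fit`, `target_fpr`) are hypothetical.

```python
import numpy as np

# Minimal sketch (not the paper's code): an RFF-initialized single hidden layer
# network trained online with SGD on a Lagrangian Neyman-Pearson objective.
class OnlineNPClassifier:
    def __init__(self, dim, n_features=200, sigma=1.0, target_fpr=0.05,
                 lr=0.01, lr_lambda=0.01, seed=0):
        rng = np.random.default_rng(seed)
        # RFF initialization: W ~ N(0, sigma^-2 I), b ~ U[0, 2*pi], so that
        # cos(Wx + b) approximates the RBF kernel feature space.
        self.W = rng.normal(0.0, 1.0 / sigma, size=(n_features, dim))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
        self.v = np.zeros(n_features)   # output-layer weights
        self.c = 0.0                    # output bias
        self.lam = 1.0                  # Lagrange multiplier
        self.tau = target_fpr
        self.lr, self.lr_lam = lr, lr_lambda
        self.scale = np.sqrt(2.0 / n_features)

    def _hidden(self, x):
        # Sinusoidal activation of the RFF-initialized hidden layer.
        return self.scale * np.cos(self.W @ x + self.b)

    def score(self, x):
        return self.v @ self._hidden(x) + self.c

    def predict(self, x):
        return 1 if self.score(x) >= 0 else -1

    def partial_fit(self, x, y):
        """One online SGD step on a single labeled instance (y in {+1, -1})."""
        h = self._hidden(x)
        s = self.v @ h + self.c
        # Logistic surrogate loss log(1 + exp(-y*s)); derivative w.r.t. the score.
        grad_s = -y / (1.0 + np.exp(y * s))
        # Lagrangian NP weighting: negatives (false-positive side) are weighted
        # by the multiplier, positives by 1.
        weight = 1.0 if y == 1 else self.lam
        self.v -= self.lr * weight * grad_s * h
        self.c -= self.lr * weight * grad_s
        # Dual ascent on lambda: steer the empirical false positive rate
        # toward the user-specified target tau.
        if y == -1:
            fp_indicator = 1.0 if s >= 0 else 0.0
            self.lam = max(0.0, self.lam + self.lr_lam * (fp_indicator - self.tau))
```

Under these assumptions, the multiplier `lam` grows whenever a negative instance is falsely accepted and shrinks otherwise, pushing the operating point toward the target false positive rate while the primal SGD step maximizes detection on positives; also updating the hidden-layer `W` and `b` by SGD, as the SLFN formulation suggests, is a straightforward extension of this sketch.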
doi_str_mv | 10.1109/ACCESS.2020.3039724 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 2169-3536 |
ispartof | IEEE access, 2020, Vol.8, p.210234-210250 |
issn | 2169-3536 2169-3536 |
language | eng |
recordid | cdi_proquest_journals_2468750743 |
source | IEEE Open Access Journals; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals |
subjects | Algorithms; Artificial neural networks; Classification; Classifiers; Complexity; Complexity theory; Computational modeling; Controllability; Data transmission; Kernel; large scale; neural network; Neural networks; Neyman-Pearson; nonlinear; online; Optimization; Radial basis function; Support vector machines; Tuning |
title | A Neural Network Approach for Online Nonlinear Neyman-Pearson Classification |