Derivative-Based Learning of Interval Type-2 Intuitionistic Fuzzy Logic Systems for Noisy Regression Problems

This study presents a comparative evaluation of an interval type-2 intuitionistic fuzzy logic system using three derivative-based learning algorithms on noisy regression problems. The motivation for this study is to manage uncertainty in noisy regression problems for the first time using both membership and non-membership functions that are fuzzy.
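The membership/non-membership pairing described in the abstract can be illustrated with a small sketch. This is not the paper's exact formulation: the Gaussian membership function and the fixed `hesitation` margin are illustrative assumptions (the paper's interval type-2 sets would additionally carry upper and lower bounds on each grade); the sketch only shows how the intuitionistic constraint mu + nu + pi = 1 leaves room for a hesitation index.

```python
import numpy as np

def gaussian_mf(x, c, sigma):
    """Gaussian membership grade centred at c with spread sigma."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def intuitionistic_grades(x, c, sigma, hesitation=0.1):
    """Membership (mu), non-membership (nu) and hesitation (pi) grades.

    The intuitionistic constraint mu + nu + pi = 1 is enforced here by
    scaling both grades with a fixed hesitation margin (an illustrative
    choice, not the paper's parameterisation).
    """
    g = gaussian_mf(x, c, sigma)
    mu = (1.0 - hesitation) * g          # membership grade
    nu = (1.0 - hesitation) * (1.0 - g)  # non-membership grade
    pi = 1.0 - mu - nu                   # hesitation: 'neither this nor that'
    return mu, nu, pi
```

The hesitation index pi is exactly what lets the model represent the 'neither this nor that state' mentioned in the abstract: it is the part of the unit grade assigned to neither membership nor non-membership.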

Full description

Bibliographic Details
Published in: International journal of fuzzy systems 2020-04, Vol.22 (3), p.1007-1019
Main Authors: Eyoh, Imo Jeremiah; Umoh, Uduak Augustine; Inyang, Udoinyang Godwin; Eyoh, Jeremiah Effiong
Format: Article
Language: English
Subjects:
Online Access: Full text
description This study presents a comparative evaluation of interval type-2 intuitionistic fuzzy logic system using three derivative-based learning algorithms on noisy regression problems. The motivation for this study is to manage uncertainty in noisy regression problems for the first time using both membership and non-membership functions that are fuzzy. The proposed models are able to handle ‘neither this nor that state’ in the noisy regression data with the aim of enabling hesitation and handling more uncertainty in the data. The gradient descent-backpropagation (first-order derivative), decoupled extended Kalman filter (second-order derivative) and hybrid approach (where the decoupled extended Kalman filter is used to learn the consequent parameters and gradient descent is used to optimise the antecedent parameters) are applied for the adaptation of the model parameters. The experiments are conducted using two artificially generated and one real-world datasets, namely Mackey–Glass time series, Lorenz time series and US stock datasets. Experimental analyses show that the extended Kalman filter-based learning approaches of interval type-2 intuitionistic fuzzy logic exhibit superior prediction accuracies to gradient descent approach especially at high noise level. The decoupled extended Kalman filter model however converges faster but incurs more computational overhead in terms of the running time.
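The decoupled extended Kalman filter learning named in the abstract can be sketched as one recursive update per parameter group. This is a generic sketch, not the paper's implementation: the names `theta`, `P`, `h`, `r` and `lam` are assumptions, `h` stands for the Jacobian of the model output with respect to that group's parameters (for the linear-in-parameters consequents, the output is exactly `h @ theta`), `r` is the assumed measurement-noise variance, and `lam` a forgetting factor.

```python
import numpy as np

def dekf_update(theta, P, h, y, r=1e-2, lam=1.0):
    """One decoupled extended Kalman filter step for one parameter group.

    theta : (n,) parameter vector of the group
    P     : (n, n) error covariance of the group
    h     : (n,) Jacobian of the model output w.r.t. theta
    y     : observed scalar target
    """
    e = y - h @ theta                    # innovation (prediction error)
    s = h @ P @ h + r                    # innovation variance (scalar output)
    K = (P @ h) / s                      # Kalman gain
    theta = theta + K * e                # parameter update
    P = (P - np.outer(K, h) @ P) / lam   # covariance update
    return theta, P
```

'Decoupled' means each group (e.g. the consequent parameters of one rule) keeps its own small covariance matrix `P` instead of one large joint covariance, which is what trades some accuracy for the faster convergence and extra per-step cost noted in the abstract. By contrast, plain gradient descent would replace all of this with `theta += eta * e * h` and no covariance at all.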
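The Mackey–Glass benchmark used in the experiments is conventionally generated from the delay differential equation dx/dt = beta·x(t−tau)/(1 + x(t−tau)^power) − gamma·x(t), with tau = 17 giving the standard chaotic series. The sketch below uses a simple Euler integration with a constant pre-history; the parameter values are the conventional benchmark ones, not taken from the paper, and the noisy-regression setting would add measurement noise on top of the clean series.

```python
import numpy as np

def mackey_glass(n_samples=1000, tau=17.0, beta=0.2, gamma=0.1,
                 power=10, dt=0.1, x0=1.2):
    """Euler integration of the Mackey-Glass delay differential equation:
    dx/dt = beta * x(t - tau) / (1 + x(t - tau)**power) - gamma * x(t)."""
    delay = int(round(tau / dt))       # delay expressed in integration steps
    x = np.empty(n_samples + delay)
    x[:delay] = x0                     # constant history before t = 0
    for t in range(delay, n_samples + delay):
        x_tau = x[t - delay]           # delayed state x(t - tau)
        x[t] = x[t - 1] + dt * (beta * x_tau / (1.0 + x_tau ** power)
                                - gamma * x[t - 1])
    return x[delay:]
```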
DOI: 10.1007/s40815-020-00806-z
ISSN: 1562-2479
EISSN: 2199-3211
Source: ProQuest Central UK/Ireland; SpringerLink Journals - AutoHoldings; ProQuest Central
Subjects:
Algorithms
Artificial Intelligence
Back propagation
Computational Intelligence
Datasets
Decision making
Distance learning
Engineering
Extended Kalman filter
Fuzzy logic
Fuzzy sets
Fuzzy systems
Genetic algorithms
Machine learning
Management Science
Mathematical models
Neural networks
Noise levels
Operations Research
Parameters
Regression
Run time (computers)
Time series
Uncertainty
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-12T20%3A46%3A06IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Derivative-Based%20Learning%20of%20Interval%20Type-2%20Intuitionistic%20Fuzzy%20Logic%20Systems%20for%20Noisy%20Regression%20Problems&rft.jtitle=International%20journal%20of%20fuzzy%20systems&rft.au=Eyoh,%20Imo%20Jeremiah&rft.date=2020-04-01&rft.volume=22&rft.issue=3&rft.spage=1007&rft.epage=1019&rft.pages=1007-1019&rft.issn=1562-2479&rft.eissn=2199-3211&rft_id=info:doi/10.1007/s40815-020-00806-z&rft_dat=%3Cproquest_cross%3E2932478911%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2932478911&rft_id=info:pmid/&rfr_iscdi=true