A survey of fractional calculus applications in artificial neural networks
An artificial neural network (ANN) is the backbone of machine learning, specifically deep learning. The interpolating and learning ability of an ANN makes it an ideal tool for modelling, control, and various other complex tasks. Fractional calculus (FC), involving derivatives and integrals of arbitrary non-integer order, has recently become popular for its capability to model memory-type systems. ...
| Published in: | The Artificial Intelligence Review, 2023-11, Vol. 56 (11), p. 13897-13950 |
|---|---|
| Main authors: | Joshi, Manisha; Bhosale, Savita; Vyawahare, Vishwesh A. |
| Format: | Article |
| Language: | English |
| Subjects: | Algorithms; Artificial Intelligence; Artificial neural networks; Back propagation networks; Calculus; Computer architecture; Computer Science; Data mining; Deep learning; Derivatives; Fractional calculus; Machine learning; Neural networks; Performance indices; Radial basis function; Recurrent neural networks; Stability; Synchronism; Task complexity |
| Online access: | Full text |
| Field | Value |
|---|---|
| container_end_page | 13950 |
| container_issue | 11 |
| container_start_page | 13897 |
| container_title | The Artificial intelligence review |
| container_volume | 56 |
| creator | Joshi, Manisha; Bhosale, Savita; Vyawahare, Vishwesh A. |
| description | An artificial neural network (ANN) is the backbone of machine learning, specifically deep learning. The interpolating and learning ability of an ANN makes it an ideal tool for modelling, control, and various other complex tasks. Fractional calculus (FC), involving derivatives and integrals of arbitrary non-integer order, has recently become popular for its capability to model memory-type systems. There have been many attempts to explore the possibilities of combining these two fields, the most popular combination being the use of a fractional derivative in the learning algorithm. This paper reviews the use of fractional calculus in various artificial neural network architectures, such as radial basis function networks, recurrent neural networks, backpropagation NNs, and convolutional neural networks. These ANNs are popularly known as fractional-order artificial neural networks (FANNs). A detailed review of the various concepts related to FANNs is presented, covering activation functions, training algorithms based on the fractional derivative, stability, synchronization, hardware implementations of FANNs, and real-world applications of FANNs. The study also highlights the advantages of combining fractional derivatives with ANNs, and the impact of the fractional-derivative order on performance indices such as mean square error, the time required for training and testing a FANN, and stability and synchronization in FANNs. The survey reports interesting observations: combining FC with an ANN endows it with a memory feature; the Caputo definition of the fractional derivative is the most commonly used in FANNs; fractional derivative-based activation functions provide additional adjustable hyperparameters to the network; a FANN has more degrees of freedom for adjusting parameters than an ordinary ANN; multiple types of activation functions can be employed in a FANN; and many more. |
| doi | 10.1007/s10462-023-10474-8 |
| format | Article |
| publisher | Dordrecht: Springer Netherlands |
| rights | The Author(s), under exclusive licence to Springer Nature B.V. 2023 |
| fulltext | fulltext |
| identifier | ISSN: 0269-2821 |
| ispartof | The Artificial intelligence review, 2023-11, Vol.56 (11), p.13897-13950 |
| issn | 0269-2821; 1573-7462 |
| language | eng |
| recordid | cdi_proquest_journals_2867415379 |
| source | Springer Nature - Complete Springer Journals |
| subjects | Algorithms; Artificial Intelligence; Artificial neural networks; Back propagation networks; Calculus; Computer architecture; Computer Science; Data mining; Deep learning; Derivatives; Fractional calculus; Machine learning; Neural networks; Performance indices; Radial basis function; Recurrent neural networks; Stability; Synchronism; Task complexity |
| title | A survey of fractional calculus applications in artificial neural networks |
| url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-15T20%3A45%3A27IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_proqu&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20survey%20of%20fractional%20calculus%20applications%20in%20artificial%20neural%20networks&rft.jtitle=The%20Artificial%20intelligence%20review&rft.au=Joshi,%20Manisha&rft.date=2023-11-01&rft.volume=56&rft.issue=11&rft.spage=13897&rft.epage=13950&rft.pages=13897-13950&rft.issn=0269-2821&rft.eissn=1573-7462&rft_id=info:doi/10.1007/s10462-023-10474-8&rft_dat=%3Cgale_proqu%3EA766307703%3C/gale_proqu%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2867415379&rft_id=info:pmid/&rft_galeid=A766307703&rfr_iscdi=true |
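Note on the Caputo derivative. The abstract observes that the Caputo definition of the fractional derivative is the one most commonly used in FANNs. For reference, this is the standard textbook definition (it is not specific to the survey): the Caputo derivative of order $\alpha$, with $n-1 < \alpha < n$ and $n \in \mathbb{N}$, is

$$
{}^{C}D_{t}^{\alpha} f(t) \;=\; \frac{1}{\Gamma(n-\alpha)} \int_{0}^{t} \frac{f^{(n)}(\tau)}{(t-\tau)^{\alpha-n+1}}\, d\tau .
$$

The integral weights the entire history of $f$, which is exactly the "memory feature" the abstract attributes to fractional-order networks; as $\alpha \to n$ the operator recovers the ordinary $n$-th derivative.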
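The survey covers many architectures and training schemes rather than one algorithm, so the following is only a minimal numerical sketch, not code from the paper: a Grünwald–Letnikov (GL) approximation of an order-alpha derivative of a sampled signal. It makes the memory property concrete, since every past sample contributes to each output value. The function name and the test signal are our own choices.

```python
import numpy as np
from math import gamma

def gl_fractional_derivative(f: np.ndarray, alpha: float, h: float) -> np.ndarray:
    """Grunwald-Letnikov approximation of the order-alpha derivative of a
    uniformly sampled signal f with step size h:

        D^alpha f(t_k) ~= h**(-alpha) * sum_{j=0}^{k} c_j * f(t_{k-j}),

    where c_0 = 1 and c_j = c_{j-1} * (1 - (alpha + 1) / j) are the signed
    generalized binomial coefficients (-1)^j * C(alpha, j).
    """
    n = len(f)
    # Recurrence for the GL coefficients.
    c = np.empty(n)
    c[0] = 1.0
    for j in range(1, n):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    # Each output sample is a weighted sum over the ENTIRE past signal:
    # this long memory is what fractional operators bring to an ANN.
    out = np.empty(n)
    for k in range(n):
        out[k] = np.dot(c[: k + 1], f[k::-1]) / h**alpha
    return out

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 201)
    h = t[1] - t[0]
    alpha = 0.5
    approx = gl_fractional_derivative(t**2, alpha, h)
    # Known closed form: D^alpha t^2 = Gamma(3)/Gamma(3-alpha) * t^(2-alpha).
    exact = gamma(3) / gamma(3 - alpha) * t ** (2 - alpha)
    # Error shrinks roughly linearly with h (first-order scheme).
    print(np.max(np.abs(approx - exact)))
```

In FANN training such an operator typically appears in place of, or alongside, the integer-order derivative in the update rule, and the order alpha then acts as one of the extra adjustable hyperparameters the abstract mentions.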