Hurst entropy: A method to determine predictability in a binary series based on a fractal-related process
Shannon's concept of information is related to predictability. In a binary series, the value of information relies on the frequency of 0's and 1's, or how each is expected to occur. However, information entropy does not account for the bias in randomness introduced by autocorrelation. In fact, a binary temporal series can carry both short- and long-term memory in the sequential distribution of 0's and 1's. Although the Hurst exponent measures the range of autocorrelation, a mathematical connection between information entropy and the autocorrelation present in the series has been lacking. To fill this gap, we combined numerical simulations with an analytical approach to determine how information entropy changes with the frequency of 0's and 1's and with the Hurst exponent, and thereby how predictability depends on both parameters. Our findings are useful in the many fields where binary time series arise, from neuroscience to econophysics.
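The two quantities the abstract combines can be illustrated with a minimal, self-contained Python sketch (an illustration of the standard definitions, not the authors' method; all function names here are my own): Shannon entropy computed from the frequency of 1's, and a Hurst exponent estimated by classical rescaled-range (R/S) analysis.

```python
import math
import random

def shannon_entropy(bits):
    """Shannon entropy (in bits) of a binary sequence, from the frequency of 1's."""
    p = sum(bits) / len(bits)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def hurst_rs(series, min_chunk=8):
    """Classical rescaled-range (R/S) estimate of the Hurst exponent.

    Splits the series into chunks of growing size, computes the mean
    rescaled range R/S per size, and fits log(R/S) ~ H * log(size).
    """
    n = len(series)
    sizes, rs_means = [], []
    size = min_chunk
    while size <= n // 2:
        rs_vals = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            mean = sum(chunk) / size
            # cumulative deviations from the chunk mean
            dev, cum = 0.0, []
            for x in chunk:
                dev += x - mean
                cum.append(dev)
            r = max(cum) - min(cum)                                  # range
            s = math.sqrt(sum((x - mean) ** 2 for x in chunk) / size)  # std dev
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            sizes.append(size)
            rs_means.append(sum(rs_vals) / len(rs_vals))
        size *= 2
    # least-squares slope of log(R/S) against log(size) estimates H
    xs = [math.log(s) for s in sizes]
    ys = [math.log(r) for r in rs_means]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

random.seed(1)
bits = [random.random() < 0.5 for _ in range(4096)]
H_info = shannon_entropy(bits)                            # near 1 bit for p close to 0.5
H_hurst = hurst_rs([1.0 if b else -1.0 for b in bits])    # roughly 0.5 for an uncorrelated series
```

For an uncorrelated series with p close to 0.5 the entropy approaches 1 bit and the R/S slope sits near H = 0.5 (with a known upward bias at finite sample sizes); a persistent series with long-range memory would give H > 0.5 at the same symbol frequencies, which is exactly the extra structure the paper argues plain information entropy misses.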
Published in: | Physical review. E 2019-06, Vol.99 (6-1), p.062115-062115, Article 062115 |
---|---|
Main authors: | Ferraz, Mariana Sacrini Ayres; Kihara, Alexandre Hiroaki |
Format: | Article |
Language: | eng |
Online access: | Full text |
container_end_page | 062115 |
---|---|
container_issue | 6-1 |
container_start_page | 062115 |
container_title | Physical review. E |
container_volume | 99 |
creator | Ferraz, Mariana Sacrini Ayres ; Kihara, Alexandre Hiroaki |
description | Shannon's concept of information is related to predictability. In a binary series, the value of information relies on the frequency of 0's and 1's, or how each is expected to occur. However, information entropy does not account for the bias in randomness introduced by autocorrelation. In fact, a binary temporal series can carry both short- and long-term memory in the sequential distribution of 0's and 1's. Although the Hurst exponent measures the range of autocorrelation, a mathematical connection between information entropy and the autocorrelation present in the series has been lacking. To fill this gap, we combined numerical simulations with an analytical approach to determine how information entropy changes with the frequency of 0's and 1's and with the Hurst exponent, and thereby how predictability depends on both parameters. Our findings are useful in the many fields where binary time series arise, from neuroscience to econophysics. |
doi_str_mv | 10.1103/PhysRevE.99.062115 |
format | Article |
pmid | 31330637 |
journal_abbrev | Phys Rev E |
publisher | United States |
fulltext | fulltext |
identifier | ISSN: 2470-0045 |
ispartof | Physical review. E, 2019-06, Vol.99 (6-1), p.062115-062115, Article 062115 |
issn | 2470-0045 2470-0053 |
language | eng |
recordid | cdi_proquest_miscellaneous_2264226036 |
source | American Physical Society Journals |
title | Hurst entropy: A method to determine predictability in a binary series based on a fractal-related process |