Regression Models for Symbolic Interval-Valued Variables
This paper presents new approaches to fit regression models for symbolic interval-valued variables, which are shown to improve and extend the center method suggested by Billard and Diday and the center and range method proposed by Lima-Neto, E.A. and De Carvalho, F.A.T. Like the previously mentioned...
Saved in:
Published in: | Entropy (Basel, Switzerland), 2021-04, Vol.23 (4), p.429 |
---|---|
Main Authors: | Chacón, Jose Emmanuel, Rodríguez, Oldemar |
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Full text |
container_end_page | |
---|---|
container_issue | 4 |
container_start_page | 429 |
container_title | Entropy (Basel, Switzerland) |
container_volume | 23 |
creator | Chacón, Jose Emmanuel Rodríguez, Oldemar |
description | This paper presents new approaches to fit regression models for symbolic interval-valued variables, which are shown to improve and extend the center method suggested by Billard and Diday and the center and range method proposed by Lima-Neto, E.A. and De Carvalho, F.A.T. Like the previously mentioned methods, the proposed regression models consider the midpoints and half-lengths of the intervals as additional variables. We considered various methods to fit the regression models, including tree-based models, K-nearest neighbors, support vector machines, and neural networks. The approaches proposed in this paper were applied to a real dataset and to synthetic datasets generated with linear and nonlinear relations. For an evaluation of the methods, the root-mean-squared error and the correlation coefficient were used. The methods presented herein are available in the RSDA package written in the R language, which can be installed from CRAN. |
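The center-and-range representation described above can be sketched in a few lines. This is a minimal, assumed illustration of the general idea (each interval is encoded by its midpoint and half-length, and a separate regression model is fit to each component), not the RSDA implementation; the function names, the synthetic data, and the linear fit are hypothetical stand-ins for the paper's richer set of models.

```python
# Sketch of the center-and-range idea for interval-valued regression:
# an interval [a, b] is represented by its midpoint (a + b) / 2 and
# half-length (b - a) / 2, and one linear model is fit per component.
import numpy as np

def to_center_range(lo, hi):
    """Convert interval bounds to (midpoint, half-length) variables."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    return (lo + hi) / 2.0, (hi - lo) / 2.0

def fit_linear(x, y):
    """Ordinary least squares with an intercept column."""
    A = np.column_stack([np.ones(len(x)), x])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_linear(coef, x):
    A = np.column_stack([np.ones(len(x)), x])
    return A @ coef

# Synthetic interval data with a linear relation, in the spirit of the
# paper's simulated datasets (values here are illustrative only).
rng = np.random.default_rng(0)
x_lo = rng.uniform(0, 10, 200)
x_hi = x_lo + rng.uniform(0.5, 2.0, 200)
y_lo = 3 * x_lo + 1 + rng.normal(0, 0.1, 200)
y_hi = 3 * x_hi + 1 + rng.normal(0, 0.1, 200)

xc, xr = to_center_range(x_lo, x_hi)
yc, yr = to_center_range(y_lo, y_hi)

# One model for midpoints, one for half-lengths.
c_coef = fit_linear(xc, yc)
r_coef = fit_linear(xr, yr)

# Reconstruct predicted interval bounds and evaluate with RMSE,
# one of the two criteria the paper uses.
pred_c = predict_linear(c_coef, xc)
pred_r = predict_linear(r_coef, xr)
pred_lo, pred_hi = pred_c - pred_r, pred_c + pred_r
rmse = np.sqrt(np.mean((pred_lo - y_lo) ** 2 + (pred_hi - y_hi) ** 2))
print(round(rmse, 3))
```

In the paper's extensions, the per-component linear fits above are replaced by tree-based models, K-nearest neighbors, support vector machines, or neural networks, while the midpoint/half-length encoding stays the same.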
doi_str_mv | 10.3390/e23040429 |
format | Article |
publisher | MDPI AG, Basel, Switzerland |
publication_date | 2021-04-06 |
pmid | 33917312 |
rights | © 2021 by the authors. Licensee MDPI, Basel, Switzerland. Open access under the Creative Commons Attribution (CC BY 4.0) license. |
fulltext | fulltext |
identifier | ISSN: 1099-4300 |
ispartof | Entropy (Basel, Switzerland), 2021-04, Vol.23 (4), p.429 |
issn | 1099-4300 1099-4300 |
language | eng |
recordid | cdi_doaj_primary_oai_doaj_org_article_38c079a265c54b7e8db1e134a199fb76 |
source | MDPI - Multidisciplinary Digital Publishing Institute; DOAJ Directory of Open Access Journals; PubMed Central; EZB Electronic Journals Library; PubMed Central Open Access |
subjects | Algorithms boosting Correlation coefficients Data analysis Data mining Datasets decision trees K-nearest neighbors Methods Neural networks random forest regression Regression analysis Regression models Support vector machines Variables |
title | Regression Models for Symbolic Interval-Valued Variables |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-01T09%3A27%3A35IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_doaj_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Regression%20Models%20for%20Symbolic%20Interval-Valued%20Variables&rft.jtitle=Entropy%20(Basel,%20Switzerland)&rft.au=Chac%C3%B3n,%20Jose%20Emmanuel&rft.date=2021-04-06&rft.volume=23&rft.issue=4&rft.spage=429&rft.pages=429-&rft.issn=1099-4300&rft.eissn=1099-4300&rft_id=info:doi/10.3390/e23040429&rft_dat=%3Cproquest_doaj_%3E2531398564%3C/proquest_doaj_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2531398564&rft_id=info:pmid/33917312&rft_doaj_id=oai_doaj_org_article_38c079a265c54b7e8db1e134a199fb76&rfr_iscdi=true |