SymbolicGPT: A Generative Transformer Model for Symbolic Regression
Symbolic regression is the task of identifying a mathematical expression that best fits a provided dataset of input and output values. Due to the richness of the space of mathematical expressions, symbolic regression is generally a challenging problem. While conventional approaches based on genetic evolution algorithms have been used for decades, deep learning-based methods are relatively new and an active research area. In this work, we present SymbolicGPT, a novel transformer-based language model for symbolic regression. This model exploits the advantages of probabilistic language models like GPT, including strength in performance and flexibility. Through comprehensive experiments, we show that our model performs strongly compared to competing models with respect to accuracy, running time, and data efficiency.
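
To make the task in the abstract concrete, here is a minimal, self-contained sketch of symbolic regression as a search over candidate expression skeletons whose constants are fitted to the data. It illustrates the problem setting only, not SymbolicGPT's implementation: the synthetic dataset, the hand-written candidate list, and every name below are invented for this example, whereas in the paper a GPT-style language model would generate the candidate expressions.

```python
# Illustrative sketch of the symbolic regression task: given (x, y) samples,
# try a small set of candidate expression skeletons, fit each skeleton's free
# constants numerically, and keep the best-fitting one.
# NOTE: the data and candidates are made up; this is not SymbolicGPT itself.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Synthetic dataset; the "true" expression is unknown to the search below.
x = np.linspace(-3, 3, 200)
y = 2.0 * np.sin(x) + 0.5 * x + rng.normal(scale=0.01, size=x.size)

# Hypothetical candidate skeletons with free constants c0, c1, ...
# (in SymbolicGPT, candidates would come from a trained transformer instead).
candidates = {
    "c0*x + c1": lambda x, c0, c1: c0 * x + c1,
    "c0*sin(c1*x) + c2*x": lambda x, c0, c1, c2: c0 * np.sin(c1 * x) + c2 * x,
    "c0*exp(c1*x)": lambda x, c0, c1: c0 * np.exp(c1 * x),
}

best_name, best_err = None, np.inf
for name, f in candidates.items():
    n_constants = f.__code__.co_argcount - 1  # number of free constants
    try:
        # Fit the skeleton's constants, then score it by mean squared error.
        constants, _ = curve_fit(f, x, y, p0=np.ones(n_constants), maxfev=10_000)
        err = float(np.mean((f(x, *constants) - y) ** 2))
    except RuntimeError:
        continue  # this skeleton's fit did not converge
    if err < best_err:
        best_name, best_err = name, err

print(f"best-fitting skeleton: {best_name} (MSE = {best_err:.5f})")
```

The split into "choose a skeleton, then fit its constants numerically" is just a generic way to score a symbolic candidate against data; the paper's contribution lies in how candidate expressions are generated from the data by a transformer, which this sketch does not attempt to reproduce.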

Saved in:

Main authors: Valipour, Mojtaba; You, Bowen; Panju, Maysum; Ghodsi, Ali
Format: Article
Language: eng
Subjects: Computer Science - Computation and Language; Computer Science - Learning; Computer Science - Symbolic Computation
Online access: Order full text

| Field | Value |
|---|---|
| creator | Valipour, Mojtaba; You, Bowen; Panju, Maysum; Ghodsi, Ali |
| doi_str_mv | 10.48550/arxiv.2106.14131 |
| format | Article |
| creationdate | 2021-06-26 |
| rights | http://creativecommons.org/licenses/by-sa/4.0 |
| oa | free_for_read |
| backlink | https://arxiv.org/abs/2106.14131 |
| fulltext | fulltext_linktorsrc |
| identifier | DOI: 10.48550/arxiv.2106.14131 |
| language | eng |
| recordid | cdi_arxiv_primary_2106_14131 |
| source | arXiv.org |
| subjects | Computer Science - Computation and Language; Computer Science - Learning; Computer Science - Symbolic Computation |
| title | SymbolicGPT: A Generative Transformer Model for Symbolic Regression |
| url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-14T22%3A51%3A04IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=SymbolicGPT:%20A%20Generative%20Transformer%20Model%20for%20Symbolic%20Regression&rft.au=Valipour,%20Mojtaba&rft.date=2021-06-26&rft_id=info:doi/10.48550/arxiv.2106.14131&rft_dat=%3Carxiv_GOX%3E2106_14131%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |