Static Inference Meets Deep Learning: A Hybrid Type Inference Approach for Python
Type inference for dynamic programming languages such as Python is an important yet challenging task. Static type inference techniques can precisely infer variables with enough static constraints but are unable to handle variables with dynamic features. Deep learning (DL) based approaches are feature-agnostic, but they cannot guarantee the correctness of the predicted types. Their performance significantly depends on the quality of the training data (i.e., DL models perform poorly on some common types that rarely appear in the training dataset). It is interesting to note that the static and DL-based approaches offer complementary benefits.
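The "dynamic features" the abstract refers to can be made concrete with a small hypothetical example (the `parse` function below is illustrative, not taken from the paper): the return type depends on runtime data, so no single static constraint pins it down, which is exactly where a neural prediction can step in.

```python
def parse(token: str):
    # Hypothetical example: the return type depends on runtime data,
    # so purely static constraint solving cannot assign one type.
    if token.isdigit():
        return int(token)    # int on this path
    return token.upper()     # str on this path

assert type(parse("42")) is int
assert type(parse("abc")) is str
```

A static checker can at best infer `Union[int, str]` here; a learned model might guess the more likely type, but, as the abstract notes, that guess needs validation.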
Saved in:
Published in: | arXiv.org 2022-02 |
---|---|
Main Authors: | Peng, Yun; Gao, Cuiyun; Li, Zongjie; Gao, Bowei; Lo, David; Zhang, Qirun; Lyu, Michael |
Format: | Article |
Language: | eng |
Subjects: | Algorithms; Annotations; Artificial neural networks; Computer Science - Programming Languages; Computer Science - Software Engineering; Deep learning; Dynamic programming; Inference; Machine learning; Natural language (computers); Programming languages |
Online Access: | Full text |
container_title | arXiv.org |
creator | Peng, Yun; Gao, Cuiyun; Li, Zongjie; Gao, Bowei; Lo, David; Zhang, Qirun; Lyu, Michael |
description | Type inference for dynamic programming languages such as Python is an important yet challenging task. Static type inference techniques can precisely infer variables with enough static constraints but are unable to handle variables with dynamic features. Deep learning (DL) based approaches are feature-agnostic, but they cannot guarantee the correctness of the predicted types. Their performance significantly depends on the quality of the training data (i.e., DL models perform poorly on some common types that rarely appear in the training dataset). It is interesting to note that the static and DL-based approaches offer complementary benefits. Unfortunately, to our knowledge, precise type inference based on both static inference and neural predictions has not been exploited and remains an open challenge. In particular, it is hard to integrate DL models into the framework of rule-based static approaches. This paper fills the gap and proposes a hybrid type inference approach named HiTyper based on both static inference and deep learning. Specifically, our key insight is to record type dependencies among variables in each function and encode the dependency information in type dependency graphs (TDGs). Based on TDGs, we can easily integrate type inference rules in the nodes to conduct static inference and type rejection rules to inspect the correctness of neural predictions. HiTyper iteratively conducts static inference and DL-based prediction until the TDG is fully inferred. Experiments on two benchmark datasets show that HiTyper outperforms state-of-the-art DL models by exactly matching 10% more human annotations. HiTyper also achieves an increase of more than 30% on inferring rare types. Considering only the static part of HiTyper, it infers 2x ~ 3x more types than existing static type inference tools. |
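The iterative scheme the description sketches, static propagation over a type dependency graph (TDG) with type-rejection-checked neural guesses filling the gaps, can be illustrated roughly as follows. Everything here (`TDG`, `hybrid_infer`, the toy rules and predictions) is an assumption for illustration, not HiTyper's actual API:

```python
class TDG:
    """Hypothetical type dependency graph: nodes are variables, edges say
    which variables a node's type is computed from (not HiTyper's real code)."""
    def __init__(self):
        self.types = {}   # variable -> inferred type name
        self.deps = {}    # variable -> variables it depends on
        self.rules = {}   # variable -> function combining dependency types

    def add(self, var, deps, rule):
        self.deps[var] = deps
        self.rules[var] = rule

    def static_pass(self):
        # Propagate types along dependency edges until a fixed point.
        changed = True
        while changed:
            changed = False
            for var, deps in self.deps.items():
                if var not in self.types and all(d in self.types for d in deps):
                    self.types[var] = self.rules[var](*(self.types[d] for d in deps))
                    changed = True

def hybrid_infer(tdg, neural_predict, accept):
    """Alternate static propagation with validated neural guesses."""
    tdg.static_pass()
    for var in list(tdg.deps):
        if var not in tdg.types:
            for guess in neural_predict(var):   # ranked candidates
                if accept(var, guess):          # type rejection rule
                    tdg.types[var] = guess
                    break
            tdg.static_pass()                   # new facts may unblock others
    return tdg.types

# Toy function: a is annotated, b copies a, d comes from an un-analyzable
# dynamic call, and c combines b and d.
g = TDG()
g.types["a"] = "int"
g.add("d", ["<dynamic>"], lambda *_: None)       # never statically solvable
g.add("b", ["a"], lambda t: t)
g.add("c", ["b", "d"], lambda t1, t2: t1 if t1 == t2 else f"Union[{t1}, {t2}]")

preds = {"d": ["str", "int"]}                    # pretend neural ranking
types = hybrid_infer(g, lambda v: preds.get(v, []),
                     lambda v, t: t != "str")    # demo rejection rule
print(types)  # b and c fall out statically once d's guess is accepted
```

The sketch captures the division of labor the abstract claims: the static pass handles everything constrained enough to solve exactly, and the model is only consulted for nodes it leaves blank, with its guesses filtered before they are trusted.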
doi_str_mv | 10.48550/arxiv.2105.03595 |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2022-02 |
issn | 2331-8422 |
language | eng |
recordid | cdi_arxiv_primary_2105_03595 |
source | arXiv.org; Free E-Journals |
subjects | Algorithms; Annotations; Artificial neural networks; Computer Science - Programming Languages; Computer Science - Software Engineering; Deep learning; Drift; Dynamic programming; Inference; Machine learning; Natural language (computers); Programming languages |
title | Static Inference Meets Deep Learning: A Hybrid Type Inference Approach for Python |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-11T06%3A23%3A40IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_arxiv&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Static%20Inference%20Meets%20Deep%20Learning:%20A%20Hybrid%20Type%20Inference%20Approach%20for%20Python&rft.jtitle=arXiv.org&rft.au=Peng,%20Yun&rft.date=2022-02-09&rft.eissn=2331-8422&rft_id=info:doi/10.48550/arxiv.2105.03595&rft_dat=%3Cproquest_arxiv%3E2525282815%3C/proquest_arxiv%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2525282815&rft_id=info:pmid/&rfr_iscdi=true |