Integrating Pre-Trained Language Model With Physical Layer Communications

Bibliographic details
Published in: IEEE Transactions on Wireless Communications, 2024-11, Vol. 23 (11), p. 17266-17278
Authors: Lee, Ju-Hyung; Lee, Dong-Ho; Lee, Joohan; Pujara, Jay
Format: Article
Language: English
Online access: Order full text
Description: The burgeoning field of on-device AI communication, where devices exchange information directly through embedded foundation models, such as language models (LMs), requires robust, efficient, and generalizable communication frameworks. However, integrating these frameworks with existing wireless systems and effectively managing noise and bit errors pose significant challenges. In this work, we introduce a practical on-device AI communication framework, integrated with physical layer (PHY) communication functions, demonstrated through its performance on a link-level simulator. Our framework incorporates end-to-end training with channel noise to enhance resilience, incorporates vector quantized variational autoencoders (VQ-VAE) for efficient and robust communication, and utilizes pre-trained encoder-decoder transformers for improved generalization capabilities. Simulations, across various communication scenarios, reveal that our framework achieves a 50% reduction in transmission size while demonstrating substantial generalization ability and noise robustness under standardized 3GPP channel models.
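The abstract describes quantizing encoder outputs against a VQ-VAE codebook and sending the resulting discrete indices over a noisy channel. The following is a minimal, hypothetical sketch of that pipeline, not the authors' implementation: the codebook size, the 4-bit index mapping, and the binary symmetric channel are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative codebook: 16 codewords of dimension 4 (sizes are assumptions).
codebook = rng.normal(size=(16, 4))

def quantize(vectors):
    """Nearest-neighbour lookup: map each vector to its closest codeword index."""
    dists = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=-1)
    return dists.argmin(axis=1)

def channel(indices, flip_prob=0.05, bits=4):
    """Toy binary symmetric channel: flip each bit of the index with flip_prob."""
    noisy = indices.copy()
    for b in range(bits):
        flips = rng.random(indices.shape) < flip_prob
        noisy = np.where(flips, noisy ^ (1 << b), noisy)
    return noisy

x = rng.normal(size=(8, 4))   # stand-in for continuous encoder outputs
idx = quantize(x)             # transmitter: continuous vectors -> discrete indices
rx = channel(idx)             # channel: some index bits are corrupted in transit
x_hat = codebook[rx]          # receiver: reconstruct vectors from (noisy) indices
```

End-to-end training with channel noise, as the paper proposes, would place a differentiable version of this corruption step between encoder and decoder so the model learns codewords that degrade gracefully under bit errors.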
DOI: 10.1109/TWC.2024.3452481
ISSN: 1536-1276
EISSN: 1558-2248
Source: IEEE/IET Electronic Library
Subjects:
Artificial intelligence
Channel noise
Data models
Decoding
Embedded foundations
Encoders-Decoders
language model
link-level simulation
natural language processing (NLP)
Noise
Physical layer communications
Robustness
Semantics
Vectors
VQ-VAE
Wireless communication