Predictive Adaptive Streaming to Enable Mobile 360-Degree and VR Experiences


Bibliographic Details
Published in: IEEE Transactions on Multimedia, 2021, Vol. 23, pp. 716-731
Authors: Hou, Xueshi; Dey, Sujit; Zhang, Jianzhong; Budagavi, Madhukar
Format: Article
Language: English
Online access: Order full text
Abstract: As 360-degree videos and virtual reality (VR) applications become popular for consumer and enterprise use cases, the desire to enable truly mobile experiences also increases. Delivering 360-degree videos and cloud/edge-based VR applications requires ultra-high bandwidth and ultra-low latency [1], which is challenging to achieve with mobile networks. A common approach to reducing bandwidth is streaming only the field of view (FOV). However, extracting and transmitting the FOV in response to user head motion can add high latency, adversely affecting user experience. In this paper, we propose a predictive adaptive streaming approach, in which the predicted view with high predictive probability is adaptively encoded in relatively high quality according to bandwidth conditions and transmitted in advance, leading to a simultaneous reduction in bandwidth and latency. The predictive adaptive streaming method is based on a deep-learning-based viewpoint prediction model we develop, which uses past head motions to predict where a user will be looking in the 360-degree view. Using a very large dataset consisting of head motion traces from over 36,000 viewers for nineteen 360-degree/VR videos, we validate the ability of our predictive adaptive streaming method to offer a high-quality view while simultaneously significantly reducing bandwidth.
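The core idea in the abstract — encode the predicted view at higher quality under a bandwidth budget — can be illustrated with a minimal sketch. This is not the authors' method: the tile names, rate ladder, and greedy allocation below are hypothetical assumptions introduced only to make the idea concrete.

```python
# Hypothetical sketch of predictive adaptive streaming: tiles with higher
# predicted viewing probability get a higher encoding rate, subject to a
# total bandwidth budget. All names and numbers here are illustrative.

def allocate_quality(tile_probs, rates_mbps, budget_mbps):
    """Greedily assign each tile the highest rate that fits, visiting
    tiles in descending order of predicted view probability, while
    reserving the minimum rate for every tile not yet allocated."""
    allocation = {}
    remaining = budget_mbps
    min_rate = min(rates_mbps)
    for tile, prob in sorted(tile_probs.items(), key=lambda kv: -kv[1]):
        # Budget that must stay reserved for the remaining tiles.
        reserved = (len(tile_probs) - len(allocation) - 1) * min_rate
        for rate in sorted(rates_mbps, reverse=True):
            if rate <= remaining - reserved:
                allocation[tile] = rate
                remaining -= rate
                break
        else:
            # Nothing fits above the floor: fall back to the lowest rate
            # so every tile is still transmitted.
            allocation[tile] = min_rate
            remaining -= min_rate
    return allocation

# Example: the "front" tile is predicted most likely to be viewed,
# so it receives the highest rate within a 9 Mbps budget.
probs = {"front": 0.7, "left": 0.15, "right": 0.1, "back": 0.05}
print(allocate_quality(probs, rates_mbps=[1, 2, 4], budget_mbps=9))
```

Under these assumptions the most probable tile is streamed at 4 Mbps while the least probable one drops to 1 Mbps, mirroring the paper's high-level claim that prediction lets quality and bandwidth be traded per region rather than uniformly.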
DOI: 10.1109/TMM.2020.2987693
ISSN: 1520-9210
EISSN: 1941-0077
Source: IEEE Electronic Library (IEL)
Subjects:
360-degree video
Bandwidth
Bandwidths
Digital media
Field of view
Head
Head movement
Machine learning
Network latency
Prediction models
Predictions
Predictive models
Rendering (computer graphics)
Streaming media
Video
video streaming
Videos
Virtual reality