Audio for extended realities: A case study informed exposition

An area of immersive storytelling in rapid evolution is that of extended reality. This emergent mode of experience employs spatial mapping and both plane and object detection to superimpose computer-generated images in the volumetric context of a physical space via a head-mounted display. This in turn produces a unique set of challenges and opportunities for the associated audio implementation and aesthetics. Creative development of this audio is often a function of evolving toolsets, and the associated workflow is far from standardized. This paper forms a context for such audio workflow, one that draws from precursor technologies such as audio for games and virtual reality, and develops this into an outline taxonomy that is both representative of the state of the art and forward-facing towards the evolution of the technology stack. The context is framed through a series of case studies. Between 2019 and 2021, the BBC and Oculus TV commissioned Alchemy Immersive and Atlantic Productions to produce virtual reality and mixed reality experiences of several classic documentary series by Sir David Attenborough: Museum Alive, Micro Monsters, First Life VR, Museum Alive AR and Kingdom of Plants. This portfolio received numerous award nominations and prizes, including from the Raindance Festival and a double Emmy. The sound design, audio postproduction and spatial audio for these experiences were implemented by the company 1.618 Digital, and, drawing from first-hand creator involvement, the workflows are deconstructed and explored with reference to tools, technologies, techniques and perception. Such an exposition forms the basis for an analysis of both this and broader creative practice in the field of audio for extended reality, and this is subsequently used to present a speculative vision of audio in the future of immersive storytelling.
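
To ground the spatial-audio terminology used in the abstract, the sketch below shows how a single virtual sound source might be rendered relative to a head-tracked listener, using simple inverse-distance attenuation and a horizontal azimuth. This is an illustrative example only, not drawn from the paper or from 1.618 Digital's workflow; all function names and values are hypothetical.

# Minimal illustrative sketch (hypothetical, not from the paper): gain and azimuth
# for one virtual point source relative to a head-mounted listener.
import math

def spatialise(source_pos, listener_pos, listener_yaw_deg):
    """Return (gain, azimuth_deg) using simple inverse-distance attenuation."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    dz = source_pos[2] - listener_pos[2]
    distance = max(math.sqrt(dx * dx + dy * dy + dz * dz), 0.1)  # clamp to avoid blow-up
    gain = min(1.0, 1.0 / distance)  # 1/r roll-off, capped at unity
    # Horizontal angle of the source relative to the listener's facing direction (+z).
    azimuth = math.degrees(math.atan2(dx, dz)) - listener_yaw_deg
    azimuth = (azimuth + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)
    return gain, azimuth

# Example: a source 2 m ahead and 1 m to the right of a listener at the origin facing +z.
print(spatialise((1.0, 0.0, 2.0), (0.0, 0.0, 0.0), 0.0))

In a real XR production, placement of this kind would typically be handled by a game-engine or middleware spatialiser with HRTF rendering, but the same listener-relative geometry underlies those tools.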

Bibliographic Details
Published in: Convergence (London, England), 2023-06
Main authors: Paterson, Justin; Kadel, Oliver
Format: Article
Language: English
Online access: Full text
DOI: 10.1177/13548565231169723
Published: 2023-06-05
Publisher: SAGE Publications, London, England
Rights: © The Author(s) 2023 (open access)
Full text: https://journals.sagepub.com/doi/10.1177/13548565231169723
ORCID: https://orcid.org/0000-0001-7822-319X
ISSN: 1354-8565
EISSN: 1748-7382
Source: SAGE Complete A-Z List