Fiduciary requirements for virtual assistants

Virtual assistants (VAs), like Amazon’s Alexa, Google’s Assistant, and Apple’s Siri, are on the rise. However, despite allegedly being ‘assistants’ to users, they ultimately help firms to maximise profits. With more and more tasks and leeway bestowed upon VAs, the severity as well as the extent of conflicts of interest between firms and users increase. This article builds on the common law field of fiduciary law to argue why and how regulators should address this phenomenon. First, the functions of VAs resemble established fiduciaries, namely mandataries when they perform tasks on behalf of users, and increasingly advisors whenever they provide recommendations or suggestions. Second, users grant firms deploying VAs ever more discretion over their economic, and more and more significant non-economic interests, such as their health or finances. This delegation of power renders users vulnerable to abuse of power and inadequate performance by firms deploying VAs. Moreover, neither specification or monitoring nor market forces are alternatives that can sufficiently protect users. Thus, regulation is needed, departing from the recognition of the relationship between firms deploying VAs and users as a fiduciary relationship. In the EU, this could be realised through fiduciary requirements for VAs. First and foremost, to adequately protect users from abuse of power by firms deploying VAs, the core fiduciary duty of loyalty should be converted into corresponding fiduciary requirements for VAs, obliging firms to align VAs with their users.

Detailed description

Bibliographic details
Published in: Ethics and information technology, 2024-06, Vol. 26 (2), p. 21, Article 21
Author: Koessler, Leonie
Format: Article
Language: English
Online access: Full text
DOI: 10.1007/s10676-023-09741-7
Publisher: Springer Netherlands (Dordrecht)
ISSN: 1388-1957
EISSN: 1572-8439
Source: SpringerLink Journals
Subjects: Artificial intelligence; Common law; Computer Science; Conflicts of interest; Economics; Ethics; Fiduciaries; Innovation/Technology Management; Legislation; Library Science; Management of Computing and Information Systems; Original Paper; Scandals; User Interfaces and Human Computer Interaction
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-24T00%3A30%3A44IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Fiduciary%20requirements%20for%20virtual%20assistants&rft.jtitle=Ethics%20and%20information%20technology&rft.au=Koessler,%20Leonie&rft.date=2024-06-01&rft.volume=26&rft.issue=2&rft.spage=21&rft.pages=21-&rft.artnum=21&rft.issn=1388-1957&rft.eissn=1572-8439&rft_id=info:doi/10.1007/s10676-023-09741-7&rft_dat=%3Cproquest_cross%3E2986752227%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2986752227&rft_id=info:pmid/&rfr_iscdi=true