Trustworthy Misinformation Mitigation with Soft Information Nudging
Research in combating misinformation reports many negative results: facts may not change minds, especially if they come from sources that are not trusted. Individuals can disregard and justify lies told by trusted sources. This problem is made even worse by social recommendation algorithms, which help amplify conspiracy theories and information that confirms one's own biases.
Saved in:
Main authors: | Horne, Benjamin D ; Gruppi, Maurício ; Adalı, Sibel |
---|---|
Format: | Article |
Language: | English |
Published: | 2019-11-13 |
Subjects: | Computer Science - Computers and Society |
Online access: | Order full text |
creator | Horne, Benjamin D ; Gruppi, Maurício ; Adalı, Sibel |
description | Research in combating misinformation reports many negative results: facts may
not change minds, especially if they come from sources that are not trusted.
Individuals can disregard and justify lies told by trusted sources. This
problem is made even worse by social recommendation algorithms which help
amplify conspiracy theories and information confirming one's own biases due to
companies' efforts to optimize for clicks and watch time over individuals' own
values and public good. As a result, more nuanced voices and facts are drowned
out by a continuous erosion of trust in better information sources. Most
misinformation mitigation techniques assume that discrediting, filtering, or
demoting low veracity information will help news consumers make better
information decisions. However, these negative results indicate that some news
consumers, particularly extreme or conspiracy news consumers, will not be
helped.
We argue that, given this background, technology solutions to combating
misinformation should not simply seek facts or discredit bad news sources, but
instead use more subtle nudges towards better information consumption. Repeated
exposure to such nudges can help promote trust in better information sources
and also improve societal outcomes in the long run. In this article, we discuss
technological solutions that can help develop such an approach and introduce
one such model, called Trust Nudging. |
doi_str_mv | 10.48550/arxiv.1911.05825 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | DOI: 10.48550/arxiv.1911.05825 |
language | eng |
recordid | cdi_arxiv_primary_1911_05825 |
source | arXiv.org |
subjects | Computer Science - Computers and Society |
title | Trustworthy Misinformation Mitigation with Soft Information Nudging |