Examining the replicability of online experiments selected by a decision market
Here we test the feasibility of using decision markets to select studies for replication and provide evidence about the replicability of online experiments. Social scientists (n = 162) traded on the outcome of close replications of 41 systematically selected MTurk social science experiments published in PNAS 2015-2018, knowing that the 12 studies with the lowest and the 12 with the highest final market prices would be selected for replication, along with 2 randomly selected studies. The replication rate, based on the statistical significance indicator, was 83% for the top-12 and 33% for the bottom-12 group. Overall, 54% of the studies were successfully replicated, with replication effect size estimates averaging 45% of the original effect size estimates. The replication rate varied between 54% and 62% for alternative replication indicators. The observed replicability of MTurk experiments is comparable to that of previous systematic replication projects involving laboratory experiments.
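The selection rule described in the abstract is mechanical enough to sketch in code. Below is a minimal Python illustration using hypothetical study identifiers and made-up final prices; the function names, the random seed, and the choice to draw the 2 random studies from the 17 studies not already selected are assumptions for illustration, not details taken from the paper. The significance-based replication indicator is likewise a common-practice sketch (significant effect in the original direction), not the paper's exact specification.

```python
import random

def select_for_replication(final_prices, n_extreme=12, n_random=2, seed=0):
    """Pick studies to replicate from final decision-market prices.

    final_prices maps study id -> final market price in [0, 1], read as
    the market's belief that the study would replicate. Returns the
    n_extreme lowest-priced ids, the n_extreme highest-priced ids, and
    n_random ids drawn from the remainder (the sampling frame for the
    random draw is an assumption here).
    """
    ranked = sorted(final_prices, key=final_prices.get)  # ascending by price
    bottom, top = ranked[:n_extreme], ranked[-n_extreme:]
    middle = ranked[n_extreme:-n_extreme]
    randomly_drawn = random.Random(seed).sample(middle, n_random)
    return bottom, top, randomly_drawn

def replicated(p_value, same_direction, alpha=0.05):
    """Statistical-significance indicator: the replication effect is
    significant at alpha and in the same direction as the original."""
    return same_direction and p_value < alpha

# Demo with 41 made-up prices (placeholders, not the real market data).
rng = random.Random(1)
prices = {f"study_{i:02d}": round(rng.random(), 3) for i in range(41)}
bottom, top, extra = select_for_replication(prices)
print(len(bottom), len(top), len(extra))  # -> 12 12 2
```

Under this design, 26 of the 41 studies are replicated (12 + 12 + 2), which matches the group sizes reported in the abstract.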
Saved in:
Published in: | Nature human behaviour, 2024-11 |
---|---|
Main authors: | Holzmeister, Felix; Johannesson, Magnus; Camerer, Colin F; Chen, Yiling; Ho, Teck-Hua; Hoogeveen, Suzanne; Huber, Juergen; Imai, Noriko; Imai, Taisuke; Jin, Lawrence; Kirchler, Michael; Ly, Alexander; Mandl, Benjamin; Manfredi, Dylan; Nave, Gideon; Nosek, Brian A; Pfeiffer, Thomas; Sarafoglou, Alexandra; Schwaiger, Rene; Wagenmakers, Eric-Jan; Waldén, Viking; Dreber, Anna |
Format: | Article |
Language: | eng |
Online access: | Full text |
DOI: | 10.1038/s41562-024-02062-9 |
ISSN: | 2397-3374 |
EISSN: | 2397-3374 |
PMID: | 39562799 |
Published online: | 2024-11-19 |
Rights: | 2024, The Author(s) |
Source: | Nature; Alma/SFX Local Collection |