An evaluation of Cochrane Crowd found that crowdsourcing produced accurate results in identifying randomized trials

Bibliographic Details
Published in: Journal of clinical epidemiology, 2021-05, Vol. 133, p. 130-139
Main authors: Noel-Storr, Anna; Dooley, Gordon; Elliott, Julian; Steele, Emily; Shemilt, Ian; Mavergames, Chris; Wisniewski, Susanna; McDonald, Steven; Murano, Melissa; Glanville, Julie; Foxlee, Ruth; Beecher, Deirdre; Ware, Jennifer; Thomas, James
Format: Article
Language: English
Online access: Full text
Description:
Filtering the deluge of new research to facilitate evidence synthesis has proven to be unmanageable using current paradigms of search and retrieval. Crowdsourcing, a way of harnessing the collective effort of a “crowd” of people, has the potential to support evidence synthesis by addressing the information overload created by the exponential growth in primary research outputs. Cochrane Crowd, Cochrane's citizen science platform, offers a range of tasks aimed at identifying studies related to health care. Accompanying each task are brief, interactive training modules and agreement algorithms that help ensure accurate collective decision-making.

The aims of the study were to evaluate the performance of Cochrane Crowd in terms of its accuracy, capacity, and autonomy, and to examine contributor engagement across three tasks aimed at identifying randomized trials. Crowd accuracy was evaluated by measuring the sensitivity and specificity of crowd screening decisions on a sample of titles and abstracts, compared with “quasi gold-standard” decisions about the same records made using the conventional method of dual screening. Crowd capacity, in the form of output volume, was evaluated by measuring the number of records processed by the crowd, compared with baseline. Crowd autonomy, the capability of the crowd to produce accurate collectively derived decisions without the need for expert resolution, was measured by the proportion of records that needed resolving by an expert.

The Cochrane Crowd community currently has 18,897 contributors from 163 countries. Collectively, the Crowd has processed 1,021,227 records, helping to identify 178,437 reports of randomized controlled trials (RCTs) for Cochrane's Central Register of Controlled Trials. The sensitivity was 99.1% for the RCT identification task (RCT ID), 99.7% for the identification of trials from ClinicalTrials.gov (CT ID), and 97.7% for the identification of RCTs from the International Clinical Trials Registry Platform (ICTRP ID). The specificity was 99% for RCT ID, 98.6% for CT ID, and 99.1% for ICTRP ID. The capacity of the combined Crowd and machine learning workflow has increased fivefold in 6 years, compared with baseline. The proportion of records requiring expert resolution across the tasks ranged from 16.6% to 19.7%.

Cochrane Crowd is sufficiently accurate and scalable to keep pace with the current rate of publication (and registration) of new primary studies. It has also proved to be a popular, efficient, and accurate way for a large number of people to play an important voluntary role in health evidence production. Cochrane Crowd is now an established part of Cochrane's effort to manage the deluge of primary research being produced.

Highlights:
- Crowdsourcing the identification of randomized controlled trials via Cochrane Crowd is 99% accurate.
- Cochrane Crowd uses crowdsourcing and machine learning to identify trials.
- Cochrane Crowd has attracted contributors from over 150 countries.
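The accuracy measures reported above (sensitivity and specificity of crowd decisions against dual-screened gold-standard labels) can be illustrated with a minimal sketch. This is not the study's code; the record IDs and labels are hypothetical, and real evaluations would work over thousands of records.

```python
# Sketch: sensitivity and specificity of screening decisions versus a
# gold standard. Labels: True = record is an RCT, False = not an RCT.
def sensitivity_specificity(crowd, gold):
    """crowd, gold: dicts mapping record id -> bool screening decision."""
    tp = sum(1 for r, g in gold.items() if g and crowd[r])        # true positives
    fn = sum(1 for r, g in gold.items() if g and not crowd[r])    # false negatives
    tn = sum(1 for r, g in gold.items() if not g and not crowd[r])# true negatives
    fp = sum(1 for r, g in gold.items() if not g and crowd[r])    # false positives
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical example: the crowd wrongly includes one non-RCT (rec4).
gold  = {"rec1": True, "rec2": True, "rec3": False, "rec4": False, "rec5": False}
crowd = {"rec1": True, "rec2": True, "rec3": False, "rec4": True,  "rec5": False}
sens, spec = sensitivity_specificity(crowd, gold)
print(f"sensitivity={sens:.1%} specificity={spec:.1%}")
# prints: sensitivity=100.0% specificity=66.7%
```

High sensitivity was the priority for Cochrane Crowd, since a missed RCT (false negative) is costlier to evidence synthesis than a false positive, which expert resolution can later discard.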
DOI: 10.1016/j.jclinepi.2021.01.006
Publisher: Elsevier Inc (United States)
PMID: 33476769
Rights: © 2021 The Authors. Published by Elsevier Inc.
ISSN: 0895-4356
EISSN: 1878-5921
Source: Elsevier ScienceDirect Journals
Subjects: Algorithms; Autonomy; Bibliographic data bases; Citizen science; Clinical trials; Cochrane; Crowdsourcing; Decision making; Design; Epidemiology; Evidence production; Human intelligence tasking; Identification; Information management; Learning algorithms; Machine learning; Medical research; Performance evaluation; Randomized controlled trial; Screening; Synthesis; Systematic review; Workflow