How (not) to Incent Crowd Workers: Payment Schemes and Feedback in Crowdsourcing

Crowdsourcing gains momentum: In digital work places such as Amazon Mechanical Turk, oDesk, Clickworker, 99designs, or InnoCentive it is easy to distribute human work to hundreds or thousands of freelancers. In these crowdsourcing settings, one challenge is to properly incent worker effort to create value. Common incentive schemes are piece rate payments and rank-order tournaments among workers. Tournaments might or might not disclose a worker’s current competitive position via a leaderboard. Following an exploratory approach, we derive a model on worker performance in rank-order tournaments and present a series of real effort studies using experimental techniques on an online labor market to test the model and to compare dyadic tournaments to piece rate payments. Data suggests that on average dyadic tournaments do not improve performance compared to a simple piece rate for simple and short crowdsourcing tasks. Furthermore, giving feedback on the competitive position in such tournaments tends to be negatively related to workers’ performance. This relation is partially mediated by task completion and moderated by the provision of feedback: When playing against strong competitors, feedback is associated with workers quitting the task altogether and, thus, showing lower performance. When the competitors are weak, workers tend to complete the task but with reduced effort. Overall, individual piece rate payments are most simple to communicate and implement while incenting performance is on par with more complex dyadic tournaments.
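As a rough illustration of the two payment schemes the study compares: a piece rate pays a worker per completed unit, while a dyadic (two-person) rank-order tournament pays a prize to whichever of two paired workers produces more. The sketch below is a minimal illustration with hypothetical pay values, not the authors' model or experimental parameters.

```python
# Minimal sketch of the two incentive schemes named in the abstract.
# All rates, prizes, and base payments are hypothetical placeholders.

def piece_rate_pay(units_completed: int, rate_per_unit: float = 0.05) -> float:
    """Piece rate: every completed unit earns a fixed amount."""
    return units_completed * rate_per_unit

def dyadic_tournament_pay(own_units: int, opponent_units: int,
                          prize: float = 1.00, base_pay: float = 0.10) -> float:
    """Dyadic rank-order tournament: both workers receive a small base
    payment; the worker with the higher output also wins the prize."""
    if own_units > opponent_units:
        return base_pay + prize
    if own_units == opponent_units:
        return base_pay + prize / 2  # one simple tie-breaking convention
    return base_pay

if __name__ == "__main__":
    print(piece_rate_pay(20))             # 20 units at 0.05 each -> 1.0
    print(dyadic_tournament_pay(20, 15))  # beats the paired worker -> 1.1
    print(dyadic_tournament_pay(20, 25))  # loses the pairing -> 0.1
```

The trade-off the abstract points to is that the piece rate is simpler to communicate and implement, while the tournament payoff depends on an opponent about whom the worker may or may not receive leaderboard feedback.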

Bibliographic Details
Published in: Business & Information Systems Engineering, 2015-06, Vol. 57 (3), pp. 167-179
Main Authors: Straub, Tim; Gimpel, Henner; Teschner, Florian; Weinhardt, Christof
Format: Article
Language: English
Subjects: see list below
Online Access: Full text (https://link.springer.com/10.1007/s12599-015-0384-2)
DOI: 10.1007/s12599-015-0384-2
Publisher: Springer Fachmedien Wiesbaden (Wiesbaden)
ISSN: 2363-7005
EISSN: 1867-0202
Source: SpringerNature Journals; Digital Commons Online Journals
Subjects:
Analysis
Business
Business and Management
Crowdsourcing
Dyadics
Feedback
Gain
Human
Human resource management
Incentives
Information systems
IT in Business
Labor market
Logos
Markets
Monetary incentives
Research Paper
Studies
Tasks
Telecommuting
Tournaments & championships
Workers