Salience Bias in Crowdsourcing Contests

Cited: 44
Authors
Lee, Ho Cheung Brian [1]
Ba, Sulin [2]
Li, Xinxin [2]
Stallaert, Jan [2]
Affiliations
[1] Univ Massachusetts Lowell, Manning Sch Business, Lowell, MA 01854 USA
[2] Univ Connecticut, Sch Business, Storrs, CT 06269 USA
Funding
National Natural Science Foundation of China;
Keywords
behavioral economics; crowdsourcing; open innovation; salience bias; parallel path effect; competition effect; INNOVATION CONTESTS; MARKET EXPERIENCE; FIELD EXPERIMENT; PRODUCT IDEAS; COMPETITION; DETERMINANTS; UNCERTAINTY; JUDGMENT; SUCCESS; LEVEL;
DOI
10.1287/isre.2018.0775
CLC Classification
G25 [Library science, librarianship]; G35 [Information science, information work];
Discipline Codes
1205 ; 120501 ;
Abstract
Crowdsourcing relies on online platforms to connect a community of users to perform specific tasks. However, without appropriate control, the behavior of the online community might not align with the platform's designed objective, which can lead to inferior platform performance. This paper investigates how feedback information on a crowdsourcing platform and the systematic biases of crowdsourcing workers can affect crowdsourcing outcomes. Specifically, using archival data from the online crowdsourcing platform Kaggle, combined with survey data from actual Kaggle contest participants, we examine the role of a systematic bias, namely the salience bias, in influencing the performance of crowdsourcing workers, and how the number of crowdsourcing workers moderates the impact of the salience bias on contest outcomes. Our results suggest that the salience bias influences the performance of contestants, including the winners of the contests. Furthermore, the number of participating contestants may attenuate or amplify the impact of the salience bias on contest outcomes, depending on the effort required to complete the tasks. Our results have critical implications for crowdsourcing firms and platform designers.
Pages: 401-418 (18 pages)
Related Papers (50 total)
  • [1] Crowdsourcing contests
    Segev, Ella
    [J]. EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2020, 281 (02) : 241 - 255
  • [2] Optimal crowdsourcing contests
    Chawla, Shuchi
    Hartline, Jason D.
    Sivan, Balasubramanian
    [J]. GAMES AND ECONOMIC BEHAVIOR, 2019, 113 : 80 - 96
  • [3] Improving the Efficiency of Crowdsourcing Contests
    Xu, Haifeng
    Larson, Kate
    [J]. AAMAS'14: PROCEEDINGS OF THE 2014 INTERNATIONAL CONFERENCE ON AUTONOMOUS AGENTS & MULTIAGENT SYSTEMS, 2014, : 461 - 468
  • [4] Crowdsourcing contests with entry cost
    Tian, Xu
    Bi, Gongbing
    Shen, Xiaobei
    Liu, Lindong
    [J]. INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, 2021, 28 (03) : 1371 - 1392
  • [5] Adaptive Incentive Selection for Crowdsourcing Contests
    Truong, Nhat V. Q.
    Stein, Sebastian
    Tran-Thanh, Long
    Jennings, Nicholas R.
    [J]. PROCEEDINGS OF THE 17TH INTERNATIONAL CONFERENCE ON AUTONOMOUS AGENTS AND MULTIAGENT SYSTEMS (AAMAS'18), 2018, : 2100 - 2102
  • [6] Crowdsourcing with Tullock Contests: A New Perspective
    Luo, Tie
    Kanhere, Salil S.
    Tan, Hwee-Pink
    Wu, Fan
    Wu, Hongyi
    [J]. 2015 IEEE CONFERENCE ON COMPUTER COMMUNICATIONS (INFOCOM), 2015,
  • [7] Virtual Team Performances in Crowdsourcing Contests
    Dissanayake, Indika
    Zhang, Jie
    Gu, Bin
    [J]. AMCIS 2014 PROCEEDINGS, 2014,
  • [8] Participation recommendation system for crowdsourcing contests
    Baba, Yukino
    Kinoshita, Kei
    Kashima, Hisashi
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2016, 58 : 174 - 183
  • [9] A Structural Analysis of the Role of Superstars in Crowdsourcing Contests
    Zhang, Shunyuan
    Singh, Param Vir
    Ghose, Anindya
    [J]. INFORMATION SYSTEMS RESEARCH, 2019, 30 (01) : 15 - 33
  • [10] Effect of Crowd Voting on Participation in Crowdsourcing Contests
    Chen, Liang
    Xu, Pei
    Liu, De
    [J]. JOURNAL OF MANAGEMENT INFORMATION SYSTEMS, 2020, 37 (02) : 510 - 535