An Examination of the Dynamics of Crowdsourcing Contests: Role of Feedback Type

Cited by: 3
Authors
Sanyal, Pallab [1]
Ye, Shun [1]
Affiliation
[1] George Mason Univ, Sch Business, Fairfax, VA 22030 USA
Keywords
crowdsourcing; design contests; feedback; convergence; diversity; INNOVATION CONTESTS; PRODUCTIVITY LOSS; OUTCOME-FEEDBACK; CREATIVITY; IDEA; PARTICIPATION; UNCERTAINTY; GENERATION; BEHAVIOR; QUALITY
DOI
10.1287/isre.2023.1232
Chinese Library Classification (CLC)
G25 [Library science, library services]; G35 [Information science, information work]
Discipline codes
1205; 120501
Abstract
As more businesses turn to crowdsourcing platforms for solutions to business problems, determining how to manage sourcing contests in line with their objectives has become critically important. Existing research, both theoretical and empirical, studies the impact of a variety of contest and contestant characteristics on the outcomes of these contests. Beyond these static design parameters, one lever organizations (clients) can use to dynamically steer contests toward desirable goals is the feedback offered to contestants (solvers) during the contest. Although a handful of recent studies focus on the effects of feedback at a high level (e.g., volume, valence), to the best of our knowledge, none has examined the effects of the information contained in the feedback. Furthermore, the existing studies focus solely on the quality of the submissions and not on other critical contest outcomes, such as the diversity of the submissions, which the creativity and innovation literature finds to be significant. In this study, first, drawing on feedback intervention theory from the psychology literature, we classify client feedback into two types: outcome and process. Second, using data from almost 12,000 design contests, we empirically examine the effects of the two types of feedback on the convergence and diversity of submissions following feedback interventions. We find that process feedback, which provides goal-oriented information to solvers, fosters convergent thinking, leading to submissions that are similar to one another. Although outcome feedback lacks the informative value of process feedback, it encourages divergent thinking, the ability to produce a variety of solutions to a problem. Furthermore, we find that both effects are strengthened when the feedback is provided earlier in the contest rather than later. Based on our findings, we offer insights on how practitioners can strategically use the appropriate form of feedback to generate either greater diversity of solutions or efficient convergence to an acceptable solution.
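To make the abstract's notion of submission "diversity" concrete: one simple way to quantify it (an illustrative sketch, not the measure used in the paper) is the mean pairwise Jaccard distance between the token sets of submitted designs' descriptions. Convergent rounds of submissions score near 0; divergent rounds score near 1. The function names and toy data below are hypothetical.

```python
from itertools import combinations


def jaccard_distance(a: set, b: set) -> float:
    """1 - |A ∩ B| / |A ∪ B|; 0 for identical sets, 1 for disjoint sets."""
    union = a | b
    if not union:
        return 0.0
    return 1.0 - len(a & b) / len(union)


def submission_diversity(submissions: list[str]) -> float:
    """Mean pairwise Jaccard distance over token sets of submissions.

    Higher values indicate more divergent submissions; lower values
    indicate convergence toward similar solutions.
    """
    token_sets = [set(s.lower().split()) for s in submissions]
    pairs = list(combinations(token_sets, 2))
    if not pairs:
        return 0.0
    return sum(jaccard_distance(a, b) for a, b in pairs) / len(pairs)


# Convergent round: near-identical wording → diversity 0.0.
similar = ["blue minimal logo", "minimal blue logo", "blue logo minimal"]
# Divergent round: disjoint vocabularies → diversity 1.0.
varied = ["blue minimal logo", "red vintage badge", "green mascot icon"]

print(submission_diversity(similar))  # 0.0
print(submission_diversity(varied))   # 1.0
```

Comparing this statistic before and after a feedback event would give a crude proxy for the convergence/divergence effects the study measures; real design submissions would of course require a richer representation than bag-of-words.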
Pages: 394-413 (21 pages)