Crowd IQ: Measuring the Intelligence of Crowdsourcing Platforms

Cited by: 0
Authors
Kosinski, Michal [1 ]
Bachrach, Yoram [1 ]
Kasneci, Gjergji [1 ]
Van-Gael, Jurgen [1 ]
Graepel, Thore [1 ]
Affiliations
[1] Univ Cambridge, Psychometr Ctr, Cambridge CB2 1TN, England
Source
PROCEEDINGS OF THE 3RD ANNUAL ACM WEB SCIENCE CONFERENCE, 2012
Keywords
Crowdsourcing; Psychometrics; Incentive Schemes; SYSTEMS; WORLD;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
We measure crowdsourcing performance based on a standard IQ questionnaire, and examine Amazon's Mechanical Turk (AMT) performance under different conditions. These include variations in the payment amount offered, the way incorrect responses affect workers' reputations, threshold reputation scores of participating AMT workers, and the number of workers per task. We show that crowds composed of high-reputation workers achieve higher performance than low-reputation crowds, and that the effect of the payment amount is non-monotone: both paying too much and paying too little reduce performance. Furthermore, higher performance is achieved when the task is designed so that incorrect responses can decrease workers' reputation scores. Using majority vote to aggregate multiple responses to the same task can significantly improve performance, which can be further boosted by dynamically allocating workers to tasks in order to break ties.
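The abstract describes the aggregation scheme only at a high level. As a rough illustration, the Python sketch below shows one way majority-vote aggregation with dynamic tie-breaking could look; the function names and the request_extra_worker callback are illustrative assumptions, not the paper's actual implementation.

from collections import Counter


def majority_vote(responses):
    # Count each distinct answer and take the most frequent one.
    counts = Counter(responses)
    (top_answer, top_count), *rest = counts.most_common()
    # The vote is decided only if no other answer matches the leader's count.
    tied = any(count == top_count for _, count in rest)
    return top_answer, not tied


def aggregate_with_tie_breaking(responses, request_extra_worker, max_extra=3):
    # Aggregate worker answers by majority vote; on a tie, dynamically
    # allocate one more worker at a time (up to max_extra) until it breaks.
    responses = list(responses)
    answer, decided = majority_vote(responses)
    extra = 0
    while not decided and extra < max_extra:
        # Hypothetical callback that posts the task to one more worker.
        responses.append(request_extra_worker())
        answer, decided = majority_vote(responses)
        extra += 1
    return answer


if __name__ == "__main__":
    # A 2-2 tie among four workers triggers one extra (simulated) vote.
    votes = ["B", "C", "B", "C"]
    extra_votes = iter(["B"])
    print(aggregate_with_tie_breaking(votes, lambda: next(extra_votes)))  # prints "B"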
Pages: 151-160
Page count: 10
Related Papers
50 in total
  • [41] LBTask: A Benchmark for Spatial Crowdsourcing Platforms
    Yang, Qian
    Cui, Lizhen
    Zheng, Miao
    Liu, Shijun
    Guo, Wei
    Lu, Xudong
    Zheng, Yongqing
    Li, Qingzhong
    PROCEEDINGS OF THE 3RD INTERNATIONAL CONFERENCE ON CROWD SCIENCE AND ENGINEERING (ICCSE 2018), 2018,
  • [42] Crowd Dynamics: Conflicts, Contradictions, and Community in Crowdsourcing
    Hansson, Karin
    Ludwig, Thomas
    Computer Supported Cooperative Work (CSCW), 2019, 28 : 791 - 794
  • [43] Managing the Crowd: Towards a Taxonomy of Crowdsourcing Processes
    Geiger, David
    Seedorf, Stefan
    Schulze, Thimo
    Nickerson, Robert
    Schader, Martin
    AMCIS 2011 PROCEEDINGS, 2011,
  • [44] Supporting Occasional Groups in Crowdsourcing Platforms
    Harandi, Mahboobeh
    CONFERENCE COMPANION PUBLICATION OF THE 2019 COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING (CSCW'19 COMPANION), 2019, : 52 - 55
  • [45] Collaborative Software Development Platforms for Crowdsourcing
    Peng, Xin
    Babar, Muhammad Ali
    Ebert, Christof
    IEEE SOFTWARE, 2014, 31 (02) : 30 - 36
  • [46] Affect and Creative Performance on Crowdsourcing Platforms
    Morris, Robert R.
    Dontcheva, Mira
    Finkelstein, Adam
    Gerber, Elizabeth
    2013 HUMAINE ASSOCIATION CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2013, : 67 - 72
  • [47] Investigating Business Sustainability of Crowdsourcing Platforms
    He, Hee Rui
    Liu, Yang
    Gao, Jing
    Jing, Dian
    IEEE ACCESS, 2022, 10 : 74291 - 74303
  • [48] A Fair Incentive Mechanism for Crowdsourcing in Crowd Sensing
    Zhu, Xuan
    An, Jian
    Yang, Maishun
    Xiang, Lele
    Yang, Qiangwei
    Gui, Xiaolin
    IEEE INTERNET OF THINGS JOURNAL, 2016, 3 (06): : 1364 - 1372
  • [49] Optimal Fee Structures of Crowdsourcing Platforms
    Wen, Zhong
    Lin, Lihui
    DECISION SCIENCES, 2016, 47 (05) : 820 - 850
  • [50] Rating Mechanisms for Sustainability of Crowdsourcing Platforms
    Qiu, Chenxi
    Squicciarini, Anna
    Rajtmajer, Sarah
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 2003 - 2012