Quality Enhancement in Crowdsourcing

Cited by: 0
Authors
Bhattacharya, Bijoly Saha [1]
Affiliation
[1] Indian Inst Engn Sci & Technol, Dept Informat Technol, Howrah, W Bengal, India
Keywords
Crowd-powered systems; dropout prediction; feedback mechanism; quality control
DOI
10.1145/3297001.3297058
Chinese Library Classification (CLC)
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
Crowdsourcing is an online, distributed problem-solving and production model. In a crowdsourcing environment, requesters dispense tasks (through some mechanism) to crowd workers, to be solved within a limited time. Because workers in these markets participate irregularly and the worker pool keeps changing over time, it is a real challenge to decide when a worker should be designated a dropout: a crowd worker who stays out of the environment for a long period may return at any time in the future. There have been limited attempts toward dropout prediction in crowdsourcing markets, arguably because of this disagreement over how to define a dropout. Controlling the dropout rate is an important issue for quality enhancement in crowdsourcing markets. Beyond dropout control, other factors can enhance the quality of crowdsourcing platforms from different perspectives; one of these is the feedback mechanism. In fact, the feedback mechanism itself may influence the dropout rate. In this paper, our objective is to examine the combined significance and impact of dropout prediction and the feedback mechanism on the quality enhancement of crowdsourcing markets.
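The central difficulty the abstract raises is that "dropout" has no agreed definition, since an inactive worker may return at any time. As a hedged illustration only (not taken from the paper), the sketch below labels a worker as a dropout when their inactivity exceeds a chosen threshold; the Worker structure, the 30-day threshold, and all names are assumptions made for illustration.

```python
# Minimal sketch (illustrative, not the paper's method): labeling a dropout
# via an inactivity threshold over a worker's activity timestamps.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List


@dataclass
class Worker:
    worker_id: str
    activity_times: List[datetime]  # timestamps of the worker's completed tasks


def is_dropout(worker: Worker, now: datetime,
               inactivity_threshold: timedelta = timedelta(days=30)) -> bool:
    """Label a worker as a dropout if their last recorded activity is older
    than the chosen inactivity threshold. Because a worker may return at any
    time, this label is inherently provisional."""
    if not worker.activity_times:
        return True
    last_seen = max(worker.activity_times)
    return (now - last_seen) > inactivity_threshold


# Example usage with toy data (hypothetical worker and dates)
if __name__ == "__main__":
    now = datetime(2019, 1, 1)
    w = Worker("w42", [datetime(2018, 10, 5), datetime(2018, 11, 20)])
    print(is_dropout(w, now))  # True: inactive for more than 30 days
```

Any such threshold is a design choice rather than a ground-truth label, which is precisely why the abstract treats dropout prediction as an open problem rather than a solved classification task.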
Pages: 346-349
Page count: 4
Related Papers
50 items in total
  • [41] Balancing quality and budget considerations in mobile crowdsourcing
    Miao, Chunyan
    Yu, Han
    Shen, Zhiqi
    Leung, Cyril
    DECISION SUPPORT SYSTEMS, 2016, 90 : 56 - 64
  • [42] Crowdsourcing High Quality Labels with a Tight Budget
    Li, Qi
    Ma, Fenglong
    Gao, Jing
    Su, Lu
    Quinn, Christopher J.
    PROCEEDINGS OF THE NINTH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM'16), 2016, : 237 - 246
  • [43] Assessing Crowdsourcing Quality through Objective Tasks
    Aker, Ahmet
    El-Haj, Mahmoud
    Albakour, M-Dyaa
    Kruschwitz, Udo
    LREC 2012 - EIGHTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2012, : 1456 - 1461
  • [44] Konkani WordNet: Corpus-Based Enhancement using Crowdsourcing
    Manerkar, Sanjana
    Asnani, Kavita
    Khorjuvenkar, Preeti Ravindranath
    Desai, Shilpa
    Pawar, Jyoti D.
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2022, 21 (04)
  • [45] Crowdsourcing and Gamification of Enterprise Meeting Software Quality
    Smith, Ross
    Kilty, Lori Ada
    2014 IEEE/ACM 7TH INTERNATIONAL CONFERENCE ON UTILITY AND CLOUD COMPUTING (UCC), 2014, : 611 - 613
  • [46] Quality assessment of crowdsourcing transcriptions for African languages
    Gelas, Hadrien
    Abate, Solomon Teferra
    Besacier, Laurent
    Pellegrino, Francois
    12TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2011 (INTERSPEECH 2011), VOLS 1-5, 2011, : 3072 - +
  • [47] MobileWorks: Designing for Quality in a Managed Crowdsourcing Architecture
    Kulkarni, Anand
    Gutheim, Philipp
    Narula, Prayag
    Rolnitzky, David
    Parikh, Tapan
    Hartmann, Bjoern
    IEEE INTERNET COMPUTING, 2012, 16 (05) : 28 - 35
  • [48] Evaluation of Software Quality in the TopCoder Crowdsourcing Environment
    Wang, Xin
    Wu, Wenjun
    Hu, Zhenghui
    2017 IEEE 7TH ANNUAL COMPUTING AND COMMUNICATION WORKSHOP AND CONFERENCE IEEE CCWC-2017, 2017,
  • [49] The relational governance antecedents and loyalty consequence of service quality in crowdsourcing: The moderating role of crowdsourcing experience
    Guan, Shiqian
    Guo, Wenbo
    Liu, Shan
    Zhu, Qing
    TELEMATICS AND INFORMATICS, 2020, 55
  • [50] Truthful Data Quality Elicitation for Quality-Aware Data Crowdsourcing
    Gong, Xiaowen
    Shroff, Ness B.
    IEEE TRANSACTIONS ON CONTROL OF NETWORK SYSTEMS, 2020, 7 (01): : 326 - 337