Quality Enhancement in Crowdsourcing

Cited: 0
Authors
Bhattacharya, Bijoly Saha [1 ]
Affiliations
[1] Indian Inst Engn Sci & Technol, Dept Informat Technol, Howrah, W Bengal, India
Keywords
Crowd-powered systems; dropout prediction; feedback mechanism; quality control
DOI
10.1145/3297001.3297058
CLC Number
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
Crowdsourcing is an online, distributed problem-solving and production model. In a crowdsourcing environment, requesters distribute tasks (through various mechanisms) to crowd workers, to be solved within a limited time. Because the workers in these markets are irregular and keep changing over time, it is a real challenge to designate a worker as a dropout: a crowd worker who stays away from the platform for a long period may resume at any time in the future. There have been limited attempts at dropout prediction in crowdsourcing markets, arguably because of disagreement over how to define a dropout. Controlling the dropout rate is an important issue for quality enhancement in crowdsourcing markets. Beyond this, other factors can enhance the quality of crowdsourcing platforms from different perspectives; one of these is the feedback mechanism, which may itself influence the dropout rate. In this paper, our objective is to determine the significance and joint impact of dropout prediction and the feedback mechanism on the quality enhancement of crowdsourcing markets.
Pages: 346-349
Number of pages: 4
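To make the abstract's framing of dropout prediction concrete, the following is a minimal sketch, not the paper's method: it assumes a dropout is a worker with no activity for a chosen inactivity threshold, and it fits a simple logistic-regression classifier on a few hypothetical worker features (task count, average feedback score, days since last task). The threshold, feature set, class `WorkerHistory`, and synthetic data are all illustrative assumptions.

```python
# Illustrative sketch only: one possible way to operationalise "dropout" and predict it.
# The inactivity threshold, features, and model are assumptions, not taken from the paper.
from dataclasses import dataclass
from sklearn.linear_model import LogisticRegression

DROPOUT_GAP_DAYS = 30  # assumed rule: no activity for 30+ days => labelled a dropout


@dataclass
class WorkerHistory:
    tasks_completed: int         # total tasks the worker has finished
    mean_feedback_score: float   # average requester feedback (assumed 0-5 scale)
    days_since_last_task: int    # inactivity gap at observation time
    dropped_out: bool            # ground-truth label under the threshold above


def to_features(w: WorkerHistory) -> list[float]:
    """Turn a worker's history into a numeric feature vector."""
    return [float(w.tasks_completed), w.mean_feedback_score, float(w.days_since_last_task)]


# Tiny synthetic training set (a stand-in for real platform logs).
workers = [
    WorkerHistory(120, 4.6, 2, False),
    WorkerHistory(15, 3.1, 45, True),
    WorkerHistory(60, 4.0, 10, False),
    WorkerHistory(5, 2.4, 90, True),
    WorkerHistory(200, 4.8, 1, False),
    WorkerHistory(8, 2.9, 60, True),
]
X = [to_features(w) for w in workers]
y = [w.dropped_out for w in workers]

model = LogisticRegression().fit(X, y)

# Estimate dropout risk for a new worker with little activity and weak feedback.
candidate = WorkerHistory(10, 2.5, 40, dropped_out=False)
risk = model.predict_proba([to_features(candidate)])[0][1]
print(f"Estimated dropout probability: {risk:.2f}")
```

In this sketch, a feedback-related signal (`mean_feedback_score`) enters the dropout model directly, which mirrors the abstract's suggestion that the feedback mechanism and dropout behaviour should be studied together rather than in isolation.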