Aggregating Unstructured Submissions for Reliable Answers in Crowdsourcing Systems

Cited by: 2
Authors
Kurup, Ayswarya R. [1 ]
Sajeev, G. P. [1 ]
Affiliations
[1] Amrita Vishwa Vidyapeetham, Amrita Sch Engn, Dept Comp Sci & Engn, Amritapuri 690525, India
Keywords
Task aggregation; crowdsourcing; expertise estimation; probabilistic model
DOI
10.1109/ised48680.2019.9096224
Chinese Library Classification
TP3 [Computing Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
In crowdsourcing systems, requesters assign a task to multiple workers in order to obtain a reliable solution. Task aggregation is then performed on the submissions received from the crowd workers. Existing aggregation methods focus on structured submissions and do not account for the reliability of individual workers; hence, they are not suitable for aggregating unstructured answers. This paper proposes a novel task aggregation approach for generic crowdsourcing tasks based on an iterative probabilistic model. We make use of a worker reliability parameter and worker expertise, along with similarity information and requester feedback. Experiments on empirical data demonstrate that our scheme yields better results than existing state-of-the-art approaches.
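
Editor's illustration. The abstract describes an iterative probabilistic model that combines worker reliability and expertise with answer similarity and requester feedback. The Python sketch below is only a rough illustration of that general idea (a reliability-weighted, similarity-driven iteration over free-text answers), not the authors' method: the function names (text_similarity, aggregate_unstructured), the lexical similarity measure, and the normalisation rule are invented for this example, and requester feedback and expertise estimation are omitted.

# Illustrative sketch only (not the paper's exact model): iteratively
# re-weight free-text answers to a single task by an estimated worker
# reliability, using pairwise answer similarity as the only evidence.
# All names and the lexical similarity measure are assumptions.
from difflib import SequenceMatcher


def text_similarity(a: str, b: str) -> float:
    # Crude lexical similarity in [0, 1]; a real system would use
    # embeddings or a task-specific distance function instead.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def aggregate_unstructured(answers: dict, n_iters: int = 10):
    # answers: {worker_id: free-text answer} for one task.
    # Returns (chosen answer, reliability weights) after iterative reweighting.
    workers = list(answers)
    reliability = {w: 1.0 / len(workers) for w in workers}  # uniform prior
    support = {w: 0.0 for w in workers}

    for _ in range(n_iters):
        # Each answer is supported by similar answers from reliable workers.
        support = {
            w: sum(reliability[v] * text_similarity(answers[w], answers[v])
                   for v in workers if v != w)
            for w in workers
        }
        # Re-estimate reliability in proportion to the support received.
        total = sum(support.values()) or 1.0
        reliability = {w: support[w] / total for w in workers}

    best = max(workers, key=lambda w: support[w])
    return answers[best], reliability


if __name__ == "__main__":
    demo = {
        "w1": "The capital of Australia is Canberra.",
        "w2": "Canberra is the capital city of Australia.",
        "w3": "Sydney is the capital of Australia.",
    }
    answer, weights = aggregate_unstructured(demo)
    print(answer)   # one of the two Canberra answers
    print(weights)  # w3 should end up with the lowest weight

In this toy loop an answer far from the reliability-weighted consensus drags down its worker's weight in the next round, which mirrors the intuition, though not the exact probabilistic formulation, described in the abstract.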
Pages: 16 - 22
Number of pages: 7
Related Papers
50 records in total
  • [1] Aggregating Reliable Submissions in Crowdsourcing Systems
    Kurup, Ayswarya R.
    Sajeev, G. P.
    Swaminathan, J.
    [J]. IEEE ACCESS, 2021, 9 : 153058 - 153071
  • [2] Finding High-Quality Unstructured Submissions in General Crowdsourcing Tasks
    Lyu, Shanshan
    Ouyang, Wentao
    Shen, Huawei
    Cheng, Xueqi
    [J]. INFORMATION RETRIEVAL, CCIR 2018, 2018, 11168 : 198 - 210
  • [3] Aggregating user preferences in group recommender systems: A crowdsourcing approach
    Ismailoglu, Firat
    [J]. DECISION SUPPORT SYSTEMS, 2022, 152
  • [4] Crowdsourcing and Aggregating Nested Markable Annotations
    Madge, Chris
    Yu, Juntao
    Chamberlain, Jon
    Kruschwitz, Udo
    Paun, Silviu
    Poesio, Massimo
    [J]. 57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 797 - 807
  • [5] Reliable Medical Diagnosis from Crowdsourcing: Discover Trustworthy Answers from Non-Experts
    Li, Yaliang
    Du, Nan
    Liu, Chaochun
    Xie, Yusheng
    Fan, Wei
    Li, Qi
    Gao, Jing
    Sun, Huan
    [J]. WSDM'17: PROCEEDINGS OF THE TENTH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, 2017, : 253 - 261
  • [6] Crowdsourcing of translation services in 9 questions and answers
    Heard, Reiner
    [J]. BABEL-REVUE INTERNATIONALE DE LA TRADUCTION-INTERNATIONAL JOURNAL OF TRANSLATION, 2017, 63 (02): : 282 - 284
  • [7] Budget-Optimal Task Allocation for Reliable Crowdsourcing Systems
    Karger, David R.
    Oh, Sewoong
    Shah, Devavrat
    [J]. OPERATIONS RESEARCH, 2014, 62 (01) : 1 - 24
  • [8] Is affective crowdsourcing reliable?
    Hupont, Isabelle
    Lebreton, Pierre
    Maeki, Toni
    Skodras, Evangelos
    Hirth, Matthias
    [J]. 2014 IEEE FIFTH INTERNATIONAL CONFERENCE ON COMMUNICATIONS AND ELECTRONICS (ICCE), 2014, : 516 - 521
  • [9] A Model for Aggregating Contributions of Synergistic Crowdsourcing Workflows
    Fang, Yili
    Sun, Hailong
    Zhang, Richong
    Huai, Jinpeng
    Mao, Yongyi
    [J]. PROCEEDINGS OF THE TWENTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2014, : 3102 - 3103