A Text Mining Approach to Evaluate Submissions to Crowdsourcing Contests

Cited by: 18
Authors
Walter, Thomas P. [1 ]
Back, Andrea [1 ]
Affiliations
[1] Univ St Gallen, Inst Informat Management, St Gallen, Switzerland
Keywords
IDEA; INNOVATION; QUALITY;
DOI
10.1109/HICSS.2013.64
Chinese Library Classification (CLC) number
TP301 [Theory and Methods];
Subject classification code
081202;
Abstract
This paper deals with the problem of evaluating submissions to crowdsourcing websites, where submission data is growing rapidly in both volume and complexity. Usually, expert committees are installed to rate submissions, select winners, and assign monetary rewards. With an increasing number of submissions, this process becomes more complex, time-consuming, and hence expensive. In this paper we propose a text mining methodology, foremost similarity measurements and clustering algorithms, to semi-automatically evaluate the quality of submissions to crowdsourcing contests. We evaluate our approach by comparing text-mining-based measurements of more than 40,000 submissions with the real-world decisions made by expert committees, using precision and recall together with the F1-score.
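The abstract describes a pipeline of similarity measurements, clustering, and an evaluation against expert decisions with precision, recall, and F1. The record does not specify the exact algorithms, so the following is only a minimal illustrative sketch of such a pipeline, assuming TF-IDF vectors, cosine similarity, k-means clustering, and a hypothetical similarity threshold; the toy submission texts, expert labels, and the 0.2 threshold are invented for demonstration and are not taken from the paper.

```python
# Illustrative sketch (not the authors' code): score contest submissions by
# text similarity and clustering, then compare a naive threshold rule
# against expert decisions using precision, recall, and F1.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.metrics import precision_recall_fscore_support

# Toy data; the paper works with more than 40,000 real submissions.
submissions = [
    "Add a mobile app for tracking orders in real time",
    "Real-time order tracking on smartphones",
    "Offer a loyalty program with points and rewards",
    "Introduce a rewards card for frequent customers",
    "Paint the office walls green",
]
expert_accepted = [1, 1, 1, 1, 0]  # hypothetical committee decisions

# 1) Vectorize the submission texts (TF-IDF weighted bag of words).
tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(submissions)

# 2) Similarity measurement: pairwise cosine similarity between submissions.
sim = cosine_similarity(X)

# 3) Clustering: group similar submissions (k chosen ad hoc here).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# 4) A naive quality proxy: a submission whose nearest neighbour is very
#    similar sits in a dense idea cluster; flag it as a candidate.
nearest = np.sort(sim, axis=1)[:, -2]    # best match excluding self
predicted = (nearest > 0.2).astype(int)  # hypothetical threshold

# 5) Compare against the expert decisions.
p, r, f1, _ = precision_recall_fscore_support(
    expert_accepted, predicted, average="binary", zero_division=0
)
print(f"cluster labels: {labels.tolist()}")
print(f"precision={p:.2f} recall={r:.2f} F1={f1:.2f}")
```

The F1-score used in the comparison is the harmonic mean of precision and recall, F1 = 2PR / (P + R).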
Pages: 3109 - 3118
Page count: 10
Related papers
50 records in total
  • [1] The impact of task description linguistic style on task performance: a text mining of crowdsourcing contests
    Yang, Keng
    Qi, Hanying
    Huang, Qian
    [J]. INDUSTRIAL MANAGEMENT & DATA SYSTEMS, 2022, 122 (01) : 322 - 344
  • [2] Crowdsourcing contests
    Segev, Ella
    [J]. EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2020, 281 (02) : 241 - 255
  • [3] A Crowdsourcing Approach to Evaluate the Quality of Query-based Extractive Text Summaries
    Iskender, Neslihan
    Gabryszak, Aleksandra
    Polzehl, Tim
    Hennig, Leonhard
    Moeller, Sebastian
    [J]. 2019 ELEVENTH INTERNATIONAL CONFERENCE ON QUALITY OF MULTIMEDIA EXPERIENCE (QOMEX), 2019,
  • [4] Text-mining approach to evaluate terms for ontology development
    Tsoi, Lam C.
    Patel, Ravi
    Zhao, Wenle
    Zheng, W. Jim
    [J]. JOURNAL OF BIOMEDICAL INFORMATICS, 2009, 42 (05) : 824 - 830
  • [5] Optimal crowdsourcing contests
    Chawla, Shuchi
    Hartline, Jason D.
    Sivan, Balasubramanian
    [J]. GAMES AND ECONOMIC BEHAVIOR, 2019, 113 : 80 - 96
  • [6] Using a Text-Mining Approach to Evaluate the Quality of Nursing Records
    Chang, Hsiu-Mei
    Chiou, Shwu-Fen
    Liu, Hsiu-Yun
    Yu, Hui-Chu
    [J]. NURSING INFORMATICS 2016: EHEALTH FOR ALL: EVERY LEVEL COLLABORATION - FROM PROJECT TO REALIZATION, 2016, 225 : 813 - 814
  • [7] Crowdsourcing: a systematic review of the literature using text mining
    Pavlidou, Ioanna
    Papagiannidis, Savvas
    Tsui, Eric
    [J]. INDUSTRIAL MANAGEMENT & DATA SYSTEMS, 2020, 120 (11) : 2041 - 2065
  • [8] Crowdsourcing contests with entry cost
    Tian, Xu
    Bi, Gongbing
    Shen, Xiaobei
    Liu, Lindong
    [J]. INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH, 2021, 28 (03) : 1371 - 1392
  • [9] Salience Bias in Crowdsourcing Contests
    Lee, Ho Cheung Brian
    Ba, Sulin
    Li, Xinxin
    Stallaert, Jan
    [J]. INFORMATION SYSTEMS RESEARCH, 2018, 29 (02) : 401 - 418
  • [10] Improving the Efficiency of Crowdsourcing Contests
    Xu, Haifeng
    Larson, Kate
    [J]. AAMAS'14: PROCEEDINGS OF THE 2014 INTERNATIONAL CONFERENCE ON AUTONOMOUS AGENTS & MULTIAGENT SYSTEMS, 2014, : 461 - 468