A framework for evaluation of crowdsourcing platforms performance

Cited: 0
Authors
Moghadasi, Mohammadhasan [1 ,3 ]
Shirmohammadi, Mehdi [1 ]
Ghasemi, Ahmadreza [2 ]
Affiliations
[1] Ershad Damavand Inst Higher Educ, Tehran, Iran
[2] Univ Tehran, Tehran, Iran
[3] Ershad Damavand Inst Higher Educ, Dept Business Management, Master Business Adm, Tehran, Iran
Keywords
open innovation; crowdsourcing; crowdsourcing platform; evaluation framework; performance evaluation; OPEN INNOVATION; SYSTEMS; MODELS; CROWD; USER;
DOI
10.1177/02666669231152553
CLC classification
G25 [Library science; library undertakings]; G35 [Information science; information work]
Discipline codes
1205; 120501
Abstract
This study aims to identify an appropriate conceptual framework for evaluating crowdsourcing platforms from an open innovation perspective, employing a combination of qualitative and quantitative methods. The initial indices of the performance evaluation framework for crowdsourcing platforms are obtained through the Delphi method and interviews with experts. Using these factors, a statistical questionnaire is then designed and distributed among users of crowdsourcing platforms to confirm or reject each factor. Finally, the aspects of the performance evaluation framework for crowdsourcing platforms are specified from the open innovation perspective. Using fuzzy hierarchical analysis, these aspects are prioritized in order of importance: Collaboration, Project design, Moderation, Terms and conditions, UI/UX (user interface and user experience), and Key statistics. Since crowdsourcing is grounded in crowd participation and the collective intelligence of users, Collaboration and Project design emerged as the most significant factors in evaluating a crowdsourcing platform.
Pages: 635-647
Page count: 13
Related papers
50 records total
  • [1] Affect and Creative Performance on Crowdsourcing Platforms
    Morris, Robert R.
    Dontcheva, Mira
    Finkelstein, Adam
    Gerber, Elizabeth
    2013 HUMAINE ASSOCIATION CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2013: 67 - 72
  • [2] An evaluation framework for software crowdsourcing
    Wu, Wenjun
    Tsai, Wei-Tek
    Li, Wei
    FRONTIERS OF COMPUTER SCIENCE, 2013, 7 (05): 694 - 709
  • [3] Crowdsourcing mode evaluation for parcel delivery service platforms
    Zhen, Lu
    Wu, Yiwei
    Wang, Shuaian
    Yi, Wen
    INTERNATIONAL JOURNAL OF PRODUCTION ECONOMICS, 2021, 235
  • [4] Community heuristics for user interface evaluation of crowdsourcing platforms
    Campo, Simon A.
    Khan, Vassilis-Javed
    Papangelis, Konstantinos
    Markopoulos, Panos
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2019, 95: 775 - 789
  • [5] Educational Platforms Evaluation Framework
    Dumitrache, Anisoara
    RETHINKING EDUCATION BY LEVERAGING THE ELEARNING PILLAR OF THE DIGITAL AGENDA FOR EUROPE!, VOL. II, 2015: 506 - 511
  • [6] Software Crowdsourcing Platforms
    Zanatta, Alexandre Lazaretti
    Machado, Leticia Santos
    Pereira, Graziela Basilio
    Prikladnicki, Rafael
    Carmel, Erran
    IEEE SOFTWARE, 2016, 33 (06): 112 - 116
  • [7] Design and evaluation of crowdsourcing platforms based on users' confidence judgments
    Ahmadabadi, Samin Nili
    Haghifam, Maryam
    Shah-Mansouri, Vahid
    Ershadmanesh, Sara
    SCIENTIFIC REPORTS, 2024, 14 (01)
  • [8] Crowdsourcing Multimedia QoE Evaluation: A Trusted Framework
    Wu, Chen-Chi
    Chen, Kuan-Ta
    Chang, Yu-Chun
    Lei, Chin-Laung
    IEEE TRANSACTIONS ON MULTIMEDIA, 2013, 15 (05): 1121 - 1137