A framework for evaluation of crowdsourcing platforms performance

Cited by: 0
Authors
Moghadasi, Mohammadhasan [1,3]
Shirmohammadi, Mehdi [1]
Ghasemi, Ahmadreza [2]
Affiliations
[1] Ershad Damavand Inst Higher Educ, Tehran, Iran
[2] Univ Tehran, Tehran, Iran
[3] Ershad Damavand Inst Higher Educ, Dept Business Management, Master Business Adm, Tehran, Iran
Keywords
open innovation; crowdsourcing; crowdsourcing platform; evaluation framework; performance evaluation; OPEN INNOVATION; SYSTEMS; MODELS; CROWD; USER
DOI
10.1177/02666669231152553
Chinese Library Classification (CLC)
G25 [Library Science, Library Work]; G35 [Information Science, Information Work]
Discipline Classification Codes
1205; 120501
Abstract
This study aims to identify an appropriate conceptual framework for evaluating crowdsourcing platforms from an open innovation perspective, employing a combination of qualitative and quantitative methods. The initial indices of the performance evaluation framework for crowdsourcing platforms are obtained through the Delphi method and interviews with experts. Using these factors, a statistical questionnaire is then designed and distributed among users of crowdsourcing platforms to confirm or reject the factors. Finally, the aspects of the performance evaluation framework for crowdsourcing platforms are specified from the open innovation perspective. Using fuzzy hierarchical analysis, these aspects are prioritized in order of importance as follows: Collaboration, Project design, Moderation, Terms and conditions, UI/UX (user interface and user experience), and Key statistics. Given that crowdsourcing rests on crowd participation and the collective intelligence of users, Collaboration and Project design emerge as the most significant factors in evaluating a crowdsourcing platform.
Pages: 635-647
Number of pages: 13
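The abstract above names fuzzy hierarchical analysis as the method used to prioritize the six aspects but gives no computational detail. Purely as an illustration of how such a prioritization can be computed, the sketch below applies Buckley's geometric-mean variant of the fuzzy analytic hierarchy process (one common realization of fuzzy hierarchical analysis) with triangular fuzzy numbers. The criteria names come from the abstract; the pairwise judgment values and the tfn/inv helpers are invented for illustration and are not taken from the paper.

```python
# A minimal, self-contained sketch of Buckley's geometric-mean fuzzy-AHP
# weighting with triangular fuzzy numbers (l, m, u). All pairwise judgment
# values below are hypothetical and are NOT taken from the study.

from math import prod

# The six aspects named in the abstract.
criteria = ["Collaboration", "Project design", "Moderation",
            "Terms and conditions", "UI/UX", "Key statistics"]

E = (1.0, 1.0, 1.0)  # "equally important" on the diagonal

def tfn(x):
    """Turn a crisp Saaty-style score into a simple triangular fuzzy number."""
    return (max(x - 1, 1), x, x + 1)

def inv(t):
    """Reciprocal of a triangular fuzzy number."""
    l, m, u = t
    return (1 / u, 1 / m, 1 / l)

# Hypothetical reciprocal comparison matrix, chosen so that Collaboration is
# judged most important and Key statistics least important.
M = [
    [E,           tfn(2),      tfn(3),      tfn(4),      tfn(4),      tfn(5)],
    [inv(tfn(2)), E,           tfn(2),      tfn(3),      tfn(3),      tfn(4)],
    [inv(tfn(3)), inv(tfn(2)), E,           tfn(2),      tfn(2),      tfn(3)],
    [inv(tfn(4)), inv(tfn(3)), inv(tfn(2)), E,           tfn(2),      tfn(2)],
    [inv(tfn(4)), inv(tfn(3)), inv(tfn(2)), inv(tfn(2)), E,           tfn(2)],
    [inv(tfn(5)), inv(tfn(4)), inv(tfn(3)), inv(tfn(2)), inv(tfn(2)), E],
]

n = len(criteria)

# Fuzzy geometric mean of each row.
geo = [tuple(prod(row[j][k] for j in range(n)) ** (1 / n) for k in range(3))
       for row in M]

# Fuzzy weights: each geometric mean times the inverse of their total
# (note the reversed order of the summed components).
totals = tuple(sum(g[k] for g in geo) for k in range(3))
fuzzy_w = [(g[0] / totals[2], g[1] / totals[1], g[2] / totals[0]) for g in geo]

# Defuzzify with the centroid and normalise to crisp priority weights.
crisp = [sum(w) / 3 for w in fuzzy_w]
norm = sum(crisp)
for name, w in sorted(zip(criteria, crisp), key=lambda p: -p[1]):
    print(f"{name:22s} {w / norm:.3f}")
```

Running the sketch prints normalized crisp weights whose ordering mirrors the ranking reported in the abstract, but only because the hypothetical judgment matrix was constructed that way; the study's actual expert judgments and weights are not reproduced here.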
Related Papers
50 records in total
  • [21] ASSESSING MALAYSIAN CROWDSOURCING PLATFORMS USING WEB OF SYSTEM PERFORMANCE (WOSP) MODEL
    Habbal, Adib
    Chit, Suwannit Chareen
    Ahmad, Rahayu
    Mahmod, Musyrifah
    PROCEEDINGS OF THE 5TH INTERNATIONAL CONFERENCE ON COMPUTING & INFORMATICS, 2015, : 495 - 500
  • [22] Unsatisfied today, satisfied tomorrow: A simulation framework for performance evaluation of crowdsourcing-based network monitoring
    Pimpinella, Andrea
    Repossi, Marianna
    Redondi, Alessandro E. C.
    COMPUTER COMMUNICATIONS, 2022, 182 : 184 - 197
  • [23] A novel framework to improve the performance of crowdfunding platforms
    Gera, Jaya
    Kaur, Harmeet
    ICT EXPRESS, 2018, 4 (02): 55 - 62
  • [24] Framework for User Experience Evaluation in MOOC Platforms
    Gallegos, Ana Poma
    Morales, Germania Rodriguez
    Torres-Carrion, Pablo V.
    Carrion, Samanta Cueva
    ARTIFICIAL INTELLIGENCE IN HCI, AI-HCI 2022, 2022, 13336 : 284 - 304
  • [25] LBTask: A Benchmark for Spatial Crowdsourcing Platforms
    Yang, Qian
    Cui, Lizhen
    Zheng, Miao
    Liu, Shijun
    Guo, Wei
    Lu, Xudong
    Zheng, Yongqing
    Li, Qingzhong
    PROCEEDINGS OF THE 3RD INTERNATIONAL CONFERENCE ON CROWD SCIENCE AND ENGINEERING (ICCSE 2018), 2018,
  • [26] Collaborative Software Development Platforms for Crowdsourcing
    Peng, Xin
    Babar, Muhammad Ali
    Ebert, Christof
    IEEE SOFTWARE, 2014, 31 (02) : 30 - 36
  • [27] Supporting Occasional Groups in Crowdsourcing Platforms
    Harandi, Mahboobeh
    CONFERENCE COMPANION PUBLICATION OF THE 2019 COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING (CSCW'19 COMPANION), 2019, : 52 - 55
  • [28] Investigating Business Sustainability of Crowdsourcing Platforms
    He, Hee Rui
    Liu, Yang
    Gao, Jing
    Jing, Dian
    IEEE ACCESS, 2022, 10 : 74291 - 74303
  • [29] Optimal Fee Structures of Crowdsourcing Platforms
    Wen, Zhong
    Lin, Lihui
    DECISION SCIENCES, 2016, 47 (05) : 820 - 850
  • [30] Crowd Work with or without Crowdsourcing Platforms
    Yan, Xin
    Ding, Xianghua
    Gu, Ning
    2016 IEEE 20TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN (CSCWD), 2016, : 56 - 61