Affect and Creative Performance on Crowdsourcing Platforms

Cited by: 10
Authors
Morris, Robert R. [1 ]
Dontcheva, Mira [2 ]
Finkelstein, Adam [3 ]
Gerber, Elizabeth [4 ]
Affiliations
[1] MIT, Media Lab, Cambridge, MA 02139 USA
[2] Adobe Res, San Francisco, CA USA
[3] Princeton Univ, Princeton, NJ 08544 USA
[4] Northwestern Univ, Segal Design Inst, Evanston, IL USA
Keywords
affective computing; crowdsourcing; affective priming; affective self-report; creativity;
DOI
10.1109/ACII.2013.18
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Performance on crowdsourcing platforms varies greatly, especially for tasks requiring significant cognitive effort or creative insight. Researchers have proposed several techniques to address these problems, yet few have considered the role of affect, despite the well-established link between positive affect and creative performance. In this paper, we examine two affective techniques to boost creativity on crowdsourcing platforms - affective priming and affective pre-screening. Across three experiments, we find divergent results, depending on which technique is used. We find that not all happy crowdworkers are alike. Those that are primed to feel happy exhibit enhanced creative performance, whereas those that merely report feeling happy exhibit impaired creative performance. We examine these findings in light of preexisting research on creativity, affect, and mood saliency. Lastly, we show how our findings have implications not only for crowdsourcing platforms, but also for other human-computer interaction scenarios that involve affect and creative performance.
Pages: 67-72
Page count: 6
Related Papers
50 records
  • [1] A framework for evaluation of crowdsourcing platforms performance
    Moghadasi, Mohammadhasan
    Shirmohammadi, Mehdi
    Ghasemi, Ahmadreza
    [J]. INFORMATION DEVELOPMENT, 2024, 40 (04) : 635 - 647
  • [2] Software Crowdsourcing Platforms
    Zanatta, Alexandre Lazaretti
    Machado, Leticia Santos
    Pereira, Graziela Basilio
    Prikladnicki, Rafael
    Carmel, Erran
    [J]. IEEE SOFTWARE, 2016, 33 (06) : 112 - 116
  • [3] Research on the Impact of Task Feedback on the Performance of Creative Crowdsourcing Solvers
    Chi, Aining
    Ren, Nan
    [J]. ICEME 2019: 2019 10TH INTERNATIONAL CONFERENCE ON E-BUSINESS, MANAGEMENT AND ECONOMICS, 2019, : 101 - 105
  • [4] How crowdsourcing risks affect performance: an exploratory model
    Liu, Shan
    Xia, Fan
    Zhang, Jinlong
    Wang, Lin
    [J]. MANAGEMENT DECISION, 2016, 54 (09) : 2235 - 2255
  • [5] Cognitively Inspired Task Design to Improve User Performance on Crowdsourcing Platforms
    Sampath, Harini Alagarai
    Rajeshuni, Rajeev
    Indurkhya, Bipin
    [J]. 32ND ANNUAL ACM CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI 2014), 2014, : 3665 - 3674
  • [6] Incentive Mechanisms for Crowdsourcing Platforms
    Katmada, Aikaterini
    Satsiou, Anna
    Kompatsiaris, Ioannis
    [J]. INTERNET SCIENCE, (INSCI 2016), 2016, 9934 : 3 - 18
  • [7] Creativity on Paid Crowdsourcing Platforms
    Oppenlaender, Jonas
    Milland, Kristy
    Visuri, Aku
    Ipeirotis, Panos
    Hosio, Simo
    [J]. PROCEEDINGS OF THE 2020 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI'20), 2020,
  • [8] Characterization of Experts in Crowdsourcing Platforms
    Ben Rjab, Amal
    Kharoune, Mouloud
    Miklos, Zoltan
    Martin, Arnaud
    [J]. BELIEF FUNCTIONS: THEORY AND APPLICATIONS, (BELIEF 2016), 2016, 9861 : 97 - 104
  • [9] Sybil Defense in Crowdsourcing Platforms
    Yuan, Dong
    Li, Guoliang
    Li, Qi
    Zheng, Yudian
    [J]. CIKM'17: PROCEEDINGS OF THE 2017 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2017, : 1529 - 1538
  • [10] How to Scale Crowdsourcing Platforms
    Kohler, Thomas
    [J]. CALIFORNIA MANAGEMENT REVIEW, 2018, 60 (02) : 98 - 121