Using Mobile Dual Eye-Tracking to Capture Cycles of Collaboration and Cooperation in Co-located Dyads

Cited by: 3
Authors:
Schneider, Bertrand [1 ,2 ]
Bryant, Tonya [1 ]
Institutions:
[1] Harvard Univ, Cambridge, MA USA
[2] Harvard Univ, Cambridge, MA 02138 USA
Keywords:
JOINT VISUAL-ATTENTION; CONSTRUCTIVIST; PERSPECTIVES; MOVEMENTS;
DOI:
10.1080/07370008.2022.2157418
Chinese Library Classification:
G44 [Educational Psychology]
Discipline Classification Codes:
0402; 040202
Abstract:
The goal of this paper is to bring new insights to the study of social learning processes by designing measures of collaboration using high-frequency sensor data. More specifically, we are interested in understanding the interplay between moments of collaboration and cooperation, which is an understudied area of research. We collected a multimodal dataset during a collaborative learning activity typical of makerspaces: learning how to program a robot. Pairs of participants were introduced to computational thinking concepts using a block-based environment. Mobile eye-trackers, physiological wristbands, and motion sensors captured their behavior and social interactions. In this paper, we analyze the eye-tracking data to capture participants' tendency to synchronize their visual attention. This paper provides three contributions: (1) we use an emerging methodology (mobile dual eye-tracking) to capture joint visual attention in a co-located setting and replicate findings that show how levels of joint visual attention are positively correlated with collaboration quality; (2) we qualitatively analyzed the co-occurrence of verbal activity and joint visual attention in low and high performing groups to better understand moments of collaboration and cooperation; (3) inspired by the qualitative observations and theories of collaborative learning, we designed a new quantitative measure that captures cycles of collaborative and cooperative work. Compared to simple measures of joint visual attention, this measure yielded stronger correlations with learning and collaboration scores. We discuss those results and describe how advances in analyzing sensor data can contribute to theories of collaboration. We conclude with implications for capturing students' interactions in co-located spaces using Multimodal Learning Analytics (MMLA).
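The abstract describes measuring participants' tendency to synchronize their visual attention. The paper's exact algorithm is not reproduced in this record; the sketch below shows one common way such a joint visual attention (JVA) score is computed from two gaze streams, as the proportion of samples where both partners fixate the same area of interest within a short time lag. The function name, the AOI labels, and the sampling-rate assumption in the comment are all illustrative, not taken from the paper.

```python
# Illustrative sketch only: compute joint visual attention (JVA) as the
# proportion of samples where partner A's gaze target also appears in
# partner B's gaze within a +/- max_lag window. Names and parameters
# are assumptions, not the authors' implementation.
from typing import Optional, Sequence


def joint_visual_attention(
    gaze_a: Sequence[Optional[str]],
    gaze_b: Sequence[Optional[str]],
    max_lag: int = 20,  # in samples, e.g. 2 s at an assumed 10 Hz rate
) -> float:
    """Fraction of A's valid samples whose gaze target matches B's
    gaze target at any offset within +/- max_lag samples."""
    n = min(len(gaze_a), len(gaze_b))
    joint = 0
    for t in range(n):
        target = gaze_a[t]
        if target is None:  # skip blinks / track loss
            continue
        lo, hi = max(0, t - max_lag), min(n, t + max_lag + 1)
        if any(gaze_b[u] == target for u in range(lo, hi)):
            joint += 1
    return joint / n if n else 0.0


# Toy gaze streams over labeled areas of interest (AOIs):
a = ["robot", "robot", "screen", "screen", "robot", None]
b = ["screen", "robot", "robot", "screen", "screen", "robot"]
print(joint_visual_attention(a, b, max_lag=1))
```

A cycle-based measure like the one the paper proposes could then segment the resulting binary JVA time series into sustained joint-attention episodes versus individual-work episodes, rather than reporting only the overall proportion.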
Pages: 26-55
Page count: 30
Related Papers (50 total):
  • [21] Compensation of Head Movements in Mobile Eye-Tracking Data Using an Inertial Measurement Unit
    Larsson, Linnea
    Schwaller, Andrea
    Holmqvist, Kenneth
    Nystrom, Marcus
    Stridh, Martin
    PROCEEDINGS OF THE 2014 ACM INTERNATIONAL JOINT CONFERENCE ON PERVASIVE AND UBIQUITOUS COMPUTING (UBICOMP'14 ADJUNCT), 2014: 1161-1167
  • [22] Capturing Cognitive Events Embedded in the Real World Using Mobile Electroencephalography and Eye-Tracking
    Ladouce, Simon
    Mustile, Magda
    Ietswaart, Magdalena
    Dehais, Frederic
    JOURNAL OF COGNITIVE NEUROSCIENCE, 2022, 34(12): 2237-2255
  • [23] Using mobile eye tracking to capture joint visual attention in collaborative experimentation
    Becker, Sebastian
    Mukhametov, Sergey
    Pawels, Philipp
    Kuhn, Jochen
    2021 PHYSICS EDUCATION RESEARCH CONFERENCE (PERC), 2022: 39-44
  • [24] Study of using a multi-touch tabletop technology to facilitate collaboration, interaction, and awareness in co-located environment
    Shadiev, Rustam
    Hwang, Wu-Yuin
    Huang, Yueh-Min
    Yang, Yu-Shu
    BEHAVIOUR & INFORMATION TECHNOLOGY, 2015, 34(10): 952-963
  • [25] Measuring Construction Workers' Real-Time Situation Awareness Using Mobile Eye-Tracking
    Hasanzadeh, Sogand
    Esmaeili, Behzad
    Dodd, Michael D.
    CONSTRUCTION RESEARCH CONGRESS 2016: OLD AND NEW CONSTRUCTION TECHNOLOGIES CONVERGE IN HISTORIC SAN JUAN, 2016: 2894-2904
  • [26] Zoned in or zoned out? Investigating immersion in slot machine gambling using mobile eye-tracking
    Murch, W. Spencer
    Limbrick-Oldfield, Eve H.
    Ferrari, Mario A.
    MacDonald, Kent I.
    Fooken, Jolande
    Cherkasova, Mariya V.
    Spering, Miriam
    Clark, Luke
    ADDICTION, 2020, 115(06): 1127-1138
  • [27] Orchestration Load Indicators and Patterns: In-the-Wild Studies Using Mobile Eye-Tracking
    Prieto, Luis P.
    Sharma, Kshitij
    Kidzinski, Lukasz
    Dillenbourg, Pierre
    IEEE TRANSACTIONS ON LEARNING TECHNOLOGIES, 2018, 11(02): 216-229
  • [28] Development of a novel visuomotor integration paradigm by integrating a virtual environment with mobile eye-tracking and motion-capture systems
    Miller, Haylie L.
    Bugnariu, Nicoleta
    Patterson, Rita M.
    Wijayasinghe, Indika
    Popa, Dan O.
    2017 INTERNATIONAL CONFERENCE ON VIRTUAL REHABILITATION (ICVR), 2017
  • [29] Quantifying the costs of interruption during diagnostic radiology interpretation using mobile eye-tracking glasses
    Drew, Trafton
    Williams, Lauren H.
    Aldred, Booth
    Heilbrun, Marta E.
    Minoshima, Satoshi
    JOURNAL OF MEDICAL IMAGING, 2018, 5(03)
  • [30] Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking
    Backhaus, Daniel
    Engbert, Ralf
    Rothkegel, Lars O. M.
    Trukenbrod, Hans A.
    JOURNAL OF VISION, 2020, 20(05)