Publications

You can also find my articles on my Google Scholar profile.

Journal Articles

Speech analysis of teaching assistant interventions in small group collaborative problem solving with undergraduate engineering students

Published in British Journal of Educational Technology, 2024

This descriptive study focuses on using voice activity detection (VAD) algorithms to extract student speech data in order to better understand the collaboration of small group work and the impact of teaching assistant (TA) interventions in undergraduate engineering discussion sections. Audio data were recorded from individual students wearing head-mounted noise-cancelling microphones. Video data of each student group were manually coded for collaborative behaviours (e.g., group task relatedness, group verbal interaction and group talk content) of students and TA–student interactions. The analysis includes information about the turn taking, overall speech duration patterns and amounts of overlapping speech observed both when TAs were intervening with groups and when they were not. We found that TAs very rarely provided explicit support regarding collaboration. Key speech metrics, such as amount of turn overlap and maximum turn duration, revealed important information about the nature of student small group discussions and TA interventions. TA interactions during small group collaboration are complex and require nuanced treatments when considering the design of supportive tools.
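
For readers who want a concrete sense of the speech metrics reported here, below is a minimal sketch of how turn overlap and maximum turn duration can be computed once a VAD has produced per-speaker turn intervals. The data format and the example group are hypothetical stand-ins, not the study's actual pipeline.

```python
def turn_metrics(turns_by_speaker):
    """turns_by_speaker: dict mapping speaker ID to a list of (start, end) times in seconds."""
    all_turns = [t for turns in turns_by_speaker.values() for t in turns]
    max_turn = max(end - start for start, end in all_turns)
    total_speech = sum(end - start for start, end in all_turns)

    # Overlapping speech: summed time where turns of *different* speakers intersect.
    overlap = 0.0
    speakers = list(turns_by_speaker)
    for i, a in enumerate(speakers):
        for b in speakers[i + 1:]:
            for s1, e1 in turns_by_speaker[a]:
                for s2, e2 in turns_by_speaker[b]:
                    overlap += max(0.0, min(e1, e2) - max(s1, s2))
    metrics = {"max_turn_s": max_turn, "total_speech_s": total_speech, "overlap_s": overlap}
    return {k: round(v, 2) for k, v in metrics.items()}

# Hypothetical two-student group; a VAD would normally produce these intervals.
group = {
    "s1": [(0.0, 4.2), (6.0, 7.5)],
    "s2": [(3.8, 6.1)],
}
print(turn_metrics(group))  # {'max_turn_s': 4.2, 'total_speech_s': 8.0, 'overlap_s': 0.5}
```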

Recommended citation: C. M. D’Angelo and R. J. Rajarathinam, ‘Speech analysis of teaching assistant interventions in small group collaborative problem solving with undergraduate engineering students’, British Journal of Educational Technology, vol. 55, no. 4, pp. 1583–1601, 2024.
Download Paper

Unveiling joint attention dynamics: Examining multimodal engagement in an immersive collaborative astronomy simulation

Published in Computers & Education, 2024

Numerous computer-based collaborative learning environments have been developed to support collaborative problem-solving. Yet, understanding the complexity and dynamic nature of the collaboration process remains a challenge. This is particularly true in open-ended immersive learning environments, where students navigate both physical and virtual spaces, pursuing diverse paths to solve problems. In response, we aimed to unpack these complex collaborative learning processes by investigating 16 groups of college students (n = 77) who utilized an immersive astronomy simulation in their introductory astronomy course. Our specific focus is on joint attention as a multi-level indicator to index collaboration. To examine the interplay between joint attention and other multimodal traces (conceptual discussions and gestures) in students' interactions with peers and the simulation, we employed a multi-granular approach. This approach encompasses macro-level correlations, meso-level network trends, and micro-level qualitative insights from vignettes to capture nuances at different levels. Distinct multimodal engagement patterns emerged between low- and high-achieving groups, evolving over time across a series of tasks. Our findings contribute to the understanding of the notion of timely joint attention and emphasize the importance of individual exploration during the early stages of collaborative problem-solving, demonstrating its contribution to productive knowledge co-construction. Overall, this research provides valuable insights into the complexities of collaboration dynamics within and beyond digital space. The empirical evidence we present lays a strong foundation for developing instructional designs aimed at fostering productive collaboration in immersive learning environments.
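
As a purely conceptual sketch of how joint attention can be indexed, the snippet below marks a time step as joint attention whenever at least two group members attend to the same target. The gaze-log format, target labels, and two-person threshold are hypothetical simplifications of the paper's multi-granular, multimodal approach.

```python
from collections import Counter

def joint_attention_ratio(gaze_logs, min_shared=2):
    """gaze_logs: list of per-student target sequences, one target (or None) per time step."""
    n_steps = len(gaze_logs[0])
    joint = 0
    for t in range(n_steps):
        counts = Counter(log[t] for log in gaze_logs if log[t] is not None)
        if counts and counts.most_common(1)[0][1] >= min_shared:
            joint += 1
    return joint / n_steps

# Hypothetical 3-student group looking at simulation objects over 6 time steps.
logs = [
    ["sun", "sun",  "moon", None,   "mars", "mars"],
    ["sun", "moon", "moon", "mars", "mars", "sun"],
    ["ui",  "ui",   "moon", "mars", "ui",   "mars"],
]
print(joint_attention_ratio(logs))  # 0.833... -> shared attention in 5 of 6 steps
```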

Recommended citation: J. Kang, Y. Zhou, R. J. Rajarathinam, Y. Tan, and D. W. Shaffer, ‘Unveiling joint attention dynamics: Examining multimodal engagement in an immersive collaborative astronomy simulation’, Computers & Education, vol. 213, p. 105002, 2024.
Download Paper

Gesture-mediated collaboration with augmented reality headsets in a problem-based astronomy task

Published in International Journal of Computer-Supported Collaborative Learning, 2023

Extended reality technologies such as headset-based augmented reality (AR) unlock unique opportunities to integrate gestures into the collaborative problem-solving process. The following qualitative study documents the collection and analysis of group interaction data in an astronomy sky simulation across AR and tablet technologies in a classroom setting. A total of 15 groups were coded for episodes of on-task problem-solving, conceptual engagement, and use of gesture. Analysis of coded interactions assisted in identifying vignettes facilitating exploration, orientation, perspective sharing, and communication of mental models. In addition, the use of gesture by some groups enabled the creation of shared situated conceptual spaces, bridging the AR and tablet experiences and facilitating collaborative exchange of spatial information. The patterns of gesture and collaborative knowledge interactions documented here have implications for the design of future collaborative learning environments leveraging extended reality technologies.

Recommended citation: J. Planey, R. J. Rajarathinam, E. Mercier, and R. Lindgren, ‘Gesture-mediated collaboration with augmented reality headsets in a problem-based astronomy task’, International Journal of Computer-Supported Collaborative Learning, vol. 18, no. 2, pp. 259–289, Jun. 2023.
Download Paper

Conference Papers

Enhancing Multimodal Learning Analytics: A Comparative Study of Facial Features Captured Using Traditional vs 360-Degree Cameras in Collaborative Learning

Published in Proceedings of the 17th International Conference on Educational Data Mining, 2024

Multimodal Learning Analytics (MMLA) has emerged as a powerful approach within the computer-supported collaborative learning community, offering nuanced insights into learning processes through diverse data sources. Despite its potential, the prevalent reliance on traditional instruments such as tripod-mounted digital cameras for video capture often results in suboptimal data quality for captured facial expressions, which is crucial for understanding collaborative dynamics. This study introduces an innovative approach to overcome this limitation by employing 360-degree camera technology to capture students' facial features while collaborating in small working groups. A comparative analysis of 1.5 hours of video data from both traditional tripod-mounted digital cameras and 360-degree cameras evaluated the efficacy of these methods in capturing facial action units (AUs) and facial keypoints. The use of OpenFace revealed that the 360-degree camera captured high-quality facial features in 33.17% of frames, significantly outperforming the traditional method's 8.34%, thereby enhancing reliability in facial feature detection. The findings suggest a pathway for future research to integrate 360-degree camera technology in MMLA. Future research directions involve refining this technology further to improve the detection of affective states in collaborative learning environments, thereby offering a richer understanding of the learning process.
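
The frame-quality comparison can be reproduced in spirit from OpenFace's per-frame CSV output, which includes `confidence` and `success` columns. In the sketch below, the 0.80 confidence threshold and the file name are illustrative assumptions; the paper's exact criterion may differ.

```python
import pandas as pd

# Load OpenFace FeatureExtraction output (hypothetical file name).
df = pd.read_csv("openface_output.csv")
df.columns = df.columns.str.strip()  # OpenFace pads column names with spaces

# Assumed quality criterion: successful detection with confidence >= 0.80.
usable = (df["success"] == 1) & (df["confidence"] >= 0.80)
print(f"High-quality frames: {100 * usable.mean():.2f}% of {len(df)}")
```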

Recommended citation: R. J. Rajarathinam, C. Palaguachi, and J. Kang, ‘Enhancing Multimodal Learning Analytics: A Comparative Study of Facial Features Captured Using Traditional vs 360-Degree Cameras in Collaborative Learning’, in Proceedings of the 17th International Conference on Educational Data Mining, 2024, pp. 551–558.
Download Paper

Turn-taking analysis of small group collaboration in an engineering discussion classroom

Published in LAK23: 13th International Learning Analytics and Knowledge Conference, 2023

This preliminary study focuses on using voice activity detection (VAD) algorithms to extract turn information for small group work from individual audio streams recorded in undergraduate engineering discussion sections. Video data, along with audio, were manually coded for students' collaborative behavior and teacher-student interaction. We found that individual audio data can be used to obtain features that describe group work in noisy classrooms. We observed patterns in student turn-taking and talk duration during various segments of the class session that matched the video-coded data. Results show that high-quality individual audio data can be effective in describing collaborative processes that occur in the classroom. Future directions for using prosodic features, and implications for how we can conceptualize collaborative group work using audio data, are discussed.
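
As a rough illustration of the turn-extraction step, here is a minimal sketch built on the open-source `webrtcvad` package; the frame size, aggressiveness level, and gap-merging threshold are illustrative choices, not the settings used in the study.

```python
import webrtcvad

def extract_turns(pcm16, sample_rate=16000, frame_ms=30, max_gap_s=0.5):
    """pcm16: mono 16-bit PCM bytes for one student. Returns (start, end) turns in seconds."""
    vad = webrtcvad.Vad(2)  # aggressiveness from 0 (permissive) to 3 (strict)
    frame_bytes = int(sample_rate * frame_ms / 1000) * 2  # 2 bytes per 16-bit sample
    turns, cur = [], None
    for i in range(0, len(pcm16) - frame_bytes + 1, frame_bytes):
        t = (i / 2) / sample_rate  # frame start time in seconds
        if vad.is_speech(pcm16[i:i + frame_bytes], sample_rate):
            if cur and t - cur[1] <= max_gap_s:
                cur[1] = t + frame_ms / 1000  # extend the current turn across a short gap
            else:
                if cur:
                    turns.append(tuple(cur))
                cur = [t, t + frame_ms / 1000]  # start a new turn
    if cur:
        turns.append(tuple(cur))
    return turns
```

Running something like this over each student's recording yields the per-speaker turn intervals from which group-level metrics such as overlap and turn duration can then be derived.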

Recommended citation: R. J. Rajarathinam and C. M. D’Angelo, ‘Turn-taking analysis of small group collaboration in an engineering discussion classroom’, in LAK23: 13th International Learning Analytics and Knowledge Conference, Arlington, TX, USA, 2023, pp. 650–656.
Download Paper