
Supervised Machine Learning in Multimodal Learning Analytics for Estimating Success in Project-based Learning

Ruffaldi, E.; Dabisias, G.
2018-01-01

Abstract

Multimodal learning analytics provides researchers with new tools and techniques to capture different types of data from complex learning activities in dynamic learning environments. This paper investigates high-fidelity, synchronised multimodal recordings of small groups of learners interacting, captured from diverse sources that include computer vision, user-generated content, and data from the learning objects (physical computing components). We processed the recordings and extracted different aspects of the students' interactions to answer the following question: which features of student group work are good predictors of team success in open-ended tasks with physical computing? To answer this question, we explored different supervised machine learning approaches (traditional and deep learning techniques) to analyse the data coming from multiple sources. The results illustrate that state-of-the-art computational techniques can be used to generate insights into the "black box" of learning in students' project-based activities. The features identified from the analysis show that the distance between learners' hands and faces is a strong predictor of the quality of students' artefacts, which can indicate the value of student collaboration. Our research shows that both newer approaches such as neural networks and more traditional regression approaches can be used to classify MMLA data, and that each has advantages and disadvantages depending on the research questions and contexts being investigated. The work presented here is a significant contribution towards developing techniques to automatically identify the key aspects of students' success in project-based learning environments, and ultimately to help teachers provide appropriate and timely support to students in these fundamental aspects.
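The record does not include the authors' code or data. Purely as an illustration of the modelling comparison the abstract describes (a traditional regression baseline versus a small neural network, trained on multimodal features such as hand and face distances to predict artefact quality), the sketch below uses synthetic, hypothetical data and scikit-learn; every feature name, value, and model choice is an assumption, not the authors' pipeline.

# Minimal sketch, assuming scikit-learn and synthetic per-group features.
# Nothing here reproduces the paper's dataset or results; it only shows the
# general shape of comparing a regression baseline with a neural network.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_groups = 200

# Hypothetical per-group features aggregated over a session:
#   hand_dist  - mean distance between learners' hands (shared manipulation)
#   face_dist  - mean distance between learners' faces (joint attention)
#   edit_count - number of edits to the physical-computing program
hand_dist = rng.normal(0.4, 0.15, n_groups)
face_dist = rng.normal(0.8, 0.25, n_groups)
edit_count = rng.poisson(30, n_groups)
X = np.column_stack([hand_dist, face_dist, edit_count])

# Synthetic label: 1 = high artefact quality. Smaller hand/face distances
# (closer collaboration) make a high-quality artefact more likely here.
logits = -3.0 * hand_dist - 1.5 * face_dist + 0.02 * edit_count + 1.0
y = (rng.random(n_groups) < 1 / (1 + np.exp(-logits))).astype(int)

models = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression()),
    "small neural network": make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    ),
}

# Five-fold cross-validation gives a rough accuracy estimate for each model.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")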

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11382/521512

Citations
  • Scopus 118