Preprint Article, Version 1 (this version is not peer-reviewed)

Designing a Holistic Student Evaluation Model for E-Learning Using a Multi-Task Attention-Based Deep Learning Approach

Version 1 : Received: 9 September 2024 / Approved: 10 September 2024 / Online: 10 September 2024 (15:06:03 CEST)

How to cite: Olaniyan, D.; Olaniyan, J.; Obagbuwa, I. C.; Micheal, E. B.; Bernard, O. P. Designing a Holistic Student Evaluation Model for E-Learning Using a Multi-Task Attention-Based Deep Learning Approach. Preprints 2024, 2024090760. https://doi.org/10.20944/preprints202409.0760.v1

Abstract

This study presents the development and evaluation of a Multi-Task Long Short-Term Memory (LSTM) model with an Attention Mechanism designed to predict students' academic performance. The model concurrently addresses two tasks: predicting overall performance (total score) as a regression task and categorizing performance levels (remarks) as a classification task. By handling both tasks in a single network, the model improves computational efficiency and resource use. The dataset includes detailed student performance records across metrics such as Continuous Assessment, Practical Skills, Demeanor, Presentation Quality, Attendance, and Participation. The model's performance was evaluated using comprehensive metrics. For the regression task, it achieved a Mean Absolute Error (MAE) of 0.0249, a Mean Squared Error (MSE) of 0.0012, and a Root Mean Squared Error (RMSE) of 0.0346. For the classification task, it attained perfect scores, with accuracy, precision, recall, and F1 score all equal to 1.0. These results highlight the model's high accuracy and robustness in predicting both continuous and categorical outcomes. The Attention Mechanism further strengthens the model by identifying and focusing on the most relevant features. This study demonstrates the effectiveness of the Multi-Task LSTM with Attention Mechanism in educational data analysis, offering a reliable tool for predicting student performance, with potential for broader application in similar multi-task learning contexts. Future work will explore further enhancements and wider applications to improve predictive accuracy and efficiency.
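To make the described architecture concrete, the sketch below shows one plausible way to wire a shared LSTM encoder, an attention layer, and two task heads (regression for total score, classification for remarks) in PyTorch. The layer sizes, number of remark classes, sequence length, and training loop are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of a multi-task LSTM with additive attention (PyTorch).
# Assumptions: 6 input metrics per assessment, 4 remark classes, hidden size 64.
import torch
import torch.nn as nn

class MultiTaskAttentionLSTM(nn.Module):
    def __init__(self, n_features=6, hidden=64, n_classes=4):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)  # shared encoder
        self.attn = nn.Linear(hidden, 1)                 # attention scores per time step
        self.reg_head = nn.Linear(hidden, 1)             # total score (regression)
        self.cls_head = nn.Linear(hidden, n_classes)     # remark category (classification)

    def forward(self, x):                                # x: (batch, seq_len, n_features)
        h, _ = self.lstm(x)                              # (batch, seq_len, hidden)
        weights = torch.softmax(self.attn(h), dim=1)     # attention weights over time steps
        context = (weights * h).sum(dim=1)               # attention-weighted summary vector
        return self.reg_head(context).squeeze(-1), self.cls_head(context)

# Joint training: the two task losses are summed so both objectives are
# optimized concurrently through the shared encoder.
model = MultiTaskAttentionLSTM()
x = torch.randn(8, 10, 6)                                # 8 students, 10 assessments, 6 metrics
score_pred, remark_logits = model(x)
loss = nn.MSELoss()(score_pred, torch.rand(8)) + \
       nn.CrossEntropyLoss()(remark_logits, torch.randint(0, 4, (8,)))
loss.backward()
```

In this kind of design, sharing the encoder lets the regression and classification heads reuse the same attention-weighted representation, which is the source of the computational savings the abstract refers to.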

Keywords

e-learning; student; performance; multi-task; deep learning; attention mechanism

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
