Graph Neural Networks (GNN) and Long Short-Term Memory (LSTM) for Forecasting Learner Attrition: A Systematic Review

Authors

  • Chinedu Cory Otuya, National Open University of Nigeria, Nigeria, https://orcid.org/0009-0005-9118-091X
  • Afolayan Ayodele Obiniyi, Federal University Lokoja, Nigeria
  • Joseph Sunday Igwe, Ebonyi State University, Nigeria

DOI:

https://doi.org/10.64539/sjcs.v2i2.2026.453

Keywords:

Learner attrition, Deep learning, Graph neural networks, Long short-term memory, PRISMA

Abstract

Learner attrition is a long-standing problem in Open and Distance Learning (ODL) settings, where the lack of physical interaction, combined with flexible pacing, exacerbates the risk of disengagement. Deep Learning (DL) techniques are increasingly used to forecast the complex behavioural and relational patterns found in educational data. While DL offers superior accuracy in forecasting attrition, selecting techniques whose inductive biases match the temporal sequences and relational structures of ODL data remains a critical gap. This paper performs a systematic review, guided by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework, to synthesize the current body of knowledge on the use of Long Short-Term Memory (LSTM) and Graph Neural Network (GNN) models for forecasting attrition. Peer-reviewed articles were located in major digital databases and filtered against predetermined inclusion and exclusion criteria. The review evaluated model archetypes, data properties, evaluation metrics, and performance results. Results showed that LSTM models were better suited to learning temporal engagement patterns, whereas GNN models were effective at capturing relational and social learning structures. Nevertheless, differences in datasets, validation procedures, and evaluation metrics made direct comparison of results difficult. The study identified the methodological limitations of single-model approaches and recommended hybrid methods for improved accuracy. The review consolidates information to guide researchers and institutions in selecting suitable hybrid deep learning models for forecasting learner attrition.
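The hybrid direction recommended above, combining GNN-style relational aggregation with LSTM-style temporal modelling, can be sketched as a single forward pass: each week, a learner's engagement features are smoothed over their interaction graph, then fed into an LSTM cell that accumulates history before a final attrition-risk score. The NumPy sketch below is illustrative only; the toy adjacency matrix, feature dimensions, and untrained random weights are assumptions for demonstration, not a model from any reviewed study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting (all values illustrative): 4 learners, 3 weeks of engagement
# features, plus a forum-interaction graph given as an adjacency matrix.
num_learners, num_weeks, feat_dim, hid_dim = 4, 3, 2, 5
X = rng.normal(size=(num_learners, num_weeks, feat_dim))   # temporal features
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)                  # who interacts with whom

def gcn_layer(A, H, W):
    """One GCN-style propagation step: symmetric-normalised neighbour averaging."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.tanh(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, Wx, Wh, b):
    """A single LSTM cell step; gates are stacked as [input, forget, cell, output]."""
    z = x @ Wx + h @ Wh + b
    i, f, g, o = np.split(z, 4, axis=-1)
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

# Randomly initialised (untrained) parameters -- a shape sketch only.
W_gcn = rng.normal(scale=0.1, size=(feat_dim, feat_dim))
Wx = rng.normal(scale=0.1, size=(feat_dim, 4 * hid_dim))
Wh = rng.normal(scale=0.1, size=(hid_dim, 4 * hid_dim))
b = np.zeros(4 * hid_dim)
w_out = rng.normal(scale=0.1, size=hid_dim)

h = np.zeros((num_learners, hid_dim))
c = np.zeros((num_learners, hid_dim))
for t in range(num_weeks):
    H_t = gcn_layer(A, X[:, t, :], W_gcn)   # relational smoothing per week
    h, c = lstm_step(H_t, h, c, Wx, Wh, b)  # temporal accumulation per learner

attrition_risk = sigmoid(h @ w_out)          # one probability per learner
print(attrition_risk.shape)                  # (4,)
```

In a trained model the GCN and LSTM parameters would be learned jointly against observed dropout labels; the sketch only shows how the relational and temporal components compose.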

Published

2026-04-18

How to Cite

Otuya, C. C., Obiniyi, A. A., & Igwe, J. S. (2026). Graph Neural Networks (GNN) and Long Short-Term Memory (LSTM) for Forecasting Learner Attrition: A Systematic Review. Scientific Journal of Computer Science, 2(2), 193–202. https://doi.org/10.64539/sjcs.v2i2.2026.453