Model and algorithm for neural network-based gaze fixation duration determination for dynamic content adaptation in inclusive learning

Author(s)

DOI:

https://doi.org/10.18664/ikszt.v31i1.357674

Keywords:

multimodal systems, assistive technologies, context-aware interaction, people with disabilities, personalized interfaces

Abstract

The relevance of this study stems from the fact that most existing inclusive educational technologies provide only basic access to content but do not ensure its effective mastery. In practice, they rely on static personalization grounded in a pre-defined user profile (e.g., impairment type). The key shortcoming of this approach is its blindness to the dynamics of the actual learning process: the system cannot recognize when a learner experiences a sudden spike in cognitive load or encounters difficulty understanding a specific term or formula. This creates a barrier that leads to frustration, loss of motivation, and ultimately superficial learning.
The objective of the work is to develop and substantiate an innovative mechanism for real-time dynamic adaptation of educational content that responds to learners' cognitive difficulties as they occur.
The study is based on designing a system that uses eye-tracking to monitor user interaction with digital instructional materials. The primary biometric indicator is the fixation duration on specific semantic units (text fragments, images, formulas).
This work is the first to propose an adaptation mechanism grounded in the hypothesis that prolonged gaze fixation is a reliable marker of comprehension difficulty or elevated cognitive load. Upon detection of an extended fixation, the system automatically triggers an adaptive scenario and offers targeted, context-specific support (e.g., term definitions, simplified explanations, visual examples) precisely for the unit that elicited difficulty. This approach enables a shift from macro-personalization (profile level) to micro-personalization (real time), thereby creating a more responsive and effective inclusive learning environment.
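To make the trigger idea concrete, the sketch below shows a minimal, hypothetical threshold rule over areas of interest in Python. It is not the neural network model proposed in the article: the unit names, the 1.5 s threshold, and the callback are assumptions chosen only to illustrate firing context-specific support once fixation on a semantic unit exceeds a limit.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional, Set


@dataclass
class SemanticUnit:
    """A content region (text fragment, image, formula) treated as an area of interest."""
    unit_id: str
    x: float   # bounding box in normalized screen coordinates
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h


@dataclass
class FixationMonitor:
    """Accumulates fixation time per semantic unit and fires a support callback
    once the accumulated duration exceeds a threshold (value is illustrative)."""
    units: Dict[str, SemanticUnit]
    on_difficulty: Callable[[str], None]
    threshold_s: float = 1.5                  # illustrative threshold, not taken from the article
    _current: Optional[str] = None
    _accum_s: float = 0.0
    _fired: Set[str] = field(default_factory=set)

    def feed(self, gx: float, gy: float, dt: float) -> None:
        """Process one gaze sample (gx, gy) observed for dt seconds."""
        hit = next((u.unit_id for u in self.units.values() if u.contains(gx, gy)), None)
        if hit != self._current:              # gaze moved to another unit: reset the accumulator
            self._current, self._accum_s = hit, 0.0
        if hit is None:
            return
        self._accum_s += dt
        if self._accum_s >= self.threshold_s and hit not in self._fired:
            self._fired.add(hit)              # offer support only once per unit
            self.on_difficulty(hit)


# Usage: offer a simplified explanation for the unit that elicited difficulty.
units = {"formula_3": SemanticUnit("formula_3", 0.2, 0.4, 0.5, 0.1)}
monitor = FixationMonitor(units, on_difficulty=lambda uid: print(f"show hint for {uid}"))
for _ in range(60):                           # ~1.8 s of simulated gaze samples on the formula
    monitor.feed(0.4, 0.45, 0.03)
```

In this sketch the accumulator resets whenever the gaze leaves a unit, and support is offered at most once per unit, so brief glances do not trigger intervention; the article's contribution is determining fixation duration with a neural network rather than with such a fixed rule.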

Author Biography

Олеся Юріївна Барковська, Kharkiv National University of Radio Electronics

Candidate of Technical Sciences, Associate Professor at the Department of Electronic Computers


Published

2026-04-27