| dc.description.abstract |
The "EyeQ: Analyzing Quiz Difficulty with Retina Insights" system represents a significant advancement in the realm of educational assessment. Traditional methods of assessing students' comprehension levels often fall short in accurately gauging question difficulty and providing timely feedback to educators. To address this challenge, our project introduces a novel approach that leverages eye-tracking technology integrated with mobile devices to monitor students' eye movements during in-class formative assessments. Students complete assessments on mobile phones while the system tracks where they look. Built with Python and React Native, the system examines whether patterns in students' eye movements vary with the difficulty of assessment questions, using libraries such as TensorFlow, PyTorch, and OpenCV to analyze the eye-movement data. This work contributes to the enhancement of undergraduate teaching practices and the improvement of students' eLearning environments.
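A common first step in turning raw gaze samples into analyzable eye-movement patterns is grouping them into fixations. The sketch below shows a minimal dispersion-threshold (I-DT) fixation detector in plain Python; the function name, thresholds, and coordinate units are illustrative assumptions, not taken from the EyeQ implementation.

```python
# Hypothetical sketch: dispersion-threshold (I-DT) fixation detection,
# a standard preprocessing step in gaze analysis. Thresholds here are
# arbitrary illustrative values in screen pixels.

def detect_fixations(gaze, max_dispersion=25.0, min_samples=5):
    """Group consecutive (x, y) gaze samples into fixations.

    A window counts as a fixation when its spatial dispersion
    (x-range + y-range) stays under max_dispersion and it spans
    at least min_samples points.
    Returns a list of (start_index, end_index, centroid) tuples.
    """
    fixations = []
    i, n = 0, len(gaze)
    while i + min_samples <= n:
        j = i + min_samples
        xs = [p[0] for p in gaze[i:j]]
        ys = [p[1] for p in gaze[i:j]]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= max_dispersion:
            # Grow the window while dispersion stays small.
            while j < n:
                xs.append(gaze[j][0])
                ys.append(gaze[j][1])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                    xs.pop()
                    ys.pop()
                    break
                j += 1
            centroid = (sum(xs) / len(xs), sum(ys) / len(ys))
            fixations.append((i, j, centroid))
            i = j
        else:
            i += 1
    return fixations
```

Fixation counts and durations derived this way are typical inputs for downstream difficulty analysis; a question that triggers many long fixations on the same region may be read differently than one scanned quickly.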
Through the analysis of eye movement patterns, our system aims to provide educators with real-time insights into the difficulty levels of questions posed to students. By employing machine learning algorithms, we develop a predictive model capable of accurately assessing question difficulty based on students' eye movements. This real-time feedback empowers educators to adapt their teaching methods on the fly, leading to more effective and equitable formative assessments in undergraduate education.
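The shape of such a predictive model can be sketched with a toy example. Below, a tiny logistic-regression classifier is trained by gradient descent on simple per-question gaze features; the feature choice, synthetic data, and pure-Python training loop are hypothetical stand-ins for the project's TensorFlow/PyTorch models, shown only to illustrate the mapping from eye-movement features to a difficulty label.

```python
# Hypothetical sketch: logistic regression mapping gaze features
# (e.g. normalized fixation count, mean fixation duration) to a
# binary difficulty label. Not the project's actual model.
import math

def train_difficulty_model(features, labels, lr=0.1, epochs=500):
    """Fit weights for P(question is hard) with stochastic gradient descent."""
    w = [0.0] * len(features[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid
            err = p - y                       # gradient of the log loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict_hard(x, w, b):
    """True if the model rates the question 'hard' (P > 0.5)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z)) > 0.5

# Toy data: [normalized fixation count, normalized mean fixation duration]
X = [[0.1, 0.2], [0.2, 0.1], [0.8, 0.9], [0.9, 0.7]]
y = [0, 0, 1, 1]   # 0 = easy, 1 = hard
w, b = train_difficulty_model(X, y)
```

In practice the real-time feedback loop would run such a model on features streamed from the mobile app and surface per-question difficulty estimates to the educator's dashboard.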
Key components of our project include the design of a user-friendly mobile application, rigorous testing and validation procedures, ethical handling of data, scalability considerations, iterative improvement strategies, and knowledge dissemination efforts. By addressing the critical need for objective and timely |
en_US |