What is Eye Tracking
Eye tracking, sometimes called gaze interaction, is the process of capturing eye movements and measuring eye activity. An eye tracker detects and measures the position of the eyes, where an individual is looking (known as the gaze point), and how the gaze position changes over time. Eye tracking is advancing Health AI, with groundbreaking applications in neurology, ophthalmology, augmentative and alternative communication (AAC), interactive education, and medical and consumer research.
What Can Eye Tracking Tell Us?
Eye tracking is useful for studying a person's gaze behavior, visual attention, and physiological responses. Eye movement data offers a way to trace what the eyes are looking at, how we visually explore stimuli, when and why the eyes become fixated on an object, and more. The captured data helps reveal visibility, engagement, and viewing patterns, and can provide insights for varied applications, such as the study of neurodegenerative disorders.
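To make the idea of "when the eyes become fixated" concrete, here is a minimal sketch of dispersion-based fixation detection: consecutive gaze samples that stay within a small spatial region for long enough are grouped into a fixation. The thresholds and the sample format (pixel coordinates at a fixed sampling rate) are illustrative assumptions, not details from the text.

```python
def _dispersion(window):
    """Spatial spread of a window of (x, y) samples: (max-min x) + (max-min y)."""
    xs = [x for x, _ in window]
    ys = [y for _, y in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=25.0, min_length=5):
    """Group consecutive gaze samples into fixations.

    A window counts as a fixation when its dispersion stays below
    max_dispersion (pixels) for at least min_length samples.
    Returns a list of (start_index, end_index, centroid) tuples.
    """
    fixations = []
    start = 0
    while start + min_length <= len(samples):
        end = start + min_length
        if _dispersion(samples[start:end]) <= max_dispersion:
            # Grow the window while the gaze stays put.
            while end < len(samples) and _dispersion(samples[start:end + 1]) <= max_dispersion:
                end += 1
            window = samples[start:end]
            cx = sum(x for x, _ in window) / len(window)
            cy = sum(y for _, y in window) / len(window)
            fixations.append((start, end - 1, (cx, cy)))
            start = end
        else:
            start += 1  # slide past the saccade sample
    return fixations
```

Fed a trace with two stable clusters separated by a jump, this returns two fixations with their centroids; everything between them is treated as a saccade.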
What is Eye Tracking Used For?
Eye tracking can be used to operate an on-screen mouse or keyboard, and even to control a wheelchair, using only the eyes. Eye movement recordings are translated into mouse commands, enabling touchless interaction with a device. Beyond simple commands, eye control brings independent mobility to people living with disabilities.
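As a rough sketch of how eye movements might be translated into mouse commands, the class below smooths the noisy gaze signal with an exponential moving average and fires a "click" when the gaze dwells in one spot. The class name, smoothing weight, and dwell thresholds are illustrative assumptions, not details of any particular eye-control product.

```python
class GazeMouse:
    """Turn raw gaze samples into a smoothed cursor position plus dwell clicks."""

    def __init__(self, smoothing=0.3, dwell_samples=30, dwell_radius=40.0):
        self.smoothing = smoothing          # EMA weight given to each new sample
        self.dwell_samples = dwell_samples  # consecutive samples needed for a click
        self.dwell_radius = dwell_radius    # pixels the gaze may wander while dwelling
        self.cursor = None                  # current smoothed cursor position
        self._anchor = None                 # where the current dwell started
        self._count = 0

    def update(self, gaze_x, gaze_y):
        """Feed one gaze sample; return (cursor_position, clicked)."""
        if self.cursor is None:
            self.cursor = (gaze_x, gaze_y)
        else:
            a = self.smoothing
            self.cursor = (
                a * gaze_x + (1 - a) * self.cursor[0],
                a * gaze_y + (1 - a) * self.cursor[1],
            )
        # Dwell detection: count consecutive samples near the dwell anchor.
        if self._anchor is None or self._dist(self._anchor, self.cursor) > self.dwell_radius:
            self._anchor = self.cursor
            self._count = 0
        self._count += 1
        clicked = self._count == self.dwell_samples
        return self.cursor, clicked

    @staticmethod
    def _dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
```

Real eye-control software layers more on top (zoomed selection, blink filtering, on-screen keyboards), but the same smooth-then-dwell loop is the core of touchless pointing.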
This technology can also offer unique insights into a person's attention, focus, emotional state, mental engagement, and cognitive functions. Eye tracking has endless applications, including e-learning, automotive infotainment systems, market research, advertising, biometrics security, healthcare, and more.
How Does Eye Tracking Work?
Our trackers include infrared light sources, a single camera, and image processing algorithms. During the calibration process, the camera takes a series of high-resolution images of the eyes as they follow a stimulus. The eyes are illuminated with infrared lights, creating a glint on each cornea. The camera records the corneal reflection patterns created by the lights, and our image processing algorithms then compute the relative distance between the glint and the pupil, from which the gaze point is estimated.
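The calibration step above can be sketched in code: while the user looks at known on-screen targets, the tracker records the vector from the glint to the pupil centre, then fits a mapping from that vector to screen coordinates. A simple per-axis linear least-squares fit stands in here for the vendor's unspecified image-processing algorithms; the function names and the linear model are assumptions for illustration.

```python
def fit_axis(vectors, targets):
    """Least-squares fit of target = a * vector + b along one axis."""
    n = len(vectors)
    mean_v = sum(vectors) / n
    mean_t = sum(targets) / n
    cov = sum((v - mean_v) * (t - mean_t) for v, t in zip(vectors, targets))
    var = sum((v - mean_v) ** 2 for v in vectors)
    a = cov / var
    b = mean_t - a * mean_v
    return a, b

def calibrate(pupil_glint_vectors, screen_points):
    """Fit x and y mappings from calibration data.

    pupil_glint_vectors: (vx, vy) pupil-minus-glint offsets in the eye image.
    screen_points: the known (sx, sy) targets shown during calibration.
    Returns a function mapping a new (vx, vy) to an estimated gaze point.
    """
    ax, bx = fit_axis([v[0] for v in pupil_glint_vectors],
                      [s[0] for s in screen_points])
    ay, by = fit_axis([v[1] for v in pupil_glint_vectors],
                      [s[1] for s in screen_points])
    return lambda vx, vy: (ax * vx + bx, ay * vy + by)
```

After calibrating on a handful of targets, new pupil–glint vectors can be mapped straight to estimated gaze points; production trackers use richer models (polynomial or 3D eye models) but the calibrate-then-map structure is the same.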