High-precision eye movement tracking is finally beginning to attract more attention as a method that allows us to understand those “instantaneous, unconscious reactions” and makes the “indecision leading up to a decision” observable. The FOVE VR head-mounted display uses micro-cameras positioned near the eyes to track actual eye movement with a precision of one degree.
SOOTH has accumulated eye movement tracking logs from more than 300 people as they watched videos. This data was collected for the purpose of analyzing TV commercial viewing, validating screen layouts using A/B testing, and visualizing eye movement responses when walking during VR experiences, among other things.
When the aggregated results are opened in the dashboard, the eye movement logs from test subjects are shown as a heat map overlaid on the video playing at 30 fps. This mapping makes it clear that viewers’ visual focus is guided by elements such as the movement of people in the video and editing cuts.
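The per-frame heat-map overlay described above can be sketched roughly as follows. This is a minimal illustration, not SOOTH's actual pipeline: it assumes a hypothetical gaze log with one normalized (x, y) sample per subject per frame, accumulates the samples into a coarse 2-D histogram, and blurs it for display.

```python
import numpy as np

# Hypothetical gaze log: one (x, y) sample per subject per video frame,
# in normalized screen coordinates [0, 1]. Grid size is illustrative.
WIDTH, HEIGHT = 64, 36  # coarse heat-map grid, not the video resolution

def frame_heatmap(gaze_points, sigma=1.5):
    """Accumulate one frame's gaze points into a blurred, normalized 2-D grid."""
    grid = np.zeros((HEIGHT, WIDTH))
    for x, y in gaze_points:
        col = min(int(x * WIDTH), WIDTH - 1)
        row = min(int(y * HEIGHT), HEIGHT - 1)
        grid[row, col] += 1
    # Simple separable Gaussian blur (no SciPy dependency)
    kernel = np.exp(-0.5 * (np.arange(-4, 5) / sigma) ** 2)
    kernel /= kernel.sum()
    grid = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, grid)
    grid = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, grid)
    return grid / grid.max() if grid.max() > 0 else grid

# Example: 300 subjects, all looking near the center of the screen
rng = np.random.default_rng(0)
points = rng.normal(0.5, 0.05, size=(300, 2)).clip(0, 1)
heat = frame_heatmap(points)
```

The resulting grid can be alpha-blended over the corresponding video frame; repeating this per frame at 30 fps yields the animated heat map shown in the dashboard.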
(The photo below shows a simple test being given to 32 people.)
Viewers’ visual focus lags slightly behind events on the screen, a delay caused by the limits of human reaction speed. We can leverage this gap, using the points where viewers’ visual focus stops, for all types of applications.
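One simple way to find the points where visual focus stops is a velocity heuristic: treat any sufficiently long run of low gaze speed as a stopping point. The sketch below assumes a hypothetical gaze trace sampled at 120 Hz with normalized coordinates; the thresholds are illustrative and this is not SOOTH's actual algorithm.

```python
# Hypothetical gaze trace: list of (x, y) samples in [0, 1], sampled at `hz`.
def find_fixations(samples, hz=120, speed_thresh=0.02, min_dur_s=0.1):
    """Return (start_index, end_index) pairs of low-velocity runs."""
    min_len = int(min_dur_s * hz)
    fixations, start = [], None
    for i in range(1, len(samples)):
        dx = samples[i][0] - samples[i - 1][0]
        dy = samples[i][1] - samples[i - 1][1]
        slow = (dx * dx + dy * dy) ** 0.5 < speed_thresh
        if slow and start is None:
            start = i - 1          # a low-speed run begins
        elif not slow and start is not None:
            if i - start >= min_len:
                fixations.append((start, i))
            start = None
    if start is not None and len(samples) - start >= min_len:
        fixations.append((start, len(samples)))
    return fixations

# Synthetic trace: hold at one point, jump quickly, hold at another point
trace = [(0.2, 0.2)] * 30
trace += [(0.2 + 0.12 * k, 0.2 + 0.12 * k) for k in range(1, 6)]
trace += [(0.8, 0.8)] * 30
fixations = find_fixations(trace)
```

With the synthetic trace above, the function reports two stopping points, one before and one after the rapid jump.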
The motion and speed of eye movement while reading on-screen text can also be confirmed.
This makes it possible to display both the video and eye movement log by selecting, for example, the “instant at which concentration levels are highest,” calculated based on an EEG log.
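Mapping an EEG-derived peak back to the video is essentially a time-alignment step. The sketch below assumes a hypothetical "concentration" score stream sampled at 10 Hz and a 30 fps video, both starting at t = 0; the field names and rates are illustrative, not SOOTH's actual format.

```python
# Hypothetical synchronized logs: EEG concentration scores at 10 Hz,
# video at 30 fps, both starting at t = 0.
def peak_concentration_frame(eeg_scores, eeg_hz=10, video_fps=30):
    """Map the time of the highest EEG concentration score to a video frame."""
    peak_idx = max(range(len(eeg_scores)), key=lambda i: eeg_scores[i])
    t = peak_idx / eeg_hz        # seconds into the recording
    return int(t * video_fps)    # corresponding video frame number

scores = [0.3, 0.4, 0.9, 0.5, 0.2]  # peak at sample 2 -> t = 0.2 s
frame = peak_concentration_frame(scores)
```

Once the frame number is known, the dashboard can seek the video and its eye-movement heat map to that exact instant.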
SOOTH is focused not only on electrical responses, such as EKG and EEG output, but also on these more physical reactions. The company is also working broadly to learn more about psychology and behavior in response to environments and stimuli, with the goal of establishing predictive models of psychology and behavior.