Eye tracking: what do we look at when we are looking?

Learning about eye-tracking

by Alexia Revueltas Rouz, PhD student (Centre for Research in Digital Education)

What do we look at when we are looking? When do we stop looking? What drives our gaze from one place to another?

Eye movements are small and not easily perceived. They are strongly related to visual attention, to the degree that moving your eyes to a different location will shift your focus of attention (although it is possible to shift your attention without moving your eyes).

Eye tracking is a technique used to measure exactly where your eyes are focusing when you look at something. The assumption behind it is that visual attention drives eye movements; what this looks like depends on the type of activity being researched. In neuropsychology it has been used to achieve a better understanding of cognition. For example, it could be a search task in which your gaze is directed toward something, like looking for the odd shape out amongst a pool of similar shapes, or an activity where you just watch a video or an image, which can show which salient parts are perceived and how they draw attention. Research has also shown that eye movements sometimes guide an action: the eyes may move to a new position before the hands follow and start an action in that same place, like looking from the knife towards the peanut butter jar before reaching to open it while making a sandwich. In other words, eye tracking records exactly where, when and for how long we look at something.

What you measure with eye tracking depends on whether you have a specific area of interest (AOI) or are exploring where the gaze falls. Common measures include:

Fixations: Where the eyes are looking.

Fixation duration: How long they look.

Time to first fixation: How long it took, from the beginning of the task, to fixate the gaze on something.

Saccades (when the image is static): The movements of the eyes from one fixation to another.

Smooth pursuit (dynamic videos): The eyes' trajectory when following a moving object.

Scan-path: The sequence of fixations, in what order and where the eyes moved.
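To make these measures concrete, here is a toy Python sketch (not any real tracker's software; the data format and AOI are invented for illustration) showing how fixation durations, time to first fixation on an AOI, and a scan-path could be derived from a list of fixations:

```python
# Toy sketch: each fixation is recorded as (x, y, start_s, end_s),
# with positions in pixels and times in seconds (hypothetical data).
fixations = [
    (120, 340, 0.00, 0.25),   # fixation 1
    (410, 300, 0.28, 0.71),   # fixation 2 (inside the AOI below)
    (415, 310, 0.74, 1.02),   # fixation 3 (also inside the AOI)
]

# An area of interest (AOI) as a rectangle: (left, top, right, bottom).
aoi = (400, 280, 500, 360)

def in_aoi(x, y, box):
    left, top, right, bottom = box
    return left <= x <= right and top <= y <= bottom

# Fixation durations: how long the eyes dwelt at each location.
durations = [end - start for (_, _, start, end) in fixations]

# Time to first fixation on the AOI, measured from the start of the task.
ttff = next(start for (x, y, start, _) in fixations if in_aoi(x, y, aoi))

# Scan-path: the ordered sequence of fixated positions.
scan_path = [(x, y) for (x, y, _, _) in fixations]

print(durations)
print(ttff)        # 0.28 in this toy data: fixation 2 is the first in the AOI
print(scan_path)
```

Real analysis software (such as Tobii's) computes these and many more measures after first detecting fixations in the raw gaze samples, but the underlying quantities are the ones listed above.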

The technique works by directing a near-infrared light at the eye, which reflects off the cornea; a camera then follows your pupil and the reflection on your cornea and records how your eyes are moving. This is called PCCR (Pupil Centre Corneal Reflection).

A vector is calculated from the angle between the corneal reflection and the pupil centre. This is done while you perform a task, and when the analysis is carried out, the image or video of whatever you were watching is overlaid with the recorded gaze pattern. The technique is completely harmless and not intrusive in any way.
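The pupil-glint vector at the heart of PCCR can be illustrated with a very simplified sketch (real systems use a calibrated 3D eye model; the coordinates here are invented):

```python
# Simplified illustration of the PCCR idea: gaze is estimated from the
# vector between the pupil centre and the corneal reflection ("glint")
# as seen in the infrared camera image.
import math

pupil_centre = (312.0, 248.0)   # hypothetical image coordinates, in pixels
glint = (305.0, 256.0)          # corneal reflection of the infrared light

# The pupil-glint vector; after calibration, its length and angle are
# mapped to where on the screen (or in the scene) the person is looking.
dx = pupil_centre[0] - glint[0]
dy = pupil_centre[1] - glint[1]
angle = math.degrees(math.atan2(dy, dx))
magnitude = math.hypot(dx, dy)

print(f"pupil-glint vector: ({dx}, {dy}), angle {angle:.1f} deg, length {magnitude:.1f} px")
```

The key point is that the glint stays roughly fixed as the eye rotates while the pupil centre moves, so the vector between them changes with gaze direction.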

There are two types of eye-tracking techniques: screen-based and wearable eye trackers (eye-tracking glasses). Which one to use depends on the aim of the research. Glasses are used mainly in real-world scenarios and have lower data quality than screen-based trackers; however, they allow exploration of visual attention in a real context, and a major field of application is consumer choices in supermarkets. The screen-based method is used for any task that can be reproduced on a computer, because the tracker sits at the bottom of the monitor and runs like a second screen. These eye trackers can be used at all ages, from infants to older adults, something that glasses can't do yet.

Screen-based eye tracker

Mobile eye tracker

One type of output from eye-tracking research is the heat map. Heat maps show a gradient of colours (from green to red) in which the warmer colour represents a higher number of gaze fixations: either how many people fixated a certain part of the image/video, or how many times a single person looked at that part.

Video relevant to heat maps: https://www.youtube.com/watch?v=iDQlMrhyjtw
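The idea behind a heat map can be sketched in a few lines: each fixation adds to a count in a grid laid over the image, and the counts are mapped to the green-to-red gradient. This is only a minimal illustration with made-up numbers, not how any particular software does it:

```python
# Minimal heat-map sketch: accumulate fixation counts on a coarse grid.
GRID_W, GRID_H = 4, 3        # a very coarse grid for the example
CELL = 100                   # each cell covers 100x100 pixels

fixations = [(150, 120), (160, 130), (155, 125), (350, 40)]  # (x, y) in pixels

grid = [[0] * GRID_W for _ in range(GRID_H)]
for x, y in fixations:
    grid[min(y // CELL, GRID_H - 1)][min(x // CELL, GRID_W - 1)] += 1

hottest = max(count for row in grid for count in row)
for row in grid:
    # Cells closer to `hottest` would be drawn redder; cooler cells, greener.
    print(" ".join(f"{count}/{hottest}" for count in row))
```

In practice the fixation counts (or durations) are smoothed before colouring, which is what gives heat maps their characteristic soft blobs.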

My research objectives include (1) using mobile eye tracking to explore where children look when they take part in a hands-on science exhibit and (2) examining whether they look at the relevant areas of the exhibit. Could it be that the parts of an exhibit where children are more visually active are related to higher interaction with them? My study aims to provide insight into visual attention in a real-world scenario, a science centre, and into where this attention is directed or drawn while children interact with the exhibits. This is similar to work done before by Walker, Bucker, Anderson, Schreij and Theeuwes (2017) in an art museum with older children, which found that adults and children look at paintings in different ways; that is, they are not attracted to the same features of the paintings. This is also relevant to our Move2Learn project because it could provide further insight into which parts of hands-on exhibits (and which gestures) could be relevant when it comes to early science learning.

Heat maps of adults and children looking at the same Van Gogh painting (from Walker et al., 2017)

It was because of this that I recently took a train trip to Kingston University to take part in a workshop about eye tracking with Dr Jo van Herwegen (on the eve of the snow blizzard, a.k.a. the Beast from the East, and completely worth doing!). She is an expert in this field, particularly when it comes to working with children and with both types of eye trackers. She has worked with typically developing children and with clinical populations, mainly Autism Spectrum Disorder and Williams syndrome; both populations show different gaze patterns compared to typically developing ones.

Thanks to that, I got to work with a Tobii eye tracker, learned how to start setting up an experiment, and got a rough idea of how to analyse images and videos by setting up areas of interest (the Tobii software makes it so easy!). One of the best bits of advice I took from her workshop, something not commonly spoken about, is to think about when it is really relevant to use eye tracking, particularly mobile eye tracking (which, despite looking pretty, can be difficult and time-consuming to analyse). There are other, similar research methods that can serve as a very good proxy: for example, a head-mounted camera, or a video recorded by one person wearing a mobile eye tracker in a real scenario which is afterwards shown to participants on a screen-based eye tracker. There are many different approaches when it comes to researching eye movements. The most important part is having the research question clearly defined and answering why it is important to use this technique.

Currently we are working on a pilot study to track whether children look at gestures when an adult explains a mathematical concept in a video, and whether they afterwards use similar gestures depending on their gaze pattern and how much they observed.

For more information about eye tracking:




Walker F, Bucker B, Anderson NC, Schreij D, Theeuwes J (2017) Looking at paintings in the Vincent Van Gogh Museum: Eye movement patterns of children and adults. PLoS ONE 12(6): e0178912. https://doi.org/10.1371/journal.pone.0178912