Image-Based Tracking and 3D Content Generation by Jonathan Dyssel Stets

Eye tracking has gone through major improvements over the past decade, which has broadened its range of applications considerably.

Both industry and researchers have gained interest in analyzing what we look at in everyday-life situations, and that is now possible using mobile eye trackers. A mobile eye tracker is a pair of glasses with a forward-facing camera that records the scene the wearer is looking at, together with the gaze position, i.e. where in the scene the wearer is looking. Tracking and marking areas of interest (AOIs) in the recorded scene is currently done manually, a process that is both slow and costly.

This project investigates methods for tracking AOIs through empirical and methodological studies. The goal is to contribute to an implementation of an automatic or semi-automatic tracking algorithm that can help analysts track AOIs in eye-tracking videos.
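As a purely illustrative sketch of what such semi-automatic tracking could look like (not the project's actual method), the example below assumes an analyst marks an AOI with a bounding box in the first frame; the box is then followed through the scene video with plain template matching, and each gaze sample is tested against the tracked box. The function name, the use of OpenCV's matchTemplate, and the frame/gaze data layout are all assumptions made for the example.

```python
import cv2


def track_aoi(frames, init_box, gaze_points):
    """Follow one analyst-marked AOI through a scene video by template
    matching, and report in which frames the gaze falls inside it.

    frames:      list of grayscale scene-camera images (numpy arrays)
    init_box:    (x, y, w, h) of the AOI in the first frame
    gaze_points: one (gx, gy) pixel position per frame
    """
    x, y, w, h = init_box
    template = frames[0][y:y + h, x:x + w]   # appearance of the marked AOI
    hits = []
    for frame, (gx, gy) in zip(frames, gaze_points):
        # Find the position in the current frame where the AOI template
        # matches best (normalized cross-correlation score).
        score = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, (bx, by) = cv2.minMaxLoc(score)
        inside = bx <= gx <= bx + w and by <= gy <= by + h
        hits.append(((bx, by, w, h), inside))
    return hits   # tracked AOI box and gaze-hit flag for every frame
```

A practical system would need to handle appearance changes, occlusion, and re-detection when the AOI leaves the field of view, which is where the project's empirical and methodological studies come in.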

The second part of the project is to enable easy creation of 3D models of the scenes used in eye-tracking studies. This is to be done via optical methods such as multiple-view stereo or structured light systems. By combining the AOI tracking data, the eye-tracking data, and the 3D models, what the person is looking at can be visualized, for instance as 3D attention heat maps. This Ph.D. is part of the Innovationsfonden project “Eye Tracking Research Platform”, in collaboration with the commercial partner iMotions A/S and Copenhagen Business School.
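To make the 3D attention heat map idea concrete, here is a minimal sketch, assuming that per-frame camera poses are known from the 3D reconstruction (e.g. from multiple-view stereo) and that gaze positions are given in scene-camera pixels. The ray/triangle test is the standard Möller–Trumbore algorithm; all function names and data layouts are hypothetical, not part of the project.

```python
import numpy as np


def gaze_ray(K, R, t, gaze_px):
    """Back-project a 2D gaze point (pixels) into a world-space ray.
    K: 3x3 intrinsics; R, t: world-to-camera rotation and translation."""
    d_cam = np.linalg.inv(K) @ np.array([gaze_px[0], gaze_px[1], 1.0])
    origin = -R.T @ t                      # camera centre in world coordinates
    direction = R.T @ d_cam
    return origin, direction / np.linalg.norm(direction)


def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Möller–Trumbore ray/triangle intersection; returns hit distance or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = e1 @ p
    if abs(det) < eps:
        return None
    inv = 1.0 / det
    s = origin - v0
    u = (s @ p) * inv
    if u < 0 or u > 1:
        return None
    q = np.cross(s, e1)
    v = (direction @ q) * inv
    if v < 0 or u + v > 1:
        return None
    d = (e2 @ q) * inv
    return d if d > eps else None


def accumulate_heatmap(samples, triangles):
    """Count gaze hits per triangle of the reconstructed scene mesh.
    samples:   one (K, R, t, gaze_px) tuple per eye-tracking sample
    triangles: list of (v0, v1, v2) vertex triples (numpy arrays)"""
    counts = np.zeros(len(triangles))
    for K, R, t, gaze_px in samples:
        o, d = gaze_ray(K, R, t, gaze_px)
        hits = [(ray_triangle(o, d, *tri), i) for i, tri in enumerate(triangles)]
        hits = [(dist, i) for dist, i in hits if dist is not None]
        if hits:
            counts[min(hits)[1]] += 1      # nearest surface the gaze ray meets
    return counts                          # normalise and colour-map for display
```

The brute-force loop over all triangles is only for clarity; a real pipeline would use a spatial acceleration structure and smooth the per-triangle counts before rendering the heat map on the model.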

Effective start/end date 01/11/2014 → 15/08/2018

Published as PhD report: Visual Human-Computer Interaction

Supervisors: Henrik Aanæs, Rasmus Larsen

Section for Image Analysis & Computer Graphics

Contact

Rasmus Larsen
Provost
Rektoratet
+45 45 25 10 10