
Completed project

 

VisualEyes - Head and eye behaviour measurement and visualisation in simulators

Introduction

Driving and operating a vehicle is to a great extent a visual task. In driver behaviour studies it is therefore important to be able to measure where the driver is looking. Today this can be done unobtrusively and remotely in real time with camera-based eye tracking. The most common remote eye tracking systems use multiple cameras in order to give satisfactory results, but promising results using only one camera have recently emerged on the market. Further, in order to study and understand driver behaviour, it would be very beneficial to see the driver's visual behaviour superimposed on the actual environment.

Project objectives

The VisualEyes project had three main goals:

1. Evaluation of the influence of factors such as wearing glasses and participants' age on gaze tracking system performance, for a one-camera and a three-camera system.

2. Development of a self-initialized visual attention detection module based on a one-camera system.

3. Development of a real-time visualization system for gaze direction.

Results

Gaze tracking system performance

Data from a one-camera and a three-camera eye tracker were acquired and analysed in terms of availability, accuracy and precision. Both availability and accuracy were found to be affected by several factors, the most important being the number of cameras used and the angular distance from straight ahead. Both the one-camera and the three-camera system showed a high degree of accuracy and availability straight ahead, but with increasing distance from the central region the results deteriorated, more so for the one-camera system. Interestingly, no significant effects of wearing glasses were found, either on availability or on accuracy. There was, however, an interaction effect between distance and glasses.
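
To make these metrics concrete, the sketch below shows how availability, accuracy and precision could be computed from logged gaze samples. It assumes a hypothetical sample format with a validity flag and gaze angles, and uses common working definitions (availability as the share of valid samples, accuracy as the mean angular offset from a known target direction, precision as the dispersion of that offset); the exact definitions used in the ViP analysis may differ.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical gaze sample: validity flag and gaze angles in degrees.
    record GazeSample(bool Valid, double YawDeg, double PitchDeg);

    static class GazeMetrics
    {
        // Availability: share of samples where the tracker delivered valid data.
        public static double Availability(IList<GazeSample> samples) =>
            samples.Count(s => s.Valid) / (double)samples.Count;

        // Accuracy: mean angular offset between measured gaze and the known target.
        public static double AccuracyDeg(IList<GazeSample> samples,
                                         double targetYawDeg, double targetPitchDeg) =>
            samples.Where(s => s.Valid)
                   .Average(s => Offset(s.YawDeg, s.PitchDeg, targetYawDeg, targetPitchDeg));

        // Precision: standard deviation of the offsets (sample dispersion).
        public static double PrecisionDeg(IList<GazeSample> samples,
                                          double targetYawDeg, double targetPitchDeg)
        {
            var offsets = samples.Where(s => s.Valid)
                                 .Select(s => Offset(s.YawDeg, s.PitchDeg,
                                                     targetYawDeg, targetPitchDeg))
                                 .ToList();
            double mean = offsets.Average();
            return Math.Sqrt(offsets.Average(o => (o - mean) * (o - mean)));
        }

        // Small-angle approximation of the angular distance between two directions.
        static double Offset(double yaw, double pitch, double tYaw, double tPitch) =>
            Math.Sqrt(Math.Pow(yaw - tYaw, 2) + Math.Pow(pitch - tPitch, 2));
    }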

 

Visual attention detection

A one-camera system located in front of the driver is used to detect when the driver is visually inattentive, by measuring head position, head direction and gaze direction. Statistics of gaze data from driving on a straight road are used to automatically (no calibration needed) create a reference to the surroundings. Inattention is defined as the driver's gaze diverting from a defined "Visual Field Relevant for Driving" for a certain period of time. The modelling of this time buffer allows the driver to be visually inattentive for a limited time; a sketch of the buffer idea is given below. The software is written in the C# programming language. In driving simulator studies the visual attention detection can be used to monitor drivers' attention status and, by feeding back information to the simulator control, to trigger events during attentive and inattentive states, respectively.
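
The following is a minimal sketch of the time-buffer logic described above, written in C# like the actual module. The field boundaries, buffer capacity and depletion rule are illustrative assumptions, not the parameters of the real detection module.

    using System;

    // Minimal sketch of a time-buffer inattention detector (illustrative parameters).
    class AttentionMonitor
    {
        // "Visual Field Relevant for Driving", here simplified to a yaw/pitch box (degrees).
        const double MaxYawDeg = 20.0, MaxPitchDeg = 15.0;

        const double BufferCapacityS = 2.0;   // how long gaze may stay outside the field
        double _buffer = BufferCapacityS;

        // Call once per gaze sample; dt is the sample interval in seconds.
        // Returns true while the driver is judged visually inattentive.
        public bool Update(double gazeYawDeg, double gazePitchDeg, double dt)
        {
            bool insideField = Math.Abs(gazeYawDeg) <= MaxYawDeg &&
                               Math.Abs(gazePitchDeg) <= MaxPitchDeg;

            if (insideField)
                _buffer = Math.Min(BufferCapacityS, _buffer + dt); // replenish
            else
                _buffer = Math.Max(0.0, _buffer - dt);             // deplete

            return _buffer <= 0.0; // empty buffer => inattention detected
        }
    }

In a simulator study, the boolean returned by Update could be logged per sample or fed back to the simulator control to trigger events during attentive and inattentive states.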

More information: Henrik Bergström, Pixcode AB

 

eyeVis - visualization of gaze direction

With the eyeVis tool, the driver's gaze is superimposed on a video of the front scenery and visualized as a cross or a gaze trail. Regions of interest (ROIs) can be added for analysis and statistical purposes, e.g. the percentage of time spent in a certain ROI. Sections of a test drive can be selected for more detailed analysis (e.g. a certain time frame in which the driver is performing a specified task). To make eyeVis genuinely useful, it is designed so that the logging system of the simulator automatically creates an XML file together with the AVI movie of the drive. The XML file contains information about the objects in the scenery and about how the cabin moves with respect to the projection screens. The object information is used to automatically generate the ROIs. Thus, a car driving in front of the simulator vehicle can be chosen as an ROI that follows the car over time and disappears when the car drives out of sight. The cabin information is used to compensate for the fact that the eye-tracking cameras are mounted in the cabin and move with it. So far, the eyeVis tool is not realized for real-time operation.
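
As an illustration of the kind of ROI statistics mentioned above, the sketch below computes the percentage of frames in which the gaze point falls inside a per-frame ROI read from an XML log. The element and attribute names (Frame, GazeX, Roi, and so on) are hypothetical placeholders; the actual eyeVis XML schema is not documented here.

    using System;
    using System.Linq;
    using System.Xml.Linq;

    // Sketch of an eyeVis-style analysis step: share of frames where the gaze
    // point falls inside a (possibly moving) region of interest.
    class RoiStatistics
    {
        public static double PercentTimeInRoi(string xmlPath)
        {
            var doc = XDocument.Load(xmlPath);
            var frames = doc.Descendants("Frame").ToList();

            int hits = frames.Count(f =>
            {
                // Gaze point in screen coordinates (pixels), per frame.
                double gx = (double)f.Attribute("GazeX");
                double gy = (double)f.Attribute("GazeY");

                // ROI rectangle for this frame, e.g. the lead vehicle's bounding box.
                var roi = f.Element("Roi");
                if (roi == null) return false; // object not visible in this frame

                double x = (double)roi.Attribute("X"), y = (double)roi.Attribute("Y");
                double w = (double)roi.Attribute("Width"), h = (double)roi.Attribute("Height");

                return gx >= x && gx <= x + w && gy >= y && gy <= y + h;
            });

            return frames.Count == 0 ? 0.0 : 100.0 * hits / frames.Count;
        }
    }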

More information: Martin Pettersson, Smart Eye AB

 

Conclusions

Advantages of a one-camera system are that it is cheaper, easier to operate and easier to install in a vehicle. A multi-camera system will, on the other hand, provide higher availability and accuracy for areas far from the road centre. A one-camera system is thus mostly suitable for in-vehicle applications, such as systems that warn drivers of sleepiness or distraction, while multi-camera solutions are preferable for research purposes.

The visual attention module is suitable for monitoring the driver's visual attention status, which can be logged or used to control scenarios in simulator studies.

The visualization tool is useful for purposes such as demonstrations and understanding drivers' visual behaviour, and can also be used for compiling statistics on gaze behaviour.

 

Project manager: Arne Nåbo, Saab, +46 (0)520 780 40

Project partners: Saab, SmartEye, VTI

Contractors: Pixcode AB and the Skaraborg Institute, sub-contractors to SmartEye

 

Reference: Ahlström, C., et al. (2010). Performance of a one-camera and a three-camera system. ViP publication 2010-1, Linköping: VTI, Sweden.

 

 

Project plan

 

VisualEyes - Head and eye behaviour measurement and visualisation in simulators

 

The VisualEyes project has two main aims: evaluation of gaze tracking system performance of a one-camera and a multi-camera system and development of a real-time visualization system for gaze direction.


Workpackage 1: In a pilot study, a one-camera gaze tracking system will be evaluated by measuring drivers' head and eye movements. About 50 people with varying facial characteristics will form the test group in the experiment. A reference system with two cameras will be used as ground truth. The results will form the basis for the design of a validation test covering the driver population, with the purpose of developing guidelines that help researchers in their choice of camera system, and software for real-time detection of visual attention.

 

Workpackage 2: A tool for visualisation of drivers' gaze behaviour in driving simulators will be developed. This will enable following the drivers' visual behaviour in the surrounding scenario and inside the car.

 

Project manager: Arne Nåbo, Saab, +46 (0)520 780 40 

Project partners: Saab, SmartEye and VTI

Contractors: Pixcode and Skaraborg Institute, sub-contractors to SmartEye

Project period: January 2009 - September 2009

 

