IEEE Access (Jan 2019)

Training a Camera to Perform Long-Distance Eye Tracking by Another Eye-Tracker

  • Wenyu Li,
  • Qinglin Dong,
  • Hao Jia,
  • Shijie Zhao,
  • Yongchen Wang,
  • Li Xie,
  • Qiang Pan,
  • Feng Duan,
  • Tianming Liu

DOI
https://doi.org/10.1109/ACCESS.2019.2949150
Journal volume & issue
Vol. 7
pp. 155313 – 155324

Abstract

Appearance-based gaze estimation techniques have advanced greatly in recent years. However, previous studies using a single camera for appearance-based gaze estimation have been limited to short distances. In addition, labeling training samples has been a time-consuming and user-unfriendly step in previous appearance-based gaze estimation studies. To bridge these significant gaps, this paper presents a new long-distance gaze estimation paradigm: training a camera to perform eye tracking by another eye tracker, named the Learning-based Single Camera eye tracker (LSC eye-tracker). In the training stage, the LSC eye-tracker simultaneously acquires gaze data from a commercial trainer eye tracker and face appearance images from a long-distance trainee camera, based on which deep convolutional neural network (CNN) models learn the mapping from appearance images to gazes. In the application stage, the LSC eye-tracker works alone, predicting gazes from the appearance images acquired by the single camera using the trained CNN models. Our experimental results show that the LSC eye-tracker enables both population-based and personalized eye tracking with promising accuracy and performance.
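A key step in the training stage described above is pairing each camera frame with a gaze label produced by the trainer eye tracker, since the two devices sample on independent clocks. As a minimal sketch (not the authors' implementation — the function name, data layout, and `max_gap` tolerance are illustrative assumptions), nearest-timestamp matching might look like:

```python
import bisect

def label_frames_with_gaze(frame_times, gaze_samples, max_gap=0.02):
    """Pair each trainee-camera frame with the nearest-in-time gaze sample.

    frame_times: sorted frame timestamps in seconds
    gaze_samples: sorted list of (timestamp, (gx, gy)) from the trainer eye tracker
    max_gap: hypothetical tolerance; pairs with a larger time offset are discarded
    Returns a list of (frame_index, (gx, gy)) training pairs.
    """
    gaze_times = [t for t, _ in gaze_samples]
    labeled = []
    for i, ft in enumerate(frame_times):
        j = bisect.bisect_left(gaze_times, ft)
        # Candidates: the gaze sample just before and just after the frame time.
        best = None
        for k in (j - 1, j):
            if 0 <= k < len(gaze_times):
                gap = abs(gaze_times[k] - ft)
                if best is None or gap < best[0]:
                    best = (gap, k)
        if best is not None and best[0] <= max_gap:
            labeled.append((i, gaze_samples[best[1]][1]))
    return labeled
```

The resulting (appearance image, gaze point) pairs would then serve as supervised training data for the CNN regression model; frames with no sufficiently close gaze sample are simply dropped rather than mislabeled.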

Keywords