3D gaze estimation in the scene volume with a head-mounted eye tracker

Author(s):
Elmadjian, Carlos ; Shukla, Pushkar ; Tula, Antonio Diaz ; Morimoto, Carlos H. ; Spencer, SN
Total Authors: 5
Document type: Journal article
Source: Communication by Gaze Interaction (COGAIN 2018); 9 pp.; 2018.
Abstract

Most applications involving gaze-based interaction rely on estimation techniques that map gaze data to targets on a 2D surface. In Virtual Reality and Augmented Reality (AR) environments, however, interaction occurs mostly in a volumetric space, which poses a challenge to such techniques. Accurate point-of-regard (PoR) estimation is of particular importance to AR applications, since most known setups are prone to parallax error and target ambiguity. In this work, we expose the limitations of widely used techniques for PoR estimation in 3D and propose a new calibration procedure using an uncalibrated head-mounted binocular eye tracker coupled with an RGB-D camera to track 3D gaze within the scene volume. We conducted a study to evaluate our setup on real-world data using a geometric and an appearance-based method. Our results show that accurate estimation in this setting is still a challenge, though some gaze-based interaction techniques in 3D should be possible.
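The abstract does not detail the geometric method, but a common baseline for binocular 3D PoR estimation is vergence-based triangulation: cast one gaze ray per eye and take the midpoint of the shortest segment between the two rays. The sketch below illustrates that idea only; the function name and the fallback for near-parallel rays are assumptions, not the authors' procedure.

```python
import numpy as np

def por_from_binocular_rays(o_l, d_l, o_r, d_r):
    """Estimate a 3D point of regard (PoR) as the midpoint of the
    shortest segment between the left and right gaze rays.
    Each ray is p(t) = o + t * d; directions are normalized here.
    Hypothetical baseline, not the paper's calibration procedure."""
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    w = o_l - o_r                      # vector between eye centers
    a = d_l @ d_l                      # = 1 after normalization
    b = d_l @ d_r
    c = d_r @ d_r                      # = 1 after normalization
    d = d_l @ w
    e = d_r @ w
    denom = a * c - b * b              # ~0 when rays are parallel
    if abs(denom) < 1e-9:
        # Near-parallel rays (gaze at infinity): return a distant
        # point along the left ray as an assumed fallback.
        return o_l + 1000.0 * d_l
    t_l = (b * e - c * d) / denom      # closest-point parameter, left ray
    t_r = (a * e - b * d) / denom      # closest-point parameter, right ray
    p_l = o_l + t_l * d_l
    p_r = o_r + t_r * d_r
    return 0.5 * (p_l + p_r)           # midpoint of the shortest segment
```

With noise-free rays from eyes 6 cm apart fixating a target 1 m ahead, the midpoint recovers the target exactly; with real eye-tracker noise the two rays are skew, which is one reason the paper reports that accurate estimation in the scene volume remains hard.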

FAPESP's process: 17/06933-0 - Augmented cognition through wearable computing
Grantee: Carlos Eduardo Leão Elmadjian
Support Opportunities: Scholarships abroad - Research Internship - Doctorate (Direct)
FAPESP's process: 16/10148-3 - Gaze-Based Interaction in Wearable Computing: an interface for the Internet of Things
Grantee: Carlos Hitoshi Morimoto
Support Opportunities: Regular Research Grants