GazeBar: Exploiting the Midas Touch in Gaze Interaction

Author(s):
Elmadjian, Carlos; Morimoto, Carlos H.
Total Authors: 2
Document type: Conference paper
Source: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21); 7 pp., 2021.
Abstract

Imagine an application that requires constant configuration changes, such as modifying the brush type in a drawing application. Typically, options are hierarchically organized in menu bars that the user must navigate, sometimes through several levels, to select the desired mode. An alternative to reduce hand motion is the use of multimodal techniques such as gaze-touch, which combine gaze pointing with mechanical selection. In this paper, we introduce GazeBar, a novel multimodal gaze interaction technique that uses gaze paths as a combined pointing and selection mechanism. The idea behind GazeBar is to maximize the interaction flow by reducing "safety" mechanisms (such as clicking) under certain circumstances. We present GazeBar's design and demonstrate it using a digital drawing application prototype. Advantages and disadvantages of GazeBar are discussed based on a user performance model. (AU)
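The abstract describes selection by gaze path alone, with no click or dwell confirmation. The paper does not include source code, so the following Python sketch is only a minimal illustration of that idea, assuming a horizontal bar of rectangular mode items and a stream of (x, y) gaze samples; all names, coordinates, and the Item / select_by_gaze_path helpers are hypothetical, not the authors' implementation.

# Hypothetical sketch (not from the paper): sweeping the gaze through a
# bar item commits the mode change immediately, with no click or dwell,
# deliberately exploiting the Midas touch instead of guarding against it.
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        # Axis-aligned hit test for one rectangular bar item.
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def select_by_gaze_path(path, items):
    # Return the last item the gaze path passed through, or None.
    selected = None
    for x, y in path:
        for item in items:
            if item.contains(x, y):
                selected = item
    return selected

# Usage: a gaze sweep across an assumed bar of drawing modes.
bar = [
    Item("pencil", 0, 0, 50, 30),
    Item("brush", 50, 0, 100, 30),
    Item("eraser", 100, 0, 150, 30),
]
gaze_path = [(10, 15), (40, 14), (70, 16), (75, 18)]  # eye-tracker samples
choice = select_by_gaze_path(gaze_path, bar)
print(choice.name if choice else "no selection")  # -> "brush"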

FAPESP's process: 15/26802-1 - Augmented cognition using wearable computing
Grantee: Carlos Eduardo Leão Elmadjian
Support Opportunities: Scholarships in Brazil - Doctorate (Direct)
FAPESP's process: 16/10148-3 - Gaze-Based Interaction in Wearable Computing: an interface for the Internet of Things
Grantee: Carlos Hitoshi Morimoto
Support Opportunities: Regular Research Grants