GazeBar: Exploiting the Midas Touch in Gaze Interaction

Author(s):
Elmadjian, Carlos ; Morimoto, Carlos H. ; ACM
Total number of authors: 3
Document type: Scientific article
Source: EXTENDED ABSTRACTS OF THE 2021 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI '21); 7 pp., 2021.
Abstract

Imagine an application that requires constant configuration changes, such as modifying the brush type in a drawing application. Typically, options are hierarchically organized in menu bars that the user must navigate, sometimes through several levels, to select the desired mode. An alternative to reduce hand motion is the use of multimodal techniques such as gaze-touch, which combines gaze pointing with mechanical selection. In this paper, we introduce GazeBar, a novel multimodal gaze interaction technique that uses gaze paths as a combined pointing and selection mechanism. The idea behind GazeBar is to maximize the interaction flow by reducing "safety" mechanisms (such as clicking) under certain circumstances. We present GazeBar's design and demonstrate it using a digital drawing application prototype. Advantages and disadvantages of GazeBar are discussed based on a user performance model. (AU)
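To make the general idea concrete, the following is a minimal illustrative sketch, not the authors' implementation: it assumes a toolbar divided into tool slots and a stream of gaze samples, and it switches the active tool as soon as the gaze path crosses a slot, with no confirming click or dwell. All names (GazeBarSketch, Tool, on_gaze_sample) are hypothetical.

```python
# Hypothetical sketch of gaze-path mode selection (not the paper's code):
# crossing a tool's slot on the bar selects that tool without a click.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Tool:
    name: str
    x_min: float  # horizontal extent of the tool's slot on the bar
    x_max: float

class GazeBarSketch:
    def __init__(self, tools: List[Tool], bar_top: float, bar_bottom: float):
        self.tools = tools
        self.bar_top = bar_top        # vertical extent of the bar region
        self.bar_bottom = bar_bottom
        self.active: Optional[Tool] = None

    def on_gaze_sample(self, x: float, y: float) -> Optional[Tool]:
        """Update the active tool when the gaze path passes over a slot."""
        if not (self.bar_top <= y <= self.bar_bottom):
            return self.active  # gaze is on the canvas; keep the current tool
        for tool in self.tools:
            if tool.x_min <= x <= tool.x_max:
                self.active = tool  # selection happens by gaze alone
                break
        return self.active

# Usage: glancing over the "eraser" slot switches modes without any click.
bar = GazeBarSketch([Tool("brush", 0, 50), Tool("eraser", 50, 100)],
                    bar_top=0, bar_bottom=20)
bar.on_gaze_sample(60, 10)
print(bar.active.name)  # -> "eraser"
```

In this toy version every glance over the bar changes the mode, which is exactly the Midas touch risk the paper discusses; the trade-off of removing such "safety" mechanisms is what the authors analyze with their user performance model.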

FAPESP Process: 15/26802-1 - Augmented cognition through wearable platforms
Grantee: Carlos Eduardo Leão Elmadjian
Support type: Scholarships in Brazil - Direct Doctorate
FAPESP Process: 16/10148-3 - Gaze interaction in wearable computing: an interface for the Internet of Things
Grantee: Carlos Hitoshi Morimoto
Support type: Research Grants - Regular