RIS ID

22866

Publication Details

This paper was originally published as: Meers, S. & Ward, K., Haptic gaze-tracking based perception of graphical user interfaces, 11th International Conference on Information Visualisation, Zurich, Switzerland, 4-6 July 2007.

Abstract

This paper presents a novel human-computer interface that enables the computer display to be perceived without any use of the eyes. Our system works by tracking the user's head position and orientation to obtain their 'gaze' point on a virtual screen, and by indicating to the user what object is present at the gaze location via haptic feedback to the fingers and synthetic speech or Braille text. This is achieved by using the haptic vibration frequency delivered to the fingers to indicate the type of screen object at the gaze position, and the vibration amplitude to indicate the screen object's window layer when the object is contained in overlapping windows. Objects that are gazed at momentarily also have their names output to the user via a Braille display or synthetic speech. Our experiments have shown that by browsing over the screen and receiving haptic and voice (or Braille) feedback in this manner, the user is able to acquire a mental two-dimensional representation of the virtual screen and its content without any use of the eyes. This form of blind screen perception can then be used to locate screen objects and controls and to manipulate them with the mouse or via gaze control. We provide experimental results demonstrating how this form of blind screen perception can be used effectively to exercise point-and-click and drag-and-drop control of desktop objects and open windows, using either the mouse or the user's head pose, without any use of the eyes.
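As an illustration of the feedback mapping described in the abstract, the sketch below shows one way the gaze-to-haptic loop could be structured: vibration frequency encodes the object type, amplitude encodes the window layer, and a brief dwell triggers speech or Braille output of the object's name. The object types, frequency and amplitude values, dwell threshold, and callback names are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the gaze-to-haptic mapping described in the abstract.
# All specific values and callback names here are assumptions for illustration.

from dataclasses import dataclass
import time

# Assumed frequency (Hz) per object type and amplitude step per window layer.
TYPE_FREQUENCY_HZ = {"icon": 50.0, "button": 100.0, "text": 150.0, "window_border": 250.0}
LAYER_AMPLITUDE = {0: 1.0, 1: 0.6, 2: 0.3}   # topmost window vibrates strongest
DWELL_SECONDS = 0.8                           # dwell time before announcing a name

@dataclass
class ScreenObject:
    name: str
    kind: str        # e.g. "icon", "button", "text", "window_border"
    layer: int       # 0 = topmost window

def haptic_feedback(obj):
    """Return (frequency_hz, amplitude) for the object under the gaze point."""
    if obj is None:
        return 0.0, 0.0                       # no vibration over empty desktop
    freq = TYPE_FREQUENCY_HZ.get(obj.kind, 75.0)
    amp = LAYER_AMPLITUDE.get(obj.layer, 0.2)
    return freq, amp

def browse(get_gaze_object, send_vibration, announce):
    """Poll the head-pose gaze point; vibrate per object type/layer, announce on dwell."""
    dwell_target, dwell_start = None, 0.0
    while True:
        obj = get_gaze_object()               # object under the current gaze point
        send_vibration(*haptic_feedback(obj))
        if obj is not dwell_target:
            dwell_target, dwell_start = obj, time.monotonic()
        elif obj is not None and time.monotonic() - dwell_start > DWELL_SECONDS:
            announce(obj.name)                # speech or Braille output
            dwell_start = float("inf")        # announce only once per dwell
        time.sleep(0.02)
```

In this sketch the head tracker, vibration hardware, and speech/Braille output are abstracted as the callbacks get_gaze_object, send_vibration and announce, so the mapping logic is independent of any particular device.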

Grant Number

ARC/DP0881180
