Gaze-Based Assistive Technologies

Thies Pfeiffer
© 2014 | Pages: 20
DOI: 10.4018/978-1-4666-4438-0.ch004

Abstract

The eyes play an important role in both perception and communication. Technical interfaces that make use of their versatility can bring significant improvements to those who are unable to speak or to perform selection tasks by other means, such as with their hands, feet, or nose, or with tools held in the mouth. Entering text into a computer system with the eyes, called gaze-typing, is the most prominent gaze-based assistive technology. The article reviews the principles of eye movements, presents an overview of current eye-tracking systems, and discusses several approaches to gaze-typing. With the recent advent of mobile eye-tracking systems, gaze-based assistive technology is no longer restricted to interaction with desktop computers and is ready to expand into other areas of everyday life. The second part of the article therefore discusses the use of gaze-based assistive technology in the household and "in the wild," outside one's own four walls.
Chapter Preview

Introduction

How exciting is the first eye contact between a newborn baby and its parents. How overwhelming the moment when the eyes start exploring the world and head movements follow the eyes. And how momentous the effect once the baby's eyes can follow the attentive gaze of the parents (Corkum & Moore, 1998; Hood, Willen & Driver, 1998) or find the target beyond a pointing finger (Butterworth & Itakura, 2000).

Eyes are a powerful device for communication: they tell a lot about us, our intentions, what we are talking about, whom we are talking to, and they even reveal parts of our emotions. They are part of the multimodal communicative ensemble that is the human body.

For some, however, the eyes are also the one and only gate to the outside world. We humans are often very good at reading the intentions of others in a situated manner just by observing their eyes. If someone we care for is gazing at a glass of water that is out of his reach, we infer that he might be thirsty, offer our help, and give him something to drink. If we talk to someone unable to speak or move, we can establish a pact and tailor our questions in such a way that our interlocutor can answer them using eye blinks (e.g., one blink for no, two for yes) or eye movements (up/down or left/right).

This chapter addresses the question of how gaze-based assistive technologies enable us to use our eyes for purposes beyond their natural sensory function. It will show that today it is already possible to talk with our eyes and even to write letters. Someday we will also be able to interact with our (technically enhanced) physical environment based on eye gaze, and first visionary steps in that direction will be presented.

Gaze-Based Interaction and the Midas-Touch Problem

A crucial task in interaction is the selection of the object to interact with. A successful selection requires aiming at a target and then triggering the selection (see, e.g., Huckauf & Urbina, 2008). While aiming at a specific target is a task par excellence for gaze-based interaction, actually triggering the selection proves to be more difficult.
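To make the two steps concrete, the following minimal Python sketch covers only the aiming step, i.e., mapping the current gaze point to the on-screen target it falls on; the trigger is deliberately left to a separate mechanism (see the dwell-time sketch below). The Target class and the key layout are illustrative assumptions, not taken from the chapter.

    from dataclasses import dataclass

    @dataclass
    class Target:
        """An on-screen element that can be selected by gaze (hypothetical)."""
        name: str
        x: float       # top-left corner, in screen pixels
        y: float
        width: float
        height: float

    def aim(gaze_x, gaze_y, targets):
        """Aiming step: return the target the gaze point falls on, if any.
        Triggering the selection is a separate, second step."""
        for t in targets:
            if t.x <= gaze_x <= t.x + t.width and t.y <= gaze_y <= t.y + t.height:
                return t
        return None

    # Hypothetical layout of two keys of a virtual keyboard.
    keys = [Target("A", 100, 500, 80, 80), Target("B", 200, 500, 80, 80)]
    print(aim(230, 540, keys).name)  # -> "B": the gaze currently rests on key "B"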

The problem of providing a robust yet swift technique to trigger a selection is common to many gaze-based applications. The eye is predominantly a sensory organ which, in gaze-based interaction, is now used in an articulatory way. For articulation we want high and exclusive control over the modality, so that we are not misunderstood. Gaze, however, wanders over the scene while we process it and is highly attracted to changes in the environment. Users might thus look at a certain key on the screen because they want to type, but they might also just accidentally look there, e.g., while listening to a response, or because the key labels of a virtual keyboard switched from lower case to upper case after the "shift" key was triggered. Other visual changes might, for example, be caused by an intelligent algorithm that rearranges the display to present the keys most likely to be selected next at a prominent position.

The problem of unintentionally triggering reactions has been known as the Midas-Touch problem since a prominent paper by R.J.K. Jacob (1993).
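A common countermeasure, sketched here rather than quoted from the chapter, is dwell-time selection: a key is triggered only after the gaze has rested on it for a minimum duration, so that brief, unintentional glances have no effect. In the following minimal Python sketch, the class name, the 0.8-second threshold, and the reset-after-trigger behavior are illustrative assumptions.

    class DwellSelector:
        """Triggers a selection only after gaze has rested on one and the
        same target for a minimum dwell time, so that stray glances do
        not select anything (a classic Midas-Touch countermeasure)."""

        def __init__(self, threshold=0.8):  # dwell time in seconds (assumed value)
            self.threshold = threshold
            self.current = None   # target the gaze currently rests on
            self.since = 0.0      # timestamp when that target was first hit

        def update(self, target, timestamp):
            """Feed one gaze sample (fixated target id or None, time in seconds).
            Returns the target id once a dwell completes, else None."""
            if target != self.current:
                # Gaze moved to another element: restart the dwell timer.
                self.current, self.since = target, timestamp
                return None
            if target is not None and timestamp - self.since >= self.threshold:
                # Dwell complete; require a look-away before re-triggering.
                self.current = None
                return target
            return None

    selector = DwellSelector()
    # Gaze samples (fixated key, timestamp): "H" is held long enough to select.
    for key, t in [("H", 0.00), ("H", 0.40), ("H", 0.85), ("E", 0.90)]:
        if (selected := selector.update(key, t)):
            print("selected:", selected)  # fires exactly once, for "H"

The threshold trades speed against robustness: shorter dwell times speed up typing but increase the risk of accidental selections.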
