Blind User Interfacing: Requirements, Models and a Framework

Fernando Alonso, José Fuertes, Ángel González, Loïc Martínez
DOI: 10.4018/978-1-60566-206-0.ch013

Abstract

There are specific usability requirements for developing dual interfaces, that is, graphical user interfaces that are also adapted for blind users. These include task adequacy, dimensional trade-off, behavior equivalence, semantic loss avoidance and device-independence. Consequently, the development of human-computer interfaces based on task, domain, dialog, presentation, platform and user models has to be modified to take into account these requirements. This paper presents the requirements for blind user interfacing, the changes to be made to the human-computer interface models and a framework that improves the development of dual user interfaces. The framework includes a set of guidelines for interface design, a toolkit for the low effort implementation of dual user interfaces, and a programming library for the inclusion of speech and Braille in applications. A case study of the development of one such dual interface application is also presented.
Chapter Preview

Background

The development of user interfaces for blind people has taken two complementary research and development paths: (1) adapting existing visual interfaces for blind users and (2) building dual interfaces or interfaces especially adapted for blind or visually impaired users.

The adaptation of visual interfaces for blind people emerged in response to the proliferation of direct-manipulation graphical environments and the resulting need to enable people with impaired vision to use common software. The pioneering projects on this subject include OutSpoken (Edwards, 1991) and Mercator (Mynatt and Edwards, 1992a), which aimed to adapt MacOS and X-Windows, the most widely used graphical environments in the early 1990s. These projects encouraged the development of the Off Screen Model (Mynatt and Edwards, 1992b), the adaptation model on which most of the tools available for adapting interfaces for blind users are based. Orca provides access to applications and toolkits that support AT-SPI, such as the GNOME desktop (GNOME, 2009); VoiceOver gives blind people access to Mac OS X systems (Apple, 2009); and Hal Screen Reader (Dolphin Computer Access, 2009), JAWS (Freedom Scientific BLV Group, 2008) and Window-Eyes (GW Micro, 2009) give non-sighted users access to Windows environments.
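To make the role of the Off Screen Model more concrete, the following sketch shows one way such a parallel, non-visual representation of the interface could be structured and rendered as speech. It is only an illustration written in Python for this preview, not code from the chapter or from any of the screen readers mentioned above; the OffScreenNode class and the speak function are hypothetical names.

from dataclasses import dataclass, field
from typing import List


@dataclass
class OffScreenNode:
    # A non-visual description of one widget in the graphical interface.
    role: str                                   # e.g. "window", "button", "text"
    name: str                                   # accessible name or label
    value: str = ""                             # current value or state, if any
    children: List["OffScreenNode"] = field(default_factory=list)


def speak(node: OffScreenNode, depth: int = 0) -> None:
    # Traverse the off-screen model and "speak" each node. A real screen
    # reader would pass the utterance to a speech synthesizer or a Braille
    # display; here it is simply printed.
    utterance = f"{node.role} {node.name}"
    if node.value:
        utterance += f", {node.value}"
    print("  " * depth + utterance)
    for child in node.children:
        speak(child, depth + 1)


# A tiny off-screen model mirroring a "Save file" dialog.
model = OffScreenNode("window", "Save file", children=[
    OffScreenNode("text", "File name", value="report.txt"),
    OffScreenNode("button", "Save"),
])
speak(model)

The essential point, as in the tools listed above, is that the non-visual rendering works from a model of the interface rather than from the pixels on the screen.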

Key Terms in this Chapter

User Model: Description of the characteristics of an individual user or of a stereotype user group. The user model is not intended to be a description of the mental state of a user.

Domain Model: Model that describes the objects and operations of the application domain that the user can view, access and manipulate through the user interface.

Presentation Model: Model that describes the perceptual appearance of the user interface.

Task Model: A formal description of the services the user accesses. It is organized hierarchically and contains information regarding task activation, preconditions, postconditions, and the actual task action (a brief sketch of such a structure follows these key terms).

User Interfaces for All: The principle of ensuring accessibility at design time and of meeting the individual needs, abilities and preferences of the user population at large, including blind people and other disabled and elderly users.

Usability: The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.

HCI Modeling: A method for developing user interfaces. One key HCI modeling method is based on six models: task, domain, dialog, presentation, platform and user.

Platform Model: Model that contains information about the capabilities, constraints and limitations of the target platform.

Dual User Interface: A graphical user interface that can also accommodate blind users.

Dialog Model: Model that defines the commands as well as the purpose of each one. Commands are executed via interaction techniques and may produce one or more system responses.
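As a rough illustration of how the task model described above can be represented, the following Python sketch encodes a hierarchical task with activation, preconditions, postconditions and an action. The Task class and its run method are hypothetical names invented for this example; they are not part of the chapter's framework or toolkit.

from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Task:
    # A node in a hierarchical task model: activation condition,
    # preconditions, postconditions and the actual task action.
    name: str
    activation: Callable[[], bool] = lambda: True      # when the task becomes available
    preconditions: List[Callable[[], bool]] = field(default_factory=list)
    postconditions: List[Callable[[], bool]] = field(default_factory=list)
    action: Callable[[], None] = lambda: None          # the actual task action
    subtasks: List["Task"] = field(default_factory=list)

    def run(self) -> None:
        # Execute the task and its subtasks, checking conditions on the way.
        if not self.activation():
            return
        if not all(check() for check in self.preconditions):
            raise RuntimeError(f"Preconditions of '{self.name}' not met")
        self.action()
        for subtask in self.subtasks:
            subtask.run()
        if not all(check() for check in self.postconditions):
            raise RuntimeError(f"Postconditions of '{self.name}' not met")


# Example: a two-level task hierarchy for opening and reading a document.
open_doc = Task(
    name="Open document",
    action=lambda: print("Choosing a document..."),
    subtasks=[Task(name="Read contents", action=lambda: print("Reading contents..."))],
)
open_doc.run()

Representing the conditions as callables is just one possible encoding; the point is the hierarchical structure with explicit activation, pre- and postconditions attached to each task.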
