Where does accessibility plug into the graphical desktop stack?

Samuel Thibault

Slides & stuff on http://brl.thefreecat.org/

http://liberte0.org/

Outline

Gnuplot

Gnuplot output with red and green curves

Color blindness: about 8% of males, 0.5% of females

What is accessibility?

AKA a11y

Usable by people with specific needs

See Accessibility HOWTOs

“Handicap” depends on the situation

and is not necessarily permanent

Why make GUIs accessible?

(when text mode seems so much easier to make accessible)

Dedicated software?

Design principles

Status in a few words

Story of an 'a'

Input

This figure shows the 0x1e scancode emitted by the keyboard, transformed into KEY_A=30 by atkbd, transferred through the input core to evdev, passed as an event to the X server, whose input-evdev driver turns it into 38=30+8, which the server core passes to the X client

Still a keycode

i.e. physical position
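
A minimal sketch of the kernel side of this path: reading the raw evdev events that the X server's input driver consumes. The device node /dev/input/event3 is a placeholder for whatever node your keyboard got (see /proc/bus/input/devices), and reading it usually needs root.

```c
/* Sketch: read raw evdev events from a keyboard device node, as the X
 * server's input-evdev driver does.  /dev/input/event3 is a placeholder. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

int main(void)
{
    struct input_event ev;
    int fd = open("/dev/input/event3", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    while (read(fd, &ev, sizeof(ev)) == sizeof(ev))
        if (ev.type == EV_KEY)
            /* For the 'a' key, ev.code is KEY_A == 30. */
            printf("keycode %u %s\n", (unsigned) ev.code,
                   ev.value ? "pressed" : "released");

    close(fd);
    return 0;
}
```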

Input

This figure shows that the KeyPress(38) event is handled by the toolkit main loop, which uses XKB to turn it into a KeySym(XK_a), passed to the toolkit input, which turns it into its own input_signal(a) representation, passed to the toolkit widget

XKB handles turning the keycode into a keysym, i.e. what is printed on the key cap

Widget eventually has some behavior, e.g. append to text
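
What the toolkit main loop does with that keycode can be sketched with raw Xlib rather than a real toolkit: ask XKB which keysym the current layout assigns to it (keycode 38 is 'a' on a usual PC layout).

```c
/* Sketch: the server delivers a keycode, and XKB tells us which keysym the
 * current layout maps it to.  Build with -lX11. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/XKBlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    Window w = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                   0, 0, 200, 100, 0, 0, 0);
    XSelectInput(dpy, w, KeyPressMask);
    XMapWindow(dpy, w);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == KeyPress) {
            /* Group 0, shift level 0: the plain symbol of the key cap. */
            KeySym sym = XkbKeycodeToKeysym(dpy, ev.xkey.keycode, 0, 0);
            const char *name = XKeysymToString(sym);
            printf("keycode %u -> keysym %s\n",
                   ev.xkey.keycode, name ? name : "NoSymbol");
        }
    }
}
```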

Output

This shows that the widget passes its text to a text-rendering library such as Pango to get a pixmap, which is passed to the X server, whose video driver pushes it to the video card, eventually shown on the screen.

Pixmap very early!

Not necessarily a screen, actually...
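
To illustrate how early the text becomes pixels, here is a sketch using Pango and cairo directly, rendering into a client-side image surface (a real toolkit renders into its window surface instead) and dumping the result to a PNG.

```c
/* Sketch: hand the text to Pango and get pixels back, long before anything
 * reaches the X server.  Build with `pkg-config --cflags --libs pangocairo`. */
#include <cairo.h>
#include <pango/pangocairo.h>

int main(void)
{
    cairo_surface_t *surface =
        cairo_image_surface_create(CAIRO_FORMAT_ARGB32, 100, 50);
    cairo_t *cr = cairo_create(surface);

    PangoLayout *layout = pango_cairo_create_layout(cr);
    pango_layout_set_text(layout, "a", -1);
    pango_cairo_show_layout(cr, layout);   /* the 'a' is now just pixels */

    cairo_surface_write_to_png(surface, "a.png");

    g_object_unref(layout);
    cairo_destroy(cr);
    cairo_surface_destroy(surface);
    return 0;
}
```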

Accessibility in input

Versatility FTW!

Some people can only use one kind of input device

Keyboard layouts

AccessX

Basically fine-tuning of the keyboard behavior: sticky keys, slow keys, bounce keys, repeat keys, mouse keys
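
Under the hood these are plain XKB controls that any client can toggle. A sketch enabling StickyKeys and SlowKeys on the core keyboard; the associated delays and timeouts would be tuned with XkbGetControls/XkbSetControls.

```c
/* Sketch: flip AccessX features, which are just XKB controls.
 * Build with -lX11. */
#include <X11/Xlib.h>
#include <X11/XKBlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    unsigned int mask = XkbStickyKeysMask | XkbSlowKeysMask;
    XkbChangeEnabledControls(dpy, XkbUseCoreKbd, mask, mask);

    XCloseDisplay(dpy);   /* flushes the request */
    return 0;
}
```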

Virtual keyboard

This shows the xvkbd window, which is essentially an on-screen keyboard

Virtual keyboard

This shows the whole input stack, and an additional arrow shows that xvkbd injects events between the X server and the X client

XTest injection
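
A sketch of what xvkbd does when one of its keys is clicked: ask the XTest extension to fake a press and release of the corresponding keycode inside the X server, which then delivers it to the focused client like any real keystroke. Keycode 38 is assumed to be 'a'.

```c
/* Sketch: fake a key press/release through XTest.
 * Build with -lX11 -lXtst. */
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    unsigned int keycode = 38;                              /* 'a' assumed */
    XTestFakeKeyEvent(dpy, keycode, True, CurrentTime);     /* press   */
    XTestFakeKeyEvent(dpy, keycode, False, CurrentTime);    /* release */
    XFlush(dpy);

    XCloseDisplay(dpy);
    return 0;
}
```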

Braille keyboards

This shows the whole input stack, plus brltty which feeds uinput with evdev events

Some braille devices have a classical PC keyboard
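
A sketch of the uinput path brltty uses for such keyboards: create a virtual input device and inject ordinary evdev key events into the kernel, which then flow through the regular input stack. It needs access to /dev/uinput and a kernel providing UI_DEV_SETUP; the device name is made up.

```c
/* Sketch: a virtual keyboard through /dev/uinput injecting evdev events. */
#include <fcntl.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/uinput.h>

static void emit(int fd, int type, int code, int value)
{
    struct input_event ev;
    memset(&ev, 0, sizeof(ev));
    ev.type = type; ev.code = code; ev.value = value;
    write(fd, &ev, sizeof(ev));
}

int main(void)
{
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0) return 1;

    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, KEY_A);

    struct uinput_setup usetup;
    memset(&usetup, 0, sizeof(usetup));
    usetup.id.bustype = BUS_VIRTUAL;
    strcpy(usetup.name, "virtual braille keyboard");
    ioctl(fd, UI_DEV_SETUP, &usetup);
    ioctl(fd, UI_DEV_CREATE);

    sleep(1);                             /* let userspace open the device */
    emit(fd, EV_KEY, KEY_A, 1);           /* press 'a' (KEY_A == 30) */
    emit(fd, EV_SYN, SYN_REPORT, 0);
    emit(fd, EV_KEY, KEY_A, 0);           /* release */
    emit(fd, EV_SYN, SYN_REPORT, 0);

    ioctl(fd, UI_DEV_DESTROY);
    close(fd);
    return 0;
}
```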

Braille keyboards

Others have a braille keyboard

Braille keyboards

But now we have a keysym, not a keycode

This shows the whole input stack, with the brltty daemon trying (and failing) to insert a keysym into the X server, and xbrlapi trying (and failing) to insert a keysym into the X client, and eventually xbrlapi using XKB to turn 'a' into KeyPress(38), and push that to the X client.
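
A sketch of the xbrlapi side: brltty hands over a keysym (the character typed in braille), so we ask Xlib/XKB which keycode the current layout maps to it before faking it with XTest. Keysyms that need modifiers, or that are not in the layout at all, need more work than this.

```c
/* Sketch: keysym -> keycode lookup, then XTest injection.
 * Build with -lX11 -lXtst. */
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    KeySym sym = XStringToKeysym("a");          /* XK_a */
    KeyCode code = XKeysymToKeycode(dpy, sym);  /* 38 on a usual PC layout */
    if (code != 0) {
        XTestFakeKeyEvent(dpy, code, True, CurrentTime);
        XTestFakeKeyEvent(dpy, code, False, CurrentTime);
        XFlush(dpy);
    }
    XCloseDisplay(dpy);
    return 0;
}
```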

PC Braille keyboard

Typing braille with the PC keyboard

PC keyboard with the asdf and jkl; keys highlighted as being braille keys.

PC Braille keyboard

Mere XKB layout + imLcFlt + XCompose

This shows the input stack within the X client, KeyPress(41) being translated by the layout into XK_braille_dot_1, which is filtered by imLcFlt into the XK_braille_dots_1 pattern, which is turned into XK_a by XCompose before being passed to the toolkit input engine.

Braille abbreviations

PC Braille keyboard

IBus daemon

This shows the X client input stack where the ibus module gets KeyPress(41), passes it to the ibus-sharada-braille daemon, which turns it into XK_a, passed to the toolkit input engine

How about Wayland?

Accessibility in output

Tinkering with the rendering

But for blind people?

And a lot of other accessibility possibilities

X accessibility, Mercator 1.0

Picture showing the relations between xedit, the X server and Mercator: text goes from xedit to the X server through Mercator

X accessibility, Mercator 1.0

Figure showing the relations between gedit, the X server and Mercator: now Mercator gets a pixmap, not text, because gedit uses GTK, which uses Pango to render fonts

Generic methodology

Figure showing the relations between the standard application - abstract representation - visual rendering, and the accessibility bus on which a registry and a screen reader can connect, with an eventual rendering on an accessibility device.

Story of an 'a', continued

This shows the X client output stack, whose toolkit widget emits a text-changed signal for ATK, which turns it into a text-changed message over the AT-SPI bus, received by Orca which renders it as braille or speech

But the screen reader also needs to read

i.e. to browse the application content

This shows Orca sending a get_text message over the AT-SPI bus, received by ATK, which calls get_text on the widget, which returns the text, which is passed by ATK over the AT-SPI bus to Orca, which renders it via braille or speech.
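
A sketch of that screen-reader side with libatspi (from at-spi2-core): subscribe to text-changed events on the AT-SPI bus, then read the widget's content back through its text interface, roughly what Orca does before speaking or brailling it.

```c
/* Sketch: listen for text changes and read the text back over AT-SPI.
 * Build with `pkg-config --cflags --libs atspi-2`. */
#include <stdio.h>
#include <atspi/atspi.h>

static void on_text_changed(AtspiEvent *event, void *user_data)
{
    AtspiText *text = atspi_accessible_get_text_iface(event->source);
    if (!text)
        return;

    /* The get_text call of the figure: fetch the widget's whole content. */
    gint len = atspi_text_get_character_count(text, NULL);
    gchar *content = atspi_text_get_text(text, 0, len, NULL);
    gchar *name = atspi_accessible_get_name(event->source, NULL);
    printf("'%s' now contains: %s\n",
           name ? name : "(unnamed)", content ? content : "");
    g_free(content);
    g_free(name);
    g_object_unref(text);
}

int main(void)
{
    atspi_init();

    AtspiEventListener *listener =
        atspi_event_listener_new(on_text_changed, NULL, NULL);
    atspi_event_listener_register(listener, "object:text-changed:insert", NULL);

    atspi_event_main();   /* dispatch AT-SPI events forever */
    return 0;
}
```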

Abstract representation

Technically speaking

In practice

First name: Foo

Last name: Bar

Password: baz

In practice

First column

- Label First Name

- Label Last Name

- Label Password

Second column

- Text Foo

- Text Bar

- Text baz

In practice

- Label First Name for Text Foo

- Label Last Name for Text Bar

- Label Password for Text baz
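
With GTK 3, declaring such a relation takes a few lines per label/field pair, and gtk_label_set_mnemonic_widget() normally sets it up for you. A sketch (the form and widget names are made up):

```c
/* Sketch: tell the accessibility layer which label describes which entry,
 * so a screen reader says "First name" when the entry gets focus.
 * Build with `pkg-config --cflags --libs gtk+-3.0`. */
#include <gtk/gtk.h>

static void relate(GtkWidget *label, GtkWidget *entry)
{
    AtkObject *a_label = gtk_widget_get_accessible(label);
    AtkObject *a_entry = gtk_widget_get_accessible(entry);
    atk_object_add_relationship(a_entry, ATK_RELATION_LABELLED_BY, a_label);
    atk_object_add_relationship(a_label, ATK_RELATION_LABEL_FOR, a_entry);
}

int main(int argc, char **argv)
{
    gtk_init(&argc, &argv);

    GtkWidget *win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    GtkWidget *grid = gtk_grid_new();
    GtkWidget *label = gtk_label_new("First name");
    GtkWidget *entry = gtk_entry_new();

    gtk_grid_attach(GTK_GRID(grid), label, 0, 0, 1, 1);
    gtk_grid_attach(GTK_GRID(grid), entry, 1, 0, 1, 1);
    gtk_container_add(GTK_CONTAINER(win), grid);
    relate(label, entry);

    g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);
    gtk_widget_show_all(win);
    gtk_main();
    return 0;
}
```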

Don't try to make applications accessible,

just make accessible applications

Quite often just a matter of common sense from the start

Not a reason for not fixing your existing apps of course, it will just be a bit harder :)

Graphical applications

Some pitfalls and advice

(from the accessibility howtos)

Test it yourself! (GUIs)

Accerciser

Accerciser screenshot

Check that the tree of widgets looks sane and is complete
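
For a quick command-line check, a sketch that walks the same AT-SPI tree Accerciser shows, printing every accessible with its role and name:

```c
/* Sketch: dump the AT-SPI widget tree of the whole desktop.
 * Build with `pkg-config --cflags --libs atspi-2`. */
#include <stdio.h>
#include <atspi/atspi.h>

static void dump(AtspiAccessible *obj, int depth)
{
    gchar *name = atspi_accessible_get_name(obj, NULL);
    gchar *role = atspi_accessible_get_role_name(obj, NULL);
    printf("%*s%s \"%s\"\n", 2 * depth, "",
           role ? role : "(unknown)", name ? name : "");
    g_free(name);
    g_free(role);

    gint n = atspi_accessible_get_child_count(obj, NULL);
    for (gint i = 0; i < n; i++) {
        AtspiAccessible *child =
            atspi_accessible_get_child_at_index(obj, i, NULL);
        if (child) {
            dump(child, depth + 1);
            g_object_unref(child);
        }
    }
}

int main(void)
{
    atspi_init();
    dump(atspi_get_desktop(0), 0);
    return 0;
}
```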

Documentation

Conclusion