This page is for LUG people who suddenly welcome e.g. a blind person in their community and don't know anything about accessibility.

To learn more about disabilities themselves, [http://tldp.org/HOWTO/Accessibility-HOWTO/ the accessibility HOWTO] has pretty good information.

For a lot of pointers to various projects, see the [http://larswiki.atrc.utoronto.ca/wiki LarsWiki].

General accessibility

Now, how is accessibility handled? The basic principle is that a daemon keeps reading the content displayed on the screen and sends it to a braille device, a speech synthesizer, a magnification engine, etc. That said, there are actually two or three main ways for disabled people to use a computer.
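To give a rough idea of that principle, here is a minimal sketch in Python (used for all examples on this page): poll some text source and forward whatever changed to the outputs. The read_screen and send_to_outputs functions are made-up placeholders; real readers such as BRLTTY or Orca are of course far more elaborate.

{{{#!python
import time

def read_screen():
    # Placeholder: a real reader would grab the console (/dev/vcsa*)
    # or query the desktop through AT-SPI.
    return time.strftime("It is now %H:%M:%S")

def send_to_outputs(text):
    # Placeholder: a real reader would talk to a braille display
    # and/or a speech synthesizer here.
    print("would braille/speak:", text)

previous = None
while True:
    current = read_screen()
    if current != previous:      # only announce what changed
        send_to_outputs(current)
        previous = current
    time.sleep(0.2)
}}}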

Good Old Text Mode

There are several screen readers (BRLTTY, Suseblinux, speakup, yasr, Brass, ...) that just get the content of the text console via /dev/vcsa*. The most widely used are BRLTTY, which supports a very wide range of braille displays, Suseblinux in the SuSE distribution, and speakup as a kernel patch for speech. The good thing is that text applications usually don't need to be modified, but they sometimes present information in a very non-accessible way (alsamixer shows audio levels vertically, for instance). There is a [http://brl.thefreecat.org/text-apps-a11y-test.html guide] for testing the accessibility of text applications. Sometimes they just need some configuration, see CategoryText.
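As a concrete illustration of the /dev/vcsa* interface (not of how any of the above readers is actually implemented), here is a small sketch that dumps the content of the first virtual console: the device starts with a four-byte header (lines, columns, cursor x, cursor y), followed by one character/attribute byte pair per screen cell. It needs read permission on /dev/vcsa1, and decoding as cp437 is just an assumption about the console font.

{{{#!python
# Dump the text content of virtual console 1 through /dev/vcsa1.
with open("/dev/vcsa1", "rb") as vcsa:
    rows, cols, cursor_x, cursor_y = vcsa.read(4)   # 4-byte header
    for row in range(rows):
        cells = vcsa.read(cols * 2)                 # (character, attribute) pairs
        text = cells[0::2].decode("cp437", errors="replace")
        print(text.rstrip())
print("cursor at row", cursor_y, "column", cursor_x)
}}}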

Graphical Mode

The AT-SPI facility recently made it possible to make graphical applications accessible. This requires modifying the toolkit used by the applications. For now, only GTK applications (hence the GNOME desktop) are accessible this way; Qt applications (hence the KDE desktop) should follow shortly in KDE4. More details about applications can be found in CategoryGraphical.
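For the curious, here is a hedged sketch of what listening to AT-SPI events looks like from Python, assuming the pyatspi bindings are installed and accessibility support is enabled in the GNOME session. It merely illustrates the facility; Orca itself is of course structured very differently.

{{{#!python
# Print every widget that receives the keyboard focus, via AT-SPI.
import pyatspi

def on_focus(event):
    if event.detail1:            # 1 = focus gained, 0 = focus lost
        acc = event.source       # the accessible object involved
        print("focus moved to:", acc.getRoleName(), "-", acc.name)

pyatspi.Registry.registerEventListener(on_focus, "object:state-changed:focused")
pyatspi.Registry.start()         # run the event loop until interrupted
}}}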

Integrated Mode

Emacs has its own readers, Emacspeak and speechd-el, and since "Emacs can do anything you want", this can be an all-in-one solution.

What is best?

As usual, there is no simple answer; it depends a lot on the person. For a developer, text mode will probably be quite fine. Otherwise, graphical mode will probably be the easiest, since that is what people are used to nowadays. The Emacspeak solution is a bit peculiar in that it is an "all in one" solution, but it works quite neatly; the only problem is that "only" Emacs becomes accessible.

Distributions

Distributions have various levels of accessibility. In a few words and alphabetical order:

Software

On this wiki (see CategoryCategory), you can find a bunch of applications that we know to be accessible. As said above, text applications are usually accessible, and GNOME applications are becoming more and more accessible (see http://live.gnome.org/Orca/AccessibleApps for the current status).

Hardware

The list of braille devices supported by BRLTTY (and hence by Orca and LSR, since these connect to BRLTTY via BrlAPI) can be found in its [http://www.mielke.cc/brltty/doc/README.txt README] file. I don't know about Suseblinux. USB devices are usually detected automatically (which generally triggers the text mode of installation CDs). Serial devices sometimes are too, but that is often considered unsafe, so they need to be configured by hand at boot by appending a parameter to the kernel command line, as explained on [http://mielke.cc/brltty/download.html#braillified].
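BrlAPI also comes with Python bindings, which make it easy to drive a braille display from scripts. The following hedged sketch assumes the python-brlapi bindings are installed and a BRLTTY daemon is running: it takes control of the display, writes a message and waits for a key press.

{{{#!python
# Write a message on the braille display through BrlAPI.
import brlapi

b = brlapi.Connection()       # connect to the local BRLTTY daemon
b.enterTtyMode()              # take control of the braille display
b.writeText("Hello from BrlAPI, press a braille key...")
key = b.readKey()             # blocks until a key is pressed on the display
print("got key code", key)
b.leaveTtyMode()              # give the display back to BRLTTY
b.closeConnection()
}}}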

Speakup supports the Accent SA, Accent PC, Apollo II, Audapter, Braille 'n' Speak, DecTalk Express, DecTalk External, DecTalk PC, DoubleTalk, Keynote Gold PC, LiteTalk/DoubleTalk LT, Speakout and Transport hardware speech synthesis boards, as well as software speech synthesis.

Software speech synthesizers are quite varied; the [http://larswiki.atrc.utoronto.ca/wiki/LinuxUnixAccessibilitySoftware#Speech LarsWiki] enumerates them. English speech is usually quite good, other languages less so, so your mileage may vary. Now, to make a screen reader X use a speech synthesizer Y, either X already knows how to drive Y, or you can use Speech Dispatcher, which X hopefully knows how to drive and for which Y hopefully has a backend.
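As an illustration of that last point, here is a hedged sketch of driving Speech Dispatcher directly, assuming the speechd Python bindings shipped with Speech Dispatcher and a running speech-dispatcher daemon; screen readers essentially do the same thing through the SSIP protocol.

{{{#!python
# Speak a sentence through Speech Dispatcher.
import time
import speechd

client = speechd.SSIPClient("lug-demo")   # arbitrary client name
client.set_language("en")
client.set_rate(0)                        # -100 (slowest) .. +100 (fastest)
client.speak("Hello from Speech Dispatcher")
time.sleep(3)                             # crude: let the message finish
client.close()
}}}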

Magnification

For users with low vision, magnification can be useful.

On the Linux console, this can be done by using svgatextmode or by using the stty tool (see page linux).

The X server itself used to provide a way to achieve magnification: pressing ctrl-alt-plus / ctrl-alt-minus cycles between video modes. This is however limited to the capabilities of the video board (and the device driver). If you are lucky, 320x240 (~4x magnification) or even 160x120 (~8x magnification) is available.

Newer X servers do not provide this shortcut anymore; it has to be set up by hand, see Xorg.

The gnome desktop provides gnome-mag, driven by Orca, which magnifies up to 32x (not a hard limit), either:

How does it compare to Windows?

Text mode is neatly accessible, and it turns out that one can do nearly everything from the console (except browsing JavaScript or Flash sites, or using OpenOffice.org, roughly).

Now, braille & speech accessibility of the gnome desktop is not yet as good as what JAWS and WindowEyes achieve on Windows. But it gets better and better everyday, and what we have now is already fairly good. Now, some points deserve attention:

Other kinds of accessibility issues

Blind people are not the only ones who need software help. Quadriplegics may, for instance, use Dasher in order to type by just pointing their eyes at letters; the GNOME Onscreen Keyboard may be used to quickly select menu elements, etc.
