Project X Description

Assist ASCLA (the Association of Specialized and Cooperative Library Agencies, a division of ALA) in planning and promoting the release of an "accessibility toolkit": fifteen pamphlets covering different aspects of accessibility, plus an Electronic Accessibility Checklist.

The subjects of the pamphlets are:

1) Developmental Disabilities

2) Learning Disabilities

3) Management

4) Physical Disabilities

5) Children

6) Autism & Spectrum Disorders

7) Mental Illness

8) Service Animals

9) Volunteers

10) Deaf & Hard of Hearing

11) Assistive Technology

12) Multiple Disabilities

13) Staff

14) Vision

15) Trustees

Expected Outcomes

The project team will analyze the audiences for the toolkit and checklist and design a promotion plan to ensure wide awareness and use of these materials. The promotion plan will take the form of a report that includes recommendations for ASCLA as well as templates and examples of promotional materials.

The project team will also create a poster promoting the materials to the "Emerging Leaders" audience at the June 27 workshop.

Project X Planning

Our Audience

Identify audiences and how to best reach them...

Promotion Plan

Recommendations and examples...

Poster

Poster ideas here...

Project X Meetings

The team will meet virtually (Google Chat) once a month, at 2:00 Eastern on the third Friday of the month:

January 18

February 15

March 21

April 18

May 16

June 20

Our Group Strengths

Michael brings initiative and an IT background; his institution is supportive, so he can flourish and assist the process.
Alan brings a fresh perspective on technology and good analytical skills.
Annelise brings the ability to see the big picture and an IT background, and is a facilitator of communication in groups.
Olivia has a design background and web experience, and is very good with details and follow-through.
Adri brings diverse work experience and is working on a similar process for her own website.

Bibliography

Articles

Brajnik, Giorgio. "Web Accessibility Testing: When the Method is the Culprit." Computers Helping People with Special Needs (2006): 156-163.
The article proposes accessibility testing using a method known as the barrier walkthrough (BW), in which evaluators identify a number of barrier scenarios defined by user characteristics, settings, goals, and possibly tasks. At a minimum, the categories should include blind users of screen readers, low-vision users of screen magnifiers, motor-disabled users of a standard keyboard and/or mouse, deaf users, and cognitively disabled users (with reading and learning disabilities and/or attention deficit disorders). In a scientific comparison with checklist-based conformance testing (CT), Brajnik found BW to be more valid and useful than CT. An additional benefit of the method is that evaluators become more familiar with accessibility and assistive technology while working with the BW test subjects.

Lazar, Jonathan, Alfreda Dudley-Sponaugle, and Kisha-Dawn Greenidge. "Improving Web Accessibility: A Study of Webmaster Perceptions." Computers in Human Behavior (2004): 269-288.
The article surveyed webmasters from different types of organizations about their knowledge of accessibility and their perceptions of when and why sites should or should not be accessible. 74% of respondents said they were familiar with Section 508 and similar laws; 1% were not; 8% were unsure; and the rest did not answer the question. Approximately 25% of the webmasters said their sites were subject to federal accessibility rules; 17% were not sure whether they were. 79% were aware of the existence of software that tests sites for accessibility. 69% of webmasters had used a web-based accessibility tool, while 22% had used a non-web tool. 39% indicated that they had tested their sites using screen readers. 59% said that their organization was planning to improve accessibility; 21% said no improvements were planned; and 17% were not sure. 64% said they were familiar with WCAG. Webmasters noted that challenges to accessibility included balancing accessibility with design, convincing clients and management that accessibility is important, technical challenges, lack of funding to address accessibility, the need for training, and the need for better software tools. A large number of respondents said that webmasters and programmers should be responsible for accessibility, though the disability compliance office and the help desk manager were also listed; most noted that accessibility is not a solo effort. Most also said that legal mandates were their strongest motivation for accessibility, and that they at least consider accessibility when they update their sites. Overall, the study found that most webmasters support accessibility but do not feel they have the resources to create a fully accessible site. A few webmasters object to accessibility guidelines and consider them interference in their web design.

Mankoff, Jennifer, Holly Fait, and Tu Tran. "Is Your Web Page Accessible? A Comparative Study of Methods for Assessing Web Page Accessibility for the Blind." CHI 2005, Papers: Web Interactions (2005): 41-50.
The authors compared four methods of accessibility testing: Expert Review (expert web developers examining the sites with W3C Web Content Accessibility Guidelines 1.0); Screen Reader (expert web developers examining the sites with a monitor and a screen reader); Automated (Bobby 4.0); and Remote (review by blind users testing the sites in their own homes with their own screen reader set up). These methods were compared with a baseline test of accessibility done in a lab with blind participants and screen readers. They found that participants in the screen reader group were the most consistent in finding both WCAG and empirical problems. Overall, the automated condition performed especially poorly, though the authors believe the results may have improved had they studied how developers interpret the results rather than the results themselves. No single evaluator or tool could find a high percentage of accessibility problems, but multiple evaluators, working independently and using a screen reader, did find a high percentage of accessibility problems. The remote user study might be more effective if it is modified to encourage better self-reporting.
Key points:

No one way of accessibility testing finds every accessibility issue. Multiple tests with multiple subjects should be run.

Automated accessibility testing is the least useful.

Experts testing sites with both a screen reader and a monitor simultaneously seem to find the most accessibility problems.

Lab testing with blind users can be costly and time-consuming, often more so than web designers can manage.

Remote testing where users test sites using their own accessibility technology can be useful, but strong self-reporting mechanisms need to be in place.

Providenti, Michael, and Robert Zai III. "Web Accessibility at Academic Libraries: Standards, Legislation, and Enforcement." Library Hi Tech (2007): 494-508.
The authors review legislation that may require academic libraries to create accessible websites. Section 508 of the Rehabilitation Act mandates accessibility for federal websites, but the authors argue that Section 504 of the Rehabilitation Act, which calls for "effective communication," is also a mandate for web accessibility, as is the ADA's Title II, which makes it illegal to deny people with disabilities equal access to the goods, services, facilities, privileges, advantages, or accommodations of all public and private universities and colleges, including their libraries. The Department of Education's Office for Civil Rights is responsible for enforcing Section 504 and Title II and can facilitate an informal agreement between the complainant and the school or initiate a full investigation. Unsuccessful complainants can file suit against the college or university in federal court independent of the OCR findings.
Key Points

They compare WCAG to the indexing and cataloging standards libraries already use, which "ensure that materials in the library can be made accessible to all, not because someone might complain or to accommodate a particular group of people."

Does ALA receive federal funds?

Stringer, Elizabeth C., Yeliz Yesilada, and Simon Harper. "Experiments Towards Web 2.0 Accessibility." Proceedings of the 18th Conference on Hypertext and Hypermedia (2007).
Summary of a conference paper. It notes that changes to dynamically updating pages (such as those using AJAX, XForms, or IFrames) are not registered by screen readers while the user is working on those pages, because the URLs do not change. The authors hypothesized that by monitoring network traffic, updates can be identified even when they are missed by screen readers. They ran an experiment using Wireshark as their network analyzer, Guide as their screen reader, and Yahoo Mail Beta as their dynamic webpage. They found that the network analyzer consistently outperformed the screen reader in detecting updates to the page. When the screen reader did detect updates, they were inserted at the end of the page. They also found that the screen reader identified objects that were not visible on the screen and thus not available for action until activated.
Key point:

Many web 2.0 sites and technologies are not yet accessible because screen readers cannot properly identify updates to the page.
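The failure mode can be sketched in a few lines of markup (the element and function names here are invented for illustration): a script rewrites part of the page in place, the URL never changes, and the screen reader gets no cue that anything happened.

```html
<div id="inbox">No new messages</div>
<script type="text/javascript">
  // An AJAX callback might update the page like this; because the
  // document is rewritten in place and the URL never changes, a
  // screen reader has no signal that new content has arrived.
  function showNewMail(count) {
    document.getElementById("inbox").innerHTML = count + " new message(s)";
  }
</script>
```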

ALA background literature

UserWorks, Inc. (Jimmy Vu, Dustin Chambers, Elizabeth Buie, and Dick Horst). Usability Assessment Report: American Library Association Website. Submitted to the American Library Association, December 6, 2006.
In the accessibility review, an automated accessibility evaluation tool (Bobby) was used to identify areas of the site to evaluate manually. 30,000 links and their associated web pages were sampled, and the site was examined with JavaScript both on and off. The tests were conducted on machines running Windows XP (with Internet Explorer and Firefox) and on Macintosh machines (running Firefox and Safari). Most of the pages were found to be "fairly accessible": Bobby found many potential violations, but manual review revealed that most instances "were well within the range of accessibility." Pages and their links were accessible with scripts turned off, and the layout and organization follow W3C guidelines.
Priority 1 problems:

Lack of alt text for many graphics

Lack of transcripts for audio files and captions for video files

Priority 2 problems:

Link text does not always make sense when read aloud (e.g. "go" button)

Many graphics are used to represent text.

They also found that the site does not render as cleanly in Firefox and Safari as it does in Internet Explorer. Note that they do not mention trying the site with a screen reader, nor do they discuss keyboard shortcuts.

Recommended practices:

Use table header tags (<th>) for the first row of a table to create column headings, ensuring that assistive software correctly associates the information in each table cell with its heading.

Add summary attributes to describe tables for users who rely on non-visual media such as speech or Braille.
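A minimal sketch of both table techniques (the table content is invented; summary is an HTML 4 attribute that is read by screen readers but never displayed):

```html
<table summary="Project meeting dates, one row per month">
  <tr>
    <!-- th in the first row marks these cells as column headings -->
    <th>Month</th>
    <th>Date</th>
  </tr>
  <tr>
    <td>January</td>
    <td>18</td>
  </tr>
</table>
```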

Add label controls to describe form elements (such as input fields) that display on the screen. Screen readers can read these to users.

Add label controls to specify access keys (keyboard shortcuts that make the associated form element the active field on the page).
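The two label recommendations can be combined on a single form field (field names invented for illustration): the for attribute ties the label text to the input, and accesskey gives the pair a keyboard shortcut.

```html
<form action="/search" method="get">
  <!-- Screen readers announce "Search terms" when the field gains focus;
       the access key (e.g. Alt+S in many browsers) jumps to the field -->
  <label for="q" accesskey="s">Search terms</label>
  <input type="text" id="q" name="q">
  <input type="submit" value="Search the site">
</form>
```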

Define alt text when adding an image to a page.
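For example (filenames invented), a meaningful image gets descriptive alt text, while a purely decorative one gets an empty alt so screen readers skip it:

```html
<!-- Meaningful image: describe what it shows -->
<img src="toolkit-cover.gif" alt="Cover of the ASCLA accessibility toolkit">
<!-- Decorative image: empty alt text, so screen readers pass over it -->
<img src="divider.gif" alt="">
```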

Separate HTML formatting from content with a stylesheet using the Style button in the contribution editor.

Create text labels for images and text versions of links defined on an image map.
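A sketch of an image map handled this way (map regions and URLs invented): each area gets its own alt text, and the same links are repeated as plain text below the map.

```html
<img src="nav-map.gif" alt="Site navigation" usemap="#nav">
<map name="nav">
  <area shape="rect" coords="0,0,100,40" href="/toolkit" alt="Toolkit">
  <area shape="rect" coords="100,0,200,40" href="/checklist" alt="Checklist">
</map>
<!-- Text versions of the image-map links for non-visual users -->
<p><a href="/toolkit">Toolkit</a> | <a href="/checklist">Checklist</a></p>
```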

Use external stylesheets to format content, and make sure that the content is legible when the stylesheet is not available. Use HTML for structural purposes (e.g., nest H2 headings under H1 headings), not for formatting.
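Sketched as a page skeleton (file and heading names invented): all formatting lives in the external stylesheet, and heading tags are used only to mark structure.

```html
<head>
  <title>Accessibility Toolkit</title>
  <!-- Presentation belongs in the stylesheet, not in the markup -->
  <link rel="stylesheet" type="text/css" href="styles.css">
</head>
<body>
  <!-- Headings express structure: H2 sections nest under the page's H1 -->
  <h1>Accessibility Toolkit</h1>
  <h2>Service Animals</h2>
  <p>...</p>
  <h2>Assistive Technology</h2>
</body>
```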

Avoid effects that cause the screen to flicker or blink.

Include a link to download the appropriate viewer/reader when the page includes documents that require applets or plugins. Make sure that the page is usable when these elements are turned off.
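For instance (document names invented), link to the needed viewer next to the document, and give embedded content a plain-HTML fallback:

```html
<p>
  <a href="annual-report.pdf">Annual report (PDF)</a>
  (requires a PDF reader;
  <a href="http://get.adobe.com/reader/">download Adobe Reader</a>)
</p>
<!-- If the plugin is unavailable, the fallback content inside
     the object element is shown instead -->
<object data="schedule.pdf" type="application/pdf">
  <p>Your browser cannot display the embedded schedule.
     <a href="schedule.pdf">Download the PDF</a> instead.</p>
</object>
```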