Accessibility Hack Days

Our recent Accessibility Hack Days in Birmingham brought together developers, user experience designers, accessibility specialists and first-hand users of assistive technologies with both visible and invisible disabilities. The event aimed to generate ideas and discussion about the issues involved in designing technology solutions for people with disabilities. This diverse mixture of perspectives produced a range of projects and debates, which led to tangible solutions and even some working code.

In this event report we aim to summarise the key developments and give you a flavour of the event. Our thanks go to our partners at OpenDirective, who supported us with the organisation of the event.

Open Accessibility

Steve Lee, OpenDirective

Steve Lee opened the event by discussing the benefits of open development, including agility, diversity and, ultimately, sustainability. This leads to user innovation, as the users are first class citizens in the development process. He expressed the hope that this event will act as a taster of this process.

To inspire us, he showed a video about GPII – a global public inclusive infrastructure, which will use the cloud to help more devices become accessible for more users by giving users a personalised interface.

Keynote 1:
Assistive Technology: Impact and Trends

David Banes

Banes provided us with a wider picture of accessibility, noting that whilst we talk about issues like inclusive infrastructure and global solutions, we often don’t question what that actually means. He showed a video to provide a sense of scale and to explore the links between poverty and disability. This made the powerful point that if you have a disability you are part of the largest minority in the world.

Banes identified several key trends, including a demand for portable solutions (not just mobile), a demand for cloud-based solutions, a demand for global solutions that allow people to move around the world, and a demand for personal solutions, as the disability is only one part of that person. He noted that fewer people now use just one product; instead, they have systemic solutions.

He provided us with several case studies, including Mohamed from Doha, who was filmed speaking in Arabic. Banes used this to emphasise that language and culture are really important when considering global accessibility. He highlighted the limited amount of free open source developmental work available to turn to in developing countries, as it can be very difficult to adapt existing systems into languages like Arabic. Designing for people means not just thinking about the disability that someone has, but also the whole person – their age, their experience, their language and culture, the cost, the platform and the setting.

Banes gave the group some practical advice: consider issues such as how the product will be localised, how it will be distributed, how potential users will be informed and how training and support will be provided. He noted that most of the cost of a solution is not taken up by development, but by the distribution and support infrastructure surrounding the product, so it is important to think about the whole assistive technology ecosystem when designing.

He emphasised that open source is very good at producing unique and interoperable solutions which are modular and easily customised. However, if we are going to grow global access, we need to design “vanilla”, language-free technology that we can flavour.

He concluded by issuing the rally cry: “Let’s not design for a market: let’s design for communities, regardless of location.”

Lightning Talks

Participants were invited to give a lightning talk to describe their area of work or interest by way of introduction.

How people with cerebral palsy use online social networks
Makayla Lewis, PhD researcher, City University London

Lewis introduced her lightning talk by highlighting that 49% of the 16.46m internet users in the UK use online social networks, but many of these sites are not easily accessible to people with disabilities. She focussed particularly on people with more severe cerebral palsy who struggle with communication, making online social networks a potential route to greater integration into a community.

She discussed the interviews she conducted with people with cerebral palsy as part of her PhD research to examine how they use social networks. Many reported this was the only way they could maintain independent communication, communicate privately and reduce isolation.

Lewis concluded by briefly examining some of the contexts in which social networks are used and the problems her interview subjects identified, including slow input speed, poor UI features, a lack of appropriate help, and frequent, abrupt changes to the networks themselves, which can mean users suddenly find they cannot use a network without relying on a carer.

BNCI BrainAble project
Clare Folkes, AbilityNet

Folkes introduced us to the BrainAble project, a brain-computer interface designed to capture, amplify and digitise brain signals to provide commands to a computer. It can be used by people with locked-in syndrome, who wear an electro-cap that identifies the brain signals generated when they recognise things on a screen.

Folkes explained that she is interested in the application of this for deaf-blind people, as the necessary brain signals are generated through recognition, which could include touch recognition. Her aim is to find people interested in making accessible semantic web tools to collate resources about deaf-blind people across the world and the different ways they communicate, to help further this and similar work.

WAC Mobile Device APIs
Scott Wilson, JISC CETIS

Wilson introduced us to W3C widgets, which support cross-device applications by providing one package for an application that can then run in all sorts of contexts. He provided an overview of the .wgt file used to create a widget, which includes HTML, JavaScript, CSS, a config.xml file and an icon, all wrapped in a zip file.

He discussed some of the key features, including extensibility, deep i18n/l10n support and user preferences, and listed some implementations, including Opera 11, Apache Wookie, BlackBerry, Android, Nokia S40 series devices and interactive whiteboards.

He also outlined how widgets can use device APIs to help a website gain programmatic access to native features without requiring native code.
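To make this concrete, a minimal config.xml for a W3C widget might look like the following sketch (the id, names and file names are illustrative, not from the talk):

```xml
<!-- Minimal W3C widget configuration (illustrative values) -->
<widget xmlns="http://www.w3.org/ns/widgets"
        id="http://example.org/hello-widget"
        version="1.0">
  <name>Hello Widget</name>
  <content src="index.html"/>
  <icon src="icon.png"/>
</widget>
```

Zipping this file together with index.html, the icon and any scripts or stylesheets, then renaming the archive to hello.wgt, produces a package that widget runtimes such as Apache Wookie can install.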

Templates for accessible W3C Widgets
Ross Gardler, OpenDirective

Following on from Wilson’s description of W3C widgets, Gardler outlined the Rave in Context project, which aims to build useful widget templates that have accessibility baked in and embed good practice. They hope these will help to address accessibility issues even when the developer doesn’t care that much about accessibility. He observed that making widgets accessible can also help people without a disability who are working in an environment that poses a similar barrier (e.g. a loud environment).

Gardler appealed to participants for their help so he can understand the biggest issues and therefore see what they should be tackling first. Some templates were available for people to play with throughout the event.

Sense and the SenseBoard
Shirley Evans, JISC Techdis Associate

Evans provided an outline of her work in connection with the Open University’s new level one course “My Digital Life” which presents a range of accessibility issues in the context of distance learning. The course uses a piece of software called Sense, together with a piece of hardware called the SenseBoard. These are designed to help give people a grounding in software development as an introduction to further programming courses.

A SenseBoard

Evans observed that this is a very visual interface, designed to make it easier to learn programming concepts. However, it can be very difficult for people with a range of disabilities to use. She is investigating whether there is a solution within this environment, by adding accessibility features to the software, or whether an alternative approach could make introductory programming more accessible without relying on the traditional linear way of teaching. She appealed for any ideas and suggestions to help her with this.

In this video interview, Shirley describes some of the ideas that were generated by the event following her lightning talk…

Accessibility Block for Moodle
Mark Johnson

Johnson introduced his work to create an accessibility block for the virtual learning environment, Moodle. He had observed that there were some obvious features missing that would make Moodle more accessible. Many of the relevant features were hidden away, so the aim of his plugin is to bring these “into the user’s faces” more.

He demonstrated his accessibility plugin, which saves your preferences for future visits to the site. In the new version of Moodle he has also integrated the ATbar, developed at the University of Southampton, a JavaScript tool that allows you to add a toolbar with whatever functions you like into any page. This is relatively new, so it only has a few features so far, including zooming, text-to-speech and font changing.

He emphasised that he would like to hear ideas about ways of integrating further or better tools, or if anyone would be interested in translating the tool.

Keynote 2:
Accessibility of native HTML5 multimedia

Bruce Lawson, Open Web Evangelist for Opera

Lawson provided an overview of the features of the HTML5 specification which aim to make multimedia on the web more accessible, focussing on the new video element.

He began by examining the current embed code used for video, which he described as “minging”, violating the DRY (don’t repeat yourself) rule, just for starters.

“It is what your grandmother would use!”

He compared this to the new HTML5 video element, discussing the functionality it provides, the fallback mechanisms for browsers which do not support HTML5, and recommended techniques to maximise efficiency when designing for mobile.
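As a sketch of the kind of markup Lawson described (the file names are placeholders, not from his talk), native video with a fallback might look like this:

```html
<!-- Native HTML5 video with multiple sources; browsers that do not
     support the video element render the fallback content instead. -->
<video controls>
  <source src="talk.webm" type="video/webm">
  <source src="talk.mp4" type="video/mp4">
  <!-- Fallback: shown only by browsers without HTML5 video support -->
  <p>Your browser does not support HTML5 video.
     You can <a href="talk.mp4">download the video</a> instead.</p>
</video>
```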

Lawson went on to emphasise why native video is important. Native video gives you full control, including keyboard access to the video, which is difficult to achieve with Flash movies in non-IE browsers. He also noted that you can control it using other web standards such as CSS 3, and described how you can use HTML5 form features to create custom controls that look good and are automatically keyboard accessible.

Not all browsers support these features yet, but Lawson emphasised that the golden rule of HTML5 is that you can use feature detection behind all of the cool stuff and patch with JavaScript if something isn’t supported. He used the example of the subtitles feature, which is currently in development. This enables you to add subtitles in the browser using a WebVTT file, which is just text, rather than burning the subtitles into the video. You can style it, increase the font size, use vertical text for Chinese, and index it. This is effectively internationally accessible out of the box, in the browser.
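A WebVTT file of the kind Lawson described really is just plain text; a minimal illustrative example:

```
WEBVTT

00:00:01.000 --> 00:00:04.000
Hello, and welcome to the session.

00:00:04.500 --> 00:00:08.000
Today we are looking at native HTML5 video.
```

Such a file is wired to a video with a `<track>` element, e.g. `<track kind="subtitles" src="talk.vtt" srclang="en" label="English">` placed inside the `<video>` element.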

He concluded by providing an overview of other features in development, including synchronised videos, which would provide the functionality to synchronise multiple videos on the same web page, allowing the user to play two videos (one providing sign language interpretation for the other) in sync with each other.

Keynote 3:
Accessibility, Inclusivity and Interoperability

Sandi Wassmer, Copious Ltd

In her motivational keynote, Wassmer reflected on the dynamic of the day. She emphasised that accessibility should be a right for all: whereas most people see accessibility as being about disabled people, it is actually about everybody.

Wassmer observed that whilst we have come a long way – citing the example from web development, where most content management systems now separate style from content – accessibility is still associated with things that people don’t understand, as it is very subjective. However, in her own work, accessibility is not something Wassmer considers in and of itself. Instead, her approach is to connect accessibility with content strategy, user experience and how people interact with your brand when they visit the website. The aim is to deliver the best website they can, accessible to as many people as possible; she noted that a website accessible to absolutely everyone would just be text, which would be ugly.

Wassmer went on to make clear the connection between open standards and accessibility. She explained that open standards should be at the core, as this will lead to greater accessibility and interoperability, but this is not just the responsibility of the developers. Even the AT companies do not necessarily engage with open standards at present, but this will be crucial if we are to achieve proper accessibility, rather than silo solutions.

In conclusion, Wassmer emphasised that there were some really interesting ideas shared on day one. However, she observed a discord between the developers, who want to get on and fix problems, and others, who want to talk about the issues and the users. She urged the developers to explain what they are doing, and the non-technical domain experts to look for manageable things to solve. This led to an open discussion about articulating processes to help generate further ideas.

Keynote 4:
Experiences of Open Accessibility Projects

Julian Harty, eBay

Harty, who has been involved in open source development since around 2007, began by giving us a practical example of an open source project by describing the evolution of his DAISY reader project, which now exists in three versions. He moved on to discuss the difficulties faced when attempting to include disabled people in open source development, highlighting the barriers that exist and the benefits to the individual that can be gained if they are able to contribute.

Harty observed that to get a sighted person set up and ready to contribute to an open source project can take about two days. By comparison, getting a visually impaired person involved is more difficult due to the variety of websites and tools required to make a useful contribution. These may not be easy to use with a screen reader, particularly if they are rich, command-driven editors.

He described his attempts to make it easier to contribute to his project, including creating a wiki page to explain how to contribute translations easily, highlighting that key success factors include finding ways to make contribution as quick and easy as possible, and encouraging people to feel like part of the development team.

Harty observed that open source work can act as a testament to the quality of your work, which can help in job interviews. There are tools that aggregate these contributions, creating a trustworthy record of your work that can help you get a better job. This makes it important to ensure that such projects are accessible to contributors with disabilities.

He concluded by discussing the considerable licensing and intellectual property issues associated with making contributions to open source projects. He noted the responsibility of the developer to know what they are giving away and to check what contractual restrictions may apply with their current employer before contributing.

A Word of Warning

Neil Williams from Toby Churchill Ltd provided a commercial perspective on the developments at the event, observing that it is important to consider the assumptions you are making at the earliest stages of the design process. He noted that reality has to come into the design, as design assumptions that affect how useful the tool is in practice can be difficult to unpick later.

Outcomes

“I am the problem space”

Ross Gardler made the point that it is easy to assume that everyone knows what we know, but it is important to remember and communicate with those who don’t have the time to become domain experts. He described himself as part of this problem space, but hopes he is becoming part of the solution.

VTT Video Caption Creator

Scott Wilson explained his moment of revelation following Bruce Lawson’s presentation about HTML5 video. He hadn’t realised before that video on the web was not video in the web, whereas HTML5 makes it truly part of the web. To exploit this, he created a prototype video subtitle editor, VTT Caption Creator, which enables users to play a video, stop it, add a caption and then resume, generating a subtitle text file at the end to use in their HTML.

He hopes this will make it easier to get more video content subtitled, which is currently an expensive and laborious process.
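At its core, a caption editor of this kind mainly has to turn pause/resume times into WebVTT cues. A minimal sketch in JavaScript (formatCue is a hypothetical helper, not Wilson’s actual code):

```javascript
// Format one WebVTT cue from start/end times (in seconds) and caption text.
// Hypothetical sketch - not the actual VTT Caption Creator implementation.
function formatCue(startSec, endSec, text) {
  const ts = (s) => {
    const h = String(Math.floor(s / 3600)).padStart(2, '0');
    const m = String(Math.floor((s % 3600) / 60)).padStart(2, '0');
    const sec = (s % 60).toFixed(3).padStart(6, '0'); // e.g. "04.000"
    return `${h}:${m}:${sec}`;
  };
  return `${ts(startSec)} --> ${ts(endSec)}\n${text}`;
}

// Joining cues under a "WEBVTT" header yields the subtitle file.
const vtt = 'WEBVTT\n\n' + formatCue(1, 4, 'Hello, and welcome.');
```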

In this video interview, Scott explains how the event led him to develop the VTT Caption Creator, and how easy it actually was…

Accessibility Control Widget

Mark Johnson worked with others to adapt his Moodle accessibility block, turning it into a W3C widget and hosting it on the Wookie server. This means it can now be embedded in other environments. Laura Dickinson also helped to style it so it looks more attractive.

This development helps users by enabling them to customise any site using the widget to suit their needs, and helps web designers by providing a simple plugin to make their sites more accessible.

ATbar

Sebastian created a bookmarklet to add the ATbar toolbar to a browser, including a function to create a list of all headings and links on a webpage. He hopes this will help people who want keystroke functionality but can’t install JAWS for any reason, as it works through the browser, or as a mouse alternative to give people JAWS-style functionality. The code is already available on GitHub.
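For readers unfamiliar with the technique, a bookmarklet is just a javascript: URL that injects a script into the current page. A hedged sketch of how one is built (the script URL and helper name are placeholders, not the actual ATbar code):

```javascript
// Build a bookmarklet URL that injects a toolbar script into the current
// page when clicked. Hypothetical sketch; the script URL is a placeholder.
function makeBookmarklet(scriptUrl) {
  const body =
    "(function(){var s=document.createElement('script');" +
    "s.src='" + scriptUrl + "';" +
    "document.body.appendChild(s);})();";
  return 'javascript:' + encodeURIComponent(body);
}

// Saved as a browser bookmark, clicking it loads the toolbar on any page.
const link = makeBookmarklet('https://example.org/atbar.min.js');
```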

JAWS Menu System

David Yates developed a JAWS menu system to help beginners, which, as Mark Wassmer pointed out, could be extended to help developers test their websites without knowing the keystrokes. He is going to pursue this as a result of the event.

Text Communication

David Banes identified a real life scenario he had been trying to solve for two deaf people in Qatar, who wanted to communicate with each other using text through handheld devices, preferably via a Bluetooth connection, as Wi-Fi isn’t always available.

Through discussions at the event, he found a solution: using an iPhone or iPod Touch, installing the Arabic keyboard layout, then using existing applications to connect and facilitate the chat. This problem had taken weeks of research, but was solved quickly through the fresh perspectives of other participants at the event.

In this video interview, David explains how the event helped him solve this problem, and summarises his key points from his keynote presentation…

WAC Standards Tool

A group of developers produced a prototype tool for testing websites against WAC standards to see if the site is accessible. They hope to launch this as a toolkit or web service, which could be run against very large sites automatically.

DAISY Reader Keyboard Control

Julian Harty demonstrated his extension to the DAISY reader project, providing keyboard support which in turn will enable users with motor impairments to control the system using click pads.

This video shows the system in action, accompanied by an explanation of some of the issues encountered during development.

FriendlySpaces

The FriendlySpaces project follows on from the Spot It, Share It Android app developed at an event last week. The group carried out user research to examine how people experience difficulties accessing different spaces, and devised a way for people to find friendly spaces to suit their needs and then share information about those spaces. They produced a mock-up of a recommendation system that takes previously stored preferences and compares them with information about a particular room to determine whether that space is suitable for the person.
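The matching step in such a recommendation system can be very simple. A sketch of the idea (the function and preference names are hypothetical, not the group’s actual mock-up):

```javascript
// Decide whether a space suits a user: every stored accessibility need
// must appear among the room's known features. Hypothetical sketch.
function isSuitable(preferences, room) {
  return preferences.every((need) => room.features.includes(need));
}

const room = { name: 'Seminar Room B', features: ['step-free', 'hearing-loop'] };
const ok = isSuitable(['step-free'], room);             // suitable
const notOk = isSuitable(['step-free', 'quiet'], room); // not suitable
```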

In this short video interview, Makayla Lewis explains in more detail how the project evolved, and highlights what she hopes to have achieved through the event…

#a11yhack Community Activity

Brian Kelly discussed the ways in which people have collaborated throughout the event and the sustainability of the communities which have formed. He looked at how we can provide mechanisms for participants to help them curate their own communities, including identifying contacts that have been taking part, looking at their profile of usage over the event, their use of geo location, the applications they use and the resources they shared.

FocusIn

Web developer Laura Dickinson was inspired to develop FocusIn directly after attending the event. The tool works within a web browser to highlight a selected area, making the text easier to read. You can try the demo here.

Screenshot of Laura Dickinson's FocusIn tool

Conclusions

The event aimed to give a taste of what open development is about and generate dialogue about accessibility issues by bringing together developers and accessibility experts.

Whilst there were concerns about the ways and extent to which the developer community currently engages with users with disabilities, the outcomes of this event demonstrated the wide range of innovative solutions that can be created when such a diverse group of people come together to talk and learn from each other. We are very keen to learn from this event and welcome all feedback.