Making an Application Accessible on the iPhone 3GS

March 26th, 2010

Introduction

When I began working at the Norwegian Computing Center as part of the e-Inclusion group, I became
interested in the accessibility issues that surround mobile phones. I noticed that one of the
“exciting features” coming to mobile phones was the touchscreen interface, and I wondered how such an
interface could be used by people who cannot see. When I casually put this question to a friend who
was showing off his new Google phone, he immediately shot back, “well, I wouldn’t get a touch screen
phone.”

I felt this wasn’t a very good answer. If phone manufacturers want to make touchscreen phones, it
seems that eventually they would be all you could get. I had heard about the new accessibility
technology in the latest iPhone: Apple claimed it was possible for someone who is blind to use the
phone. I wanted to see how this worked, and also how well it would work for the third-party
applications that can be installed on the phone.

This is a report about my experiences making an application accessible on the iPhone 3GS. It is
part of the Universal ICT course from MediaLT. We first present background on both the
VoiceOver technology on the iPhone 3GS and the application, SeSam4. We then discuss running
SeSam4 without any consideration for accessibility. In the changes section, we detail the
changes that were made to the application. For completeness, we briefly discuss the final version
of the application. In the conclusion, we provide some final thoughts about the whole process and
what more could be done.

Background

Here we review both the assistive technology (AT) that was used in this project and
the application itself. The AT is available for anyone to try; the application is part of a
research project and there are no current plans to distribute it.

VoiceOver

When Apple released the iPhone 3GS in June 2009, it included a screen reader called VoiceOver.
VoiceOver originally premiered on Mac OS X, but it has been modified to work with gestures on the
iPhone. Unlike AT on other phones, VoiceOver comes standard on the iPhone 3GS: there is no
additional software to buy, and one only needs to activate VoiceOver to use it. This can be done on
the phone itself, or when it is connected to a Mac or PC through iTunes (which is itself accessible
through screen readers). As a further bonus, VoiceOver on the iPhone includes voices for 21
languages (Norwegian included).

VoiceOver uses gestures to interact with the phone. The typical interaction involves tapping the
screen to select an item and then double-tapping to activate it. Other gestures help with
navigation. For example, you can flick with three fingers in the direction you want to
change pages (up or down, left or right). Flicking up with two fingers will read all the items on
the screen, or you can drag your finger across the screen to get an idea of the various elements on
it. There is also a rotor control, typically used in the web browser to “dial in” the level of
granularity at which you want to browse (for example, links only, form elements, or headings only).
Excluding the rotor control, there are 19 key gestures to know about; including the two rotor dials
(left and right), that makes 21 gestures. In most circumstances, though, you use only a few of
them. This makes VoiceOver significantly less complex compared to the keyboard commands for
VoiceOver on the Mac [1].

It takes a little bit of time to get used to this different way of interacting with the iPhone, but
one can pick up the most commonly used gestures quickly and begin to use the applications. Apple has
ensured that the applications that ship with the iPhone are accessible, and it provides a developer
API for making applications on the App Store accessible as well. This is expanded on below.

The Application

The application was written as a prototype for the SeSam4 project and is called SeSam4. The idea of
the application is to give a tourist information about what is happening at the moment in the
surrounding area. Since it is a prototype, some of the functionality is missing.

This information is presented in two ways: as a list of events with the time and location, or as a
map showing the location of each event. Tapping an event in the list brings up a view with detailed
information.

The detailed information view should provide more information about the event. In the current
version, it loads an external web page; in theory, the idea would be to have a standard format for
these pages. On this page, pushing the options button presents a dialog asking if you want to open
the detailed information page in Safari, get directions to the place in the Maps application, or
add the event to favorites. None of this functionality is currently implemented.

The map view is similar to the Maps application provided by Apple. It shows the
standard map view with pins indicating where each event is taking place. Tapping a pin pops up a
little window that shows the name of the event, a short description, and a button for more
information. Tapping that button shows the same detailed information as the list view.

There is also an “Interested In…” button that presents a dialog for selecting what sort of events
the user is interested in. The list includes:

Concerts

Films

Guided Tours

Theater

Other

There is also a selector for choosing the distance that the user is willing to travel. This is
divided into distance accessible by foot, bike, or car. This functionality is not currently
implemented.

Running SeSam4

Disclaimer

One thing to keep in mind when I talk about how well or badly VoiceOver performs is that I am a
sighted person using it. I am also the developer of the application in question, so there is likely
some bias in what I say. I tried to be impartial in my evaluation, but some sighted or developer
preferences may come through. One useful VoiceOver feature is that a three-finger triple tap on the
screen activates screen curtain mode. This disables the display and makes the user dependent on the
information presented by VoiceOver, which keeps you from using your eyes to cheat. As a bonus, it
probably increases battery life.

First Test Run

Before testing the SeSam4 app, I used VoiceOver with some other apps to see how well they fared and
to get some experience with the various gestures. Some applications work very well (like the
Trafikanten app) and some less so (the yr.no app does not speak the
hourly weather information). Others, like some games, are not accessible at all. Still, it seemed
possible to get some sort of information out of most apps.

After experimenting with VoiceOver and gaining some experience with the gestures and how it
works, I loaded up the SeSam4 app to see how well it could be used “out of the box,” without any
extra work done to make it accessible. The results were a bit surprising.

The list of events was fairly accessible: VoiceOver read all the information for each item in the
list, i.e. the event title, the location, the time, and the description. This was OK, though it was
a lot of information to process and the presentation was confusing. The information in the labels
is formatted for visual presentation: the row layout helps you interpret what the information
means, so hearing it all read as one line is disorienting. You also do not get any indication that
tapping an item (double-tapping in VoiceOver) would load the details page. Since I wrote the
application, I knew this, and it did work if I double-tapped, but others would not have this
knowledge.

The detail view worked about as well as expected, since it loads arbitrary web pages. However,
VoiceOver would only read the “Back” button when switching over to that page. The dialog for
showing the page in Safari, etc., worked fine.

The map view was interesting. By default it opens centered on your location and shows the nearby
events. Finding these events when you could not see the screen was very difficult. The pins that
indicate an event did not have a label to identify themselves, though they did have a hint
explaining that double-tapping would show more information. However, this hint was only spoken if
you paused for a second on the pin. If you were dragging your finger across the screen to find one,
you would likely just pass over it and back onto the map. This meant hearing “Map,” followed by a
sound indicating that you had come upon something, followed immediately by “Map” again. It was very
confusing. There are other ways to navigate, such as walking the list of elements, but I did not
try them. I had to turn off the screen curtain to see the pins on the screen and figure out what
was going on.

If you did double-tap a pin, there was an activation sound. Unfortunately, you would not know where
the pop-up was; you had to find it yourself. It had two labels, one for the event name and another
for the description. You also had to find the button for getting more information, but all you
would hear was, “Button, double tap for more info.”

The preferences dialog was also somewhat accessible. The list of kinds of events has a switch for
deciding whether that kind of event is included. However, the switch suffers from a similar problem
as the more-info button in the map view: you hear, “Switch, double-tap to switch,” but do not know
what you are switching. It is possible to figure out which kind of event a switch belongs to, but
it is potentially confusing.

In summary, it was possible to navigate around the application, but it was far from perfect. On one
hand, if you use the standard views in building your application, you get a lot for free and are
probably at least halfway to an accessible application. On the other hand, for this application,
the information was confusing, and you would likely never get to the details screen. It worked as
well as it did only because of my inside knowledge of how the application works.

Changes

I made some changes to the application to make it more VoiceOver-friendly; they are presented
below. Most of the information is technical in nature, but hopefully it will still make sense. The
majority of the information on making an application accessible can be found in Apple’s
Accessibility Programming Guide for iPhone OS. This guide is essential if you
want to ensure the accessibility of your application.

Making the List View More Accessible

The list view is actually a “table view” in the application. Each event is a table cell consisting
of multiple labels, and tapping a cell shows extra information. Making this accessible required two
things: setting extra information about what the table cell can do, and making its label easier to
understand.

Each view has a set of traits that identify how it can be used. Since tapping the table cell
performs an action, we can think of it as a button. If we add this trait to the cell, VoiceOver
will read the cell’s label and then say that it is a “Button,” with a hint to “double tap to
activate.” It does not matter whether it looks like a traditional button visually (it has a visual
cue to indicate it can be tapped); the user will interact with it as a button.
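In code, this amounts to overriding the cell’s accessibility traits. A minimal sketch in
Objective-C, assuming a hypothetical UITableViewCell subclass named EventCell (the trait constant
is part of Apple’s UIAccessibility API):

```objc
// EventCell.m -- hypothetical UITableViewCell subclass for an event row.
// Adding UIAccessibilityTraitButton makes VoiceOver announce the cell as
// a "Button," so the user knows it can be activated with a double tap.
- (UIAccessibilityTraits)accessibilityTraits {
    return [super accessibilityTraits] | UIAccessibilityTraitButton;
}
```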

The second fix was to create a better label for the cell. As mentioned above, the cell contains
labels that work well visually but can be confusing when listened to as one continuous line. Since
this information is created dynamically via a query, we need to generate the label dynamically as
well. The fix was to create a label that makes sense as it is read; one way to think about it is
how you, as a person, would read this information out loud.

After the changes, the table cell should be more accessible. An example from before would read:

Tour of the Royal Palace, Royal Palace, two hours, See the wonders of the royal palace

After the changes, the label instead reads:

Tour of the Royal Palace, at the Royal Palace, beginning in two hours, See the wonders of the
royal palace, Button

Obviously, the second label makes more sense. You also know that it is possible to interact with the
cell as a button.
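One way to produce such a label, sketched in Objective-C where the cell is configured (the event
property names here are made up; the accessibility properties are standard UIKit):

```objc
// Compose a single label that reads naturally when spoken, instead of
// letting VoiceOver run the visually formatted labels together.
cell.isAccessibilityElement = YES;
cell.accessibilityLabel = [NSString stringWithFormat:
    @"%@, at the %@, beginning in %@, %@",
    event.title, event.location, event.timeUntilStart, event.summary];
```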

Making the Preference View More Accessible

The preferences panel is very similar to the Settings app on the iPhone: a list of labels, each
with a switch. However, the switches did not say which label they belonged to. This was very easy
to fix: since the labels are all static, it was simply a matter of setting each switch’s
accessibility label to the corresponding label text, and the panel behaved as well as the Settings
app.
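Assuming an outlet for each switch, the fix could look like this (the outlet name is illustrative):

```objc
// Give the switch a spoken name so VoiceOver announces, for example,
// "Concerts, Switch" instead of only "Switch, double-tap to switch."
concertsSwitch.accessibilityLabel = @"Concerts";
```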

Making the Map View More Accessible

The map view needed a bit more work. The first issue was with the pins: they were easy to skip over
because, by default, they have no accessibility label. Adding a label with the name of the event
suddenly made the map much more usable. I also updated the more-info button in the pop-up to say
which event it was connected to, as in the preferences view. You still have to find the pop-up
yourself, but finding the pins is much easier.
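The pin label can be set where the annotation view is created, typically in the MKMapViewDelegate
callback. A sketch under manual reference counting, as was standard at the time (the reuse
identifier is illustrative):

```objc
// Create a pin for each event annotation and give it a spoken label.
// MKAnnotationView is a UIView subclass, so the same UIAccessibility
// properties apply to it as to any other view.
- (MKAnnotationView *)mapView:(MKMapView *)mapView
            viewForAnnotation:(id <MKAnnotation>)annotation {
    MKPinAnnotationView *pin = [[[MKPinAnnotationView alloc]
        initWithAnnotation:annotation
           reuseIdentifier:@"EventPin"] autorelease];
    pin.canShowCallout = YES;                    // pop-up with name and description
    pin.accessibilityLabel = [annotation title]; // spoken event name
    return pin;
}
```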

Miscellaneous Changes

After making these main changes, I noticed and fixed some other problems as well. I added a title
to the detail view. I also did a little extra work to make the application localizable, which means
that, in theory, when the application is translated into other languages, the custom accessibility
labels will be translated too. This also allowed me to fix some other problems with the labeling of
preferences. Finally, I noticed that the map view was not providing time information in its pop-up,
so I updated the label (both the visual and accessibility versions) to include that as well.
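Localizing an accessibility label uses the same NSLocalizedString mechanism as any other
user-visible string; for example (the key and button name here are illustrative):

```objc
// The label text comes from Localizable.strings, so translating the app
// also translates what VoiceOver speaks for this button.
moreInfoButton.accessibilityLabel =
    NSLocalizedString(@"MoreInfoLabel",
                      @"Spoken label for the map pop-up's more-info button");
```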

Testing Changes in the Simulator

While the VoiceOver screen reader is the best way to test your app, that does not help much while
you are in the middle of creating the app, or if you do not happen to have an iPhone 3GS available
for development. In those cases, you can use the iPhone Simulator for testing the changes. The
Simulator does not have VoiceOver, but it does have a tool called the Accessibility Inspector, a
floating window that you can toggle on and off. When it is active, you can click on the various
items on the screen and it will display the accessibility information it has for them.
Double-clicking an item will activate it, just like double-tapping in VoiceOver. This is useful if
you want to check the information provided and make changes without constantly installing a new
build on the device. However, it is not a substitute for using VoiceOver: once you think the
information is correct in the Inspector, you need to install the application on the device and test
it there (preferably with multiple people).

Final Version

Testing the final version was like running an entirely different application. I could navigate
around the app with no problem at all. I could listen to an element, understand what was going on,
and know what sort of gesture to perform to change things. I could even use the map view and find
elements. I was able to use the app with the screen off and felt that the application was much
better. I doubt I had solved all the accessibility issues, but I had certainly brought the
application to a different level.

Conclusion

To summarize my thoughts on this project, I would say that making the SeSam4 iPhone application
accessible was straightforward. The biggest investment is learning how to use VoiceOver and reading
Apple’s accessibility guide. After that, it is mostly a matter of going through the application and
seeing how well things work. If you use Apple’s components and libraries when building the
application, that cuts down on a lot of the work, leaving you to tweak the final bits. Apple also
provides good tools for testing accessibility. The screen curtain on the iPhone is probably the
best way a developer can test an application for accessibility issues outside of real user testing.
Learning how to use VoiceOver was quick: it takes a little while to get used to, but you can soon
make your way around, and the longer you spend working with it, the better you become. By the end I
had cranked up the voice speed, because I felt it was too slow otherwise.

Improving the accessibility helped the usability of the application overall. For example, when
using the map view, I noticed that the events did not list the times they started. We were trying
to present the same information as the list view, but we were omitting an important piece of it,
and I did not notice until I heard the information being read to me. There were also places where
label names that worked well for a programmer would make little sense to a user; I changed them to
something more sensible. In other places I had left a label blank; this looked fine visually, but
with only VoiceOver to tell you what was going on, it became difficult to figure out where you
were. Adding the missing labels did not hurt the visual presentation either.

I should also point out that even though it was easy to test the application, and I am sure that I
fixed several of the issues, this really cannot substitute for testing by someone who is blind and
uses the iPhone daily. They would have insight that a sighted developer simply does not have. For
example, I spent time trying to make the map view accessible, but I do not know if my changes are
useful to someone who is blind. Would they even bother using the map view, or would the list be the
only thing they would use? Do my labels hold so much information that they become tedious to listen
to? Is there something else I simply missed? These are questions I cannot answer, but an expert
blind user could. I am sure that even regular user testing would uncover other things I have not
taken into account at all.

I also realize that making my application accessible for blind people does not make it accessible
for everyone. What if someone is both deaf and blind? VoiceOver would not be useful for them. If
you lack the use of your limbs, it might not work so well either. However, future ATs for the
iPhone would likely use the same accessibility information, so this work probably helps
future-proof the application for other users. Creating something universally accessible is
certainly a journey, and one that may not have an end.

While I know that every application is different and my experiences may not match others’, I would
recommend that iPhone developers take some time to make their applications accessible and to
maintain that accessibility. It is a little bit of work, but the results are worth it. Your
application will be better and can be used by blind users, and likely others, in the future. It is
a nice reward for the time put in.

[1] Apple has enabled gesture support for VoiceOver using the trackpad on its portable computers in
the Snow Leopard version of the operating system. So it is possible to use the gestures now on Mac
OS X, but I did not try this out.