Sep 13, 2014 | Towards a more accessible map

The views represented here are my own, and not of my employer.

I don’t think a single day goes by when I don’t use Google Maps, and not merely because I work for Google. And as much as championing accessibility is something I wish I did more often, it’s not something I think about on a day-to-day basis. But a confluence of events, including hearing about the 20/20/20 project to provide free surgery to those suffering from cataracts, prompted me to explore the area in depth. An important note: since I am not blind or vision-impaired, the tools I use and the way I use them might differ significantly from how actual users work, so the best next step beyond this exploration is to work directly with blind and vision-impaired users.

The main task I gave myself was to find directions from my alma mater, Swarthmore College, to New York City. While I assess the merits and weaknesses of each platform, I also dive into some potential opportunities that go beyond our current solutions.

Desktop

A screenshot of the Google Accessibility Guide, which mentions that VoiceOver is a good screen reader for Macs

On my MacBook Air, I followed Google’s Accessibility Guide, enabled VoiceOver, and tried to navigate Google Maps by going to maps.google.com in a Chrome browser. There are some initiatives through OpenStreetMap to enable blind-friendly maps, but I couldn’t get to them, nor did I have ready access to screen readers that a blind person may already be using, like JAWS. The basic premise of VoiceOver is navigating the screen real estate using the keyboard; in my case, my main keys were Tab and Control + Option. I didn’t adjust any of the advanced settings that I knew some of my blind friends used (I know, for example, that the typical blind user will set the speech speed high, almost indecipherably high to my ears).

The first hiccup I noticed is that the Tab key wouldn’t get me anywhere, nor would the arrow keys bring me to any other section of the UI. I was thoroughly confused later when, tabbing through the UI, I heard mention of the Google Apps menu and the G+ profile section. Once I was in the detailed directions flow, I was given a flurry of steps, but VoiceOver picked up only the mileage of each step, completely ignoring the actual instructions. The layout of the site also meant VoiceOver spent a lot of time reminding me that the data was copyrighted to Google.
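
To make the failure concrete: a screen reader can only land on elements that can take keyboard focus, and it announces whatever accessible name those elements expose. Below is a minimal TypeScript sketch of how a list of direction steps could be made reachable by the Tab key and read instruction-first rather than mileage-first. The `DirectionStep` shape and `renderSteps` function are hypothetical illustrations of the idea, not actual Google Maps code.

```typescript
// Hypothetical sketch: exposing a directions list to a screen reader.
// Elements only receive Tab focus if they are natively focusable
// (links, buttons, inputs) or carry an explicit tabindex.

interface DirectionStep {
  instruction: string; // e.g. "Turn left onto Chester Road"
  distance: string;    // e.g. "0.4 mi"
}

function renderSteps(container: HTMLElement, steps: DirectionStep[]): void {
  const list = document.createElement("ol");
  list.setAttribute("aria-label", "Driving directions");

  steps.forEach((step, i) => {
    const item = document.createElement("li");
    item.tabIndex = 0; // reachable with the Tab key
    // Put the instruction before the distance, so the reader hears
    // "Turn left onto Chester Road, 0.4 mi" instead of just the mileage.
    item.setAttribute(
      "aria-label",
      `Step ${i + 1}: ${step.instruction}, ${step.distance}`
    );
    item.textContent = `${step.instruction} (${step.distance})`;
    list.appendChild(item);
  });

  container.appendChild(list);
}
```

The point isn’t this exact markup; it’s that focus order and accessible names are deliberate design decisions, and when they’re missing, a screen reader falls back on whatever text it can find, like copyright notices.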

I then followed up with a check of the Apple Maps desktop version, which has been a bundled app since the Mavericks release. Here, the integration with the desktop UI makes it much easier to navigate the map itself, as well as to access directions and move through the individual steps. Points of interest (POIs) were also called out in the UI, which is useful for knowing what’s available to search for in the area.

At this point, if I were trying to navigate my way through an unfamiliar city (for example, by searching for the nearest grocery store), I would probably save the directions onto my phone as a note and use that to remember how to get there.

iOS

On an iPad running iOS 7.1.2, I tested the latest versions of both Google Maps (v3.2.0) and Apple Maps.

A screenshot of the VoiceOver panel on iPad

Perhaps not surprisingly, VoiceOver integration on the iPad was much better with Apple Maps. Google Maps would not return any additional information to VoiceOver when tapping on the map, which meant that you either had to know where the UI elements were in order to engage with them, or you were stuck. When I tapped on various locations on the map, Apple Maps would highlight the area with a dark bar or speak the actual location.

A view of the iPad Apple Maps where there's a black bar across a highlighted section of the map

Worthy of note is the “Rotor” feature, where you use two fingers to rotate through an additional set of dials that give you a variety of on-screen categories, like headings or points of interest, or toggle between the voice reading words or lines. Since the dials are context sensitive, after selecting the points of interest option in the Apple Maps app you can swipe quickly upwards to navigate through the POIs available on the map. This feature is not available in the Google Maps app.

A screenshot of the iPad Apple Maps app with the rotor dial on top, with the text 'Containers'

A screenshot of the iPad Apple Maps app with the rotor dial on top, with the text 'Point of Interest'

Android

Similarly to iOS, Android has a feature called TalkBack that attaches a speech synthesizer to most of the UI components. Most of the usual single-tap interactions require two taps instead of one, and a swipe takes three fingers instead of one. I tested this using my Nexus 4, running Android KitKat.

Just using Google Now

A myriad of things were broken here, starting with the fact that Google Now interpreted TalkBack’s “Tap to speak button” prompt as an actual cue for search. The Google Now card was not tappable in any way, so it was completely useless.

Just using Google Maps

Here, I dive into the Google Maps app and try to complete my task. At one point I get directions results back (“Showing items 2 of 2”), but I have no idea what was returned, so I randomly tap around.

Using speech on Google Maps

Towards the tail end of the above video you’ll notice that I was getting a bit frustrated, and so I decided to resort to speech. I did a more thorough run-through in the video below.

On Android, tapped UI elements are highlighted in yellow. On iOS, they are bounded in a black frame.
A screenshot from an Android phone, with a yellow highlighted box near the compass, which is shown when TalkBack is activated

Opportunities at a meta level

A few key learnings emerged from this exercise.

One point you may raise is that the current solutions and implementations are hardly helpful even after resolving some of the glaring accessibility problems inherent in Google Maps. Maps are for wayfinding and orienting, and wayfinding with a visually rich UI doesn’t help someone who needs to know where and in which direction to head. Google Maps’s navigation tools will help, but they don’t help if a user can’t get to them. And our current UIs hardly help if a user can’t tell which direction is north, much less which direction is the right one to take.

Beyond the mobile or desktop interfaces, users interacting with the real world also need to understand how best to navigate it. We make choices based on time, personal preferences, and externalities that force us to decide which route we take. But what if those choices were also grounded in how much real-life support you could get along the way?

A view of the Tokyo subway system, with a yellow band running across the platform floor, made of slightly raised bumps warning blind people that the platform ends there
Those yellow bars at the bottom are strips of raised bumps that let blind people know it’s the end of the platform

In Japan, most subway platforms and vending machines are Braille-friendly. That does not necessarily mean they are always accessible, or that Japanese infrastructure as a whole is more accessible than anywhere else. But it’s worth imagining a world where we can tell blind and vision-impaired users how they might best navigate through accessible roads, minimizing occasions where they may stumble, get bumped into, or otherwise lose their orientation.
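
As a thought experiment, a router could fold that real-world support into the cost of each street segment. The sketch below is a toy illustration, not a real product feature: the graph format and the `tactilePaving` and `audioSignals` flags are invented for the example, and the algorithm is plain Dijkstra over distances inflated wherever that support is missing.

```typescript
// Toy sketch of accessibility-aware routing over a made-up street graph.

interface Segment {
  to: string;
  meters: number;
  tactilePaving: boolean; // raised guide strips, as on Tokyo platforms
  audioSignals: boolean;  // audible pedestrian crossings
}

type Graph = Map<string, Segment[]>;

// Cost function: distance, inflated when a segment lacks support
// that helps a blind pedestrian stay oriented.
function cost(seg: Segment): number {
  let penalty = 1;
  if (!seg.tactilePaving) penalty += 0.5;
  if (!seg.audioSignals) penalty += 0.5;
  return seg.meters * penalty;
}

// Plain Dijkstra over the weighted graph.
function accessibleRoute(graph: Graph, start: string, goal: string): string[] {
  const dist = new Map<string, number>([[start, 0]]);
  const prev = new Map<string, string>();
  const visited = new Set<string>();

  while (true) {
    // Pick the cheapest unvisited node.
    let current: string | undefined;
    let best = Infinity;
    for (const [node, d] of dist) {
      if (!visited.has(node) && d < best) {
        best = d;
        current = node;
      }
    }
    if (current === undefined) return []; // no path exists
    if (current === goal) break;
    visited.add(current);

    // Relax outgoing segments using the accessibility-weighted cost.
    for (const seg of graph.get(current) ?? []) {
      const candidate = best + cost(seg);
      if (candidate < (dist.get(seg.to) ?? Infinity)) {
        dist.set(seg.to, candidate);
        prev.set(seg.to, current);
      }
    }
  }

  // Walk back through the predecessor map to reconstruct the route.
  const path = [goal];
  while (path[0] !== start) path.unshift(prev.get(path[0])!);
  return path;
}
```

With data like this, the cheapest route for a blind pedestrian might be a few hundred meters longer but lined with tactile paving and audible crossings the whole way.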

We need to develop tools that are more than hacked-on band-aids on top of the beautiful, cutting-edge interaction design that most of the world can enjoy. Only around 4% of people may be blind, but they are the ones who can benefit most from a better way to navigate the world we live in.
