Sep 13 | Towards a more accessible map

The views represented here are my own, and not those of my employer.

I don’t think a single day goes by where I don’t use Google Maps, and not merely because I work for Google. And as much as championing accessibility is something I wish I did more often, it’s not something I think about on a day-to-day basis. But a confluence of events — including hearing about the 20/20/20 project to provide free surgery to those suffering from cataracts — prompted me to explore the area in depth. An important note: since I am not blind or vision-impaired, the tools I use and the way I use them might differ significantly from how actual users work, so the best next step beyond this exploration is to work directly with blind and vision-impaired users.

The main task I gave myself was to find directions from my alma mater, Swarthmore College, to New York City. While I assess the individual merits and weaknesses of each platform, I also dive into some potential opportunities beyond our current solutions.

Desktop

A screenshot of the Google Accessibility Guide that mentions VoiceOver is a good screenreader for Macs

On my MacBook Air, I followed Google’s Accessibility Guide, enabled VoiceOver, and tried to navigate Google Maps by going to maps.google.com in Chrome. There are some initiatives through OpenStreetMap to enable blind-friendly maps, but I couldn’t get access to them, nor did I have ready access to the screenreaders a blind person may already be using, like JAWS. The basic premise of VoiceOver is navigating the screen real estate using the keyboard — in my case, mainly the Tab key and the Control + Option keys. I didn’t change any of the advanced settings I knew some of my blind friends used (I know, for example, that the typical blind user will set the speech speed high — almost indecipherably high to my ears).

The first hiccup I noticed is that the Tab key wouldn’t get me anywhere, nor would the arrow keys bring me to any other section of the UI. I was thoroughly confused later when, tabbing through the UI, I heard mention of the Google Apps menu and the G+ profile section. Once in the detailed directions flow, I was given a flurry of steps, but VoiceOver picked up only the mileage of each step, completely ignoring the actual instructions. The layout of the site also meant VoiceOver spent a lot of time reminding me that the data was copyrighted to Google.
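
Much of this looks fixable with standard ARIA semantics on the web side. A minimal sketch of what I mean, with a hypothetical DirectionStep shape: render the steps as a real ordered list so each one is announced with its position, put the instruction before the distance, and hide the copyright boilerplate from the accessibility tree:

```typescript
// Hypothetical data shape: each step pairs an instruction with a distance.
interface DirectionStep {
  instruction: string; // e.g. "Turn left onto Chester Rd"
  distance: string;    // e.g. "0.4 mi"
}

// A semantic ordered list lets a screenreader announce "item 3 of 12"
// style context, and reads the instruction before the distance instead
// of the distance alone.
function renderSteps(steps: DirectionStep[], container: HTMLElement): void {
  const list = document.createElement("ol");
  list.setAttribute("aria-label", "Driving directions");

  for (const step of steps) {
    const item = document.createElement("li");
    // Instruction first; the distance is supplementary.
    item.textContent = `${step.instruction}, ${step.distance}`;
    list.appendChild(item);
  }

  container.appendChild(list);
}

// Boilerplate like copyright notices can be hidden from the accessibility
// tree so VoiceOver doesn't repeat it on every pass through the page.
function hideFromScreenreaders(el: HTMLElement): void {
  el.setAttribute("aria-hidden", "true");
}
```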

I then followed up with a check of the desktop version of Apple Maps, which has been a bundled app since the Mavericks release. Here, the integration with the desktop UI makes it much easier to navigate the map itself, as well as to access directions and step through them one by one. POIs were also called out in the UI, which is useful for knowing what’s available to search for in the area.

At this point, if I were trying to navigate my way through an unfamiliar city (for example, by trying to find the nearest grocery store), I would probably save the directions onto my phone as a note and use that to remember how to get there.

iOS

On an iPad running iOS 7.1.2, I tested the latest versions of both Google Maps (v3.2.0) and Apple Maps.

A screenshot of the VoiceOver panel on iPad

Perhaps not surprisingly, VoiceOver integration on the iPad was much better with Apple Maps. Google Maps would not return any additional information to VoiceOver when tapping on the map, which meant you either had to already know which UI elements to engage with, or you were stuck. Apple Maps, by contrast, would highlight a tapped area with a dark bar, or speak the actual location.

A view of the iPad Apple Maps where there's a black bar across a highlighted section of the map
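
On the web, the analogous technique is an ARIA live region: a visually hidden element whose text updates are spoken by the screenreader. A minimal sketch of how a web map could announce tapped locations, assuming a hypothetical reverseGeocode lookup and a visually-hidden utility class:

```typescript
// A polite live region: screenreaders speak whatever text lands here,
// without moving focus away from the map.
const announcer = document.createElement("div");
announcer.setAttribute("aria-live", "polite");
announcer.className = "visually-hidden"; // assumed CSS class that hides it visually only
document.body.appendChild(announcer);

// Hypothetical stand-in for whatever place lookup the map offers.
declare function reverseGeocode(x: number, y: number): Promise<string | null>;

const mapEl = document.getElementById("map");
mapEl?.addEventListener("click", async (e: MouseEvent) => {
  const place = await reverseGeocode(e.offsetX, e.offsetY);
  // Setting the live region's text triggers the announcement,
  // mirroring how Apple Maps speaks the location you tapped.
  announcer.textContent = place ?? "No place information here";
});
```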

Worth noting is the “Rotor” feature, where you rotate two fingers on the screen, as if turning a dial, to flip through a set of on-screen categories, like headings or points of interest, or to toggle between the voice reading by word or by line. The categories are context sensitive: after selecting the right option in the Apple Maps app, quick upward swipes let you step through the POIs available on the map. This feature is not available in the Google Maps app.

A screenshot of the iPad Apple Maps app with the rotor dial on top, with the text 'Containers'

A screenshot of the iPad Apple Maps app with the rotor dial on top, with the text 'Point of Interest'
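
There is a rough web analogue to the rotor: VoiceOver’s web rotor can enumerate headings, links, buttons, and landmarks, so if each POI were a real, labelled element overlaid on the map rather than pixels on a canvas, users could step through them the same way. A sketch, with a hypothetical Poi shape:

```typescript
// Overlaying focusable, labelled buttons on a map gives screenreader
// rotors something to enumerate, instead of an opaque canvas.
interface Poi {
  name: string;
  x: number; // pixel position within the map viewport
  y: number;
}

function overlayPois(pois: Poi[], mapEl: HTMLElement): void {
  // role="region" plus a label makes this a navigable landmark.
  const region = document.createElement("div");
  region.setAttribute("role", "region");
  region.setAttribute("aria-label", "Points of interest on the map");

  for (const poi of pois) {
    const btn = document.createElement("button");
    btn.textContent = poi.name; // the accessible name the rotor reads out
    btn.style.position = "absolute";
    btn.style.left = `${poi.x}px`;
    btn.style.top = `${poi.y}px`;
    region.appendChild(btn);
  }

  mapEl.appendChild(region);
}
```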

Android

Similarly to iOS, Android has a feature called TalkBack that adds spoken feedback to most UI components. Most of the usual single-tap interactions require two taps instead of one, and a swipe takes three fingers instead of one. I tested this on my Nexus 4, running Android KitKat.

Just using Google Now

Myriad things were broken here, starting with the fact that Google Now interpreted TalkBack’s spoken “Tap to speak button” prompt as an actual voice query. The Google Now card was not tappable in any way, so it was completely useless.

Just using Google Maps

Here, I dive into the Google Maps app and try to complete my task. At one point I get results back for directions — “Showing items 2 of 2” — but I’m not told what was returned, so I randomly tap around.

Using speech on Google Maps

Towards the tail end of the above video you’ll note that I was getting a bit frustrated, and as such decided to resort to speech. I did a more thorough run-through in the video below.

In Android, tapped UI elements are highlighted in yellow. On iOS, they are bounded in a black frame.
A screenshot from an Android phone, with a yellow highlighted box near the compass, which is shown when TalkBack is activated

Opportunities at a meta level

The key learnings:

  • Desktop UIs are much more discoverable than mobile ones, but if they don’t tell you the whole picture, you miss out on key features (whereas you may stumble upon them on a mobile UI, since screen real estate is still pretty limited).
  • POIs are really important for framing the map view. When there’s no way to know what POIs are available on the map, you have to search for them and get directions to each one, which is both time-consuming and difficult to understand spatially.
  • iOS in general is still ahead of Android when it comes to user-friendly accessibility tools; interactions with TalkBack were unpredictable and unreliable.
  • Apple Maps in general had the benefit of a tighter VoiceOver integration compared to Google Maps.
  • Google Maps on maps.google.com is better than the mobile app, but the ordering of UI elements means some things get repeated often or in odd ways (like the number of miles to travel in each step of the navigation).
  • At no time in the desktop experience was I told what to expect on screen (like where the map view was relative to the text). Once in the detailed view, I had no idea how to escape it.
  • The desktop map announces “zoom” for both zooming in and zooming out; see the sketch after this list for a one-line fix.
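
On the web at least, that last point is a tiny fix: give each control its own accessible name. A minimal sketch, with hypothetical selectors for the two zoom buttons:

```typescript
// Give each zoom control a distinct accessible name so a screenreader
// doesn't announce "zoom" identically for both directions.
// The ".zoom-in" / ".zoom-out" selectors are hypothetical stand-ins
// for however the real page identifies its controls.
const zoomIn = document.querySelector<HTMLButtonElement>(".zoom-in");
const zoomOut = document.querySelector<HTMLButtonElement>(".zoom-out");

zoomIn?.setAttribute("aria-label", "Zoom in");
zoomOut?.setAttribute("aria-label", "Zoom out");
```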

One point you may raise is that the current solutions and implementations would hardly be helpful even after resolving some of the glaring accessibility problems inherent in Google Maps. Maps are for wayfinding and orienting, and a visually-rich UI doesn’t help someone who needs to know where they are and which direction to head. Google Maps’s navigation tools would help, but not if a user can’t get to them. And our current UIs hardly help if a user can’t tell which direction is north, much less which direction is the right one to take.

Beyond the mobile or desktop interfaces, users interacting with the real world also need to understand how best to navigate it. We make choices about which route to take based on time, personal preferences, and externalities. But what if those choices were also grounded in how much real-life support you could get along the way?

A view of the Tokyo subway system, with a yellow band running across the platform floor, its slightly raised bumps warning blind people that the platform ends there
Those yellow bars at the bottom are raised bumps that let blind people know it’s the edge of the platform

In Japan, most subway platforms and vending machines are Braille-friendly. That does not necessarily mean they are always accessible, or that Japanese infrastructure as a whole is more accessible than anywhere else. But it’s worth imagining a world where we can tell blind and vision-impaired users how best to navigate via accessible roads, minimizing the occasions where they may stumble, get bumped into, or otherwise lose their orientation.

We need to develop tools that are more than band-aids hacked on top of the beautiful, cutting-edge interaction design that most of the world gets to enjoy. Perhaps only 4% of people are blind, but they are the ones who stand to benefit most from a better way to navigate the world we live in.
