Accessibility: Putting Yourself in the User’s Place

I once worked at a building where, oddly, the handicapped parking spaces were not in the row closest to the building, but rather in a long strip perpendicular to the building (simplified picture below). The row closest to the building was partly visitor parking spaces, and partly open spaces.

[Image: simplified diagram of the parking lot layout]

As an employee I didn’t give it much thought, and frankly, I was happy on the rare occasions that I got one of the coveted spots close to the door.

That was, until the day my father and I attended a Saturday seminar at the building. My father is in his 80s and walks with a cane. Getting around is slow and physically taxing for him. He had his handicap placard, and we looked for a handicap parking space. But there were none close by; in fact, some of the non-handicap spaces, though not close, were actually closer than some of the handicap spaces!

Seeing the experience through my father’s eyes put a whole new spin on it.

I tried to imagine the reasoning of the people who designed the parking lot. Did they justify their choices because handicap spaces near a building’s entrance are often empty? Did they think the handicap places were “close enough”? One thing was clear to me: I doubt they had ever accompanied a handicapped person as they tried to find a close parking place on a busy day.

I came away with some much-needed reminders:

  1. Test designs with differently-abled users, and strive for win-win design.
  2. Frequency of use is not synonymous with importance. Even if my father only needed that parking place for a few hours on a Saturday morning, that need was very important for him.
  3. We can’t excuse ourselves for failing to address accessibility.
  4. Shortcuts and assumptions compromise user experience.

There’s no substitute for understanding one’s users, and that understanding means putting ourselves in their place.

Design Bloopers #2: Tablet Power Management

I love my Evo tablet—it has revolutionized the way I manage my life. Like most tablet owners, I try to maximize battery life. One of the main ways I do so is by keeping the screen on its dimmest setting when I’m indoors.

But my tablet has an odd quirk: when it alerts me that the battery level has reached critical, it simultaneously adjusts the screen brightness to the highest setting! At the very point when my tablet most needs to conserve battery power, the tablet changes my setting to drain the battery even more quickly.

Was this intentional? I doubt it. I can’t imagine a developer coding a battery alert to include a power drain. So it was probably caused by an unexpected line of code somewhere. But once again, it points to the importance of testing designs in realistic scenarios to make sure they work as expected and are free of glitches. User testing adds to the cost of the project, but failing to do user testing costs even more.
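To make that speculation concrete, here is a minimal, purely hypothetical sketch (in Python, with invented names; it is certainly not the tablet’s actual code) of how that kind of coupling can slip in unnoticed:

```python
# Purely hypothetical illustration -- not the tablet's actual code.
# The point: a well-meaning "make the alert easy to see" step quietly
# overrides the user's power-saving choice at the worst possible moment.

MAX_BRIGHTNESS = 255

class Screen:
    def __init__(self, brightness: int = 25):   # user keeps it dim indoors
        self.brightness = brightness

    def set_brightness(self, level: int) -> None:
        self.brightness = level

def show_alert(screen: Screen, message: str) -> None:
    # Added so that alerts are always visible...
    screen.set_brightness(MAX_BRIGHTNESS)        # ...but it also drains the battery
    print(message)

def on_battery_level(screen: Screen, percent: int) -> None:
    if percent <= 5:
        # The critical-battery path reuses the generic alert helper,
        # so the brightness side effect comes along for free.
        show_alert(screen, "Battery critically low!")

screen = Screen()
on_battery_level(screen, 4)
print(screen.brightness)   # 255: the dim setting is gone exactly when it matters most
```

Running the device through a realistic scenario (screen dimmed, battery actually allowed to drain to critical) is exactly the kind of test that would have caught this.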

The Value of Realistic Experience, or Lessons Learned from Herbal Tea

Insomnia is a frequent challenge for me, so I was interested when a friend told me about one of those herbal teas that are supposed to help you relax and fall asleep more easily. Hoping for the best, I drank a cup a little before bedtime—only to be awakened some time later with a bladder far fuller than a small cup of tea should have produced.

I decided to do a little checking on the ingredients. As expected, I found herbs known to promote relaxation and sleep. Then I found something I definitely wouldn’t have expected. One of the ingredients was a diuretic. Sleep promoter and diuretic: not a good combination!

Did the people who created this herbal tea never try drinking it themselves? No, I’m sure they did—but probably during the daytime, and probably only in small spoonfuls. In other words, they tested their design for a new herbal tea in unrealistic circumstances, circumstances that did not reflect their users’ reality. In doing so, they missed a major problem—a deal breaker, in fact. (I never bought that tea again!)

A design may look good on paper or on the screen. Users may even tell you it would work without having tried it in a realistic scenario. But you never know whether a design really works until you’ve proven it in the crucible of realistic experience.

“You can observe a lot just by watching”

As Yogi Berra famously commented, “You can observe a lot just by watching.” Obvious? Sure. Easily ignored? That too, unless we make it a point to observe.

To illustrate, here are three of my favorite user experience examples:

Example 1

For the past few months, I’ve been enjoying an online course in Human-Computer Interaction offered by Coursera and taught by Stanford Associate Professor Scott Klemmer. In one session, he talked about Expedia’s experience with a problematic web page design. Their web analytics showed that a significant portion of users would click the Buy Now button, obviously intending to make a purchase, but then wouldn’t go through with it.

They were puzzled until they focused on the user experience. A confusing design led users to put their bank information in the wrong field—and then, of course, the transaction failed. Once the problem was fixed, Expedia calculated that they realized an additional $12 million in profit that year.

Example 2

With identity theft prevalent in cyberspace, password security is getting a lot of attention. You’ve probably been to sites that measure the strength of your password as you’re setting up a new account. You’d think that would be motivating to most people… but Associate Professor Anthony Vance of BYU conducted research that showed it wasn’t; in fact, it had no more effect on password strength than static text.

So what does motivate users to create stronger passwords? Paying attention to user behavior revealed an interesting insight: the most effective motivator was an interactive interface that evaluates the password and shows roughly how long it would take a hacker to crack it. (The research is described here; Professor Vance’s findings were presented at an IT conference I attended and, to my knowledge, haven’t been published online.)
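To give a rough idea of the kind of feedback such an interface provides, here is a small illustrative sketch of my own (not taken from the research, and using an assumed attacker speed) that estimates brute-force crack time from a password’s length and character classes:

```python
import string

# Rough, illustrative brute-force estimate. The guesses-per-second figure
# is an assumption (a well-resourced offline attacker), not a measured value.
GUESSES_PER_SECOND = 1e10

def charset_size(password: str) -> int:
    """Size of the smallest standard character set covering the password."""
    size = 0
    if any(c in string.ascii_lowercase for c in password):
        size += 26
    if any(c in string.ascii_uppercase for c in password):
        size += 26
    if any(c in string.digits for c in password):
        size += 10
    if any(c in string.punctuation for c in password):
        size += len(string.punctuation)
    return size or 1

def crack_time_seconds(password: str) -> float:
    """Average time to search half the key space at the assumed guess rate."""
    key_space = charset_size(password) ** len(password)
    return key_space / (2 * GUESSES_PER_SECOND)

def describe(seconds: float) -> str:
    """Turn a raw number of seconds into a human-readable rough estimate."""
    for unit, length in [("years", 31_536_000), ("days", 86_400),
                         ("hours", 3_600), ("minutes", 60)]:
        if seconds >= length:
            return f"about {seconds / length:,.0f} {unit}"
    return "less than a minute"

if __name__ == "__main__":
    for pw in ["password", "P@ssw0rd!", "correct horse battery staple"]:
        print(f"{pw!r}: {describe(crack_time_seconds(pw))}")
```

The exact numbers are not the point (real-world meters use far more sophisticated models); the point is that a concrete estimate like “days” or “centuries” gives users something tangible to react to, which is much closer to what the research found effective than a static strength bar.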

Example 3

Paul Howe and his colleagues came up with what they thought was a really cool idea: allowing people making online purchases to tell their friends via social media. So they created a realistic mockup and did user testing. Were they ever glad they did: most of their users hated it. After minimal time and expenditure, they abandoned the idea.

Several of their competitors had the same idea, which they apparently developed without actually testing it with users. Some months and a boatload of money later, they also conceded that it wasn’t such a good idea after all. Paul estimated that his team’s user testing probably saved them nine months of work and around $2 million.

In each of these examples, paying attention to users revealed behavior that was surprising to the designers, and which they wouldn’t have known about otherwise. Yogi Berra was right—you can observe a lot just by watching! The trick is actually doing it.