On Tooltips and Affordances

I just got a new smartphone - a T-Mobile Wing, in fact, and I like it a lot.  I've never used Windows Mobile for any extended length of time, though, so I'm still learning a few things.  This morning, while trying to figure out what a button did, I caught myself doing something astounding, and I gained a whole new appreciation of affordances.

This phone, if you're not familiar, is a touch-screen smartphone with a slide-out keyboard, so if I'm doing anything remotely complicated, I'm usually using a stylus to point at the screen.  This is sort of interesting all by itself, because the stylus acts as an interface metaphor for a mouse, which is itself, in many ways, an interface metaphor for a finger.  It's no wonder parts of the UI are screwed up!

So I was looking at the Calendar - a screen I'd used a few dozen times, and I wanted to move to the next week.  I knew I could go to the menu to do this, but I thought perhaps there was an easier way to get there (I've been finding all sorts of those while learning to use this phone).  There was a little button on a little toolbar, and I didn't know what the button did.  So I took my stylus and held it still, poised a few millimeters above the touch-sensitive screen.

I was waiting, obviously, for the tooltip that never came.  I expected the button (especially a toolbar button) to have a tooltip.  This is an affordance of toolbar buttons, and my misguided gesture was a failed attempt to exercise it.
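
For contrast, here's a minimal sketch of how that affordance gets wired up in a desktop toolkit - Java Swing, purely for illustration; this isn't the phone's code, and the class name, button label, and tooltip text are all made up.  The point is that the hint hangs entirely off the hover state, which a stylus on a resistive touch screen never produces: the screen only knows about the stylus once it's already touching.

    import javax.swing.*;

    public class ToolbarTooltipDemo {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Calendar");
                JToolBar toolbar = new JToolBar();

                // On the desktop, the hint is bound to hover: park the
                // pointer over the button and the tooltip appears.
                JButton nextWeek = new JButton(">>");
                nextWeek.setToolTipText("Go to next week");
                toolbar.add(nextWeek);

                frame.add(toolbar, java.awt.BorderLayout.NORTH);
                frame.setSize(320, 240);
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }

No hover, no hint - which is exactly the hole my stylus gesture fell into.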

There are several interesting observations to be had here, beyond the obvious, "Dave's a moron" one.  Here are a few that sprang to mind:

  • In the week or so I've had this device, I've adopted the stylus as a seamless analog to a mouse, to the extent that I've stopped distinguishing the things one can do from the things the other can't.
  • As users gain experience and UI metaphors grow in penetration, those metaphors become UI "primitives" in their own right.  If I hadn't been using toolbars with tooltips on their buttons for the last (mumble, mumble) years, I would never have been conditioned to look for them on a new interface.
  • The smartphone (and the Windows smartphone in particular) is a horribly difficult device to support as a developer.  I've been quite aware as I've used this interface that some of the actions I perform with a stylus would be really difficult to support well on a phone without a touch-sensitive screen, and yet you see a lot of applications try to support both platforms with one version of software - a pretty tall order if your interface is non-trivial.  The whole portrait-to-landscape pivot every time I pop out my keyboard would be enough all by itself to make this difficult.
  • Given the above, I think you have to tip your hat once again to Apple for the design of the iPhone.  Granted, their tight control of the whole hardware-software platform is the only way that this is possible, but I think it's still great execution.  This format is *not* easy.

Finally, from a design standpoint, this underscored for me the need to obtain real-world usability information about a system.  I'm no Alan Cooper, but I'm better than your average bear at UI design, and I never would have seen this one coming if I hadn't done it myself.  It's just too hard to relax all of your preconceptions and pre-loaded context when you're designing a user interface.  Real-world usage of a system will almost always turn up a few gems like this.  If you don't know about them, it doesn't mean they're not happening - it means you need to get your head out of the sand and pay attention to your users.
