How Attention Shapes Interaction Design

Wednesday August 23rd, 2006

This is a follow-up to my recent post about Resonating Technology, so again I’ll use a mobile phone as a case study. This time around the focus is on how attention affects interaction design and the overall user experience.

To start off, let’s set a few goals that will support a well-designed user interface:

  • Don’t present the user with choices that make no sense.
  • The manual should be no longer than five lines of text - in fact, there shouldn’t even be a need for a manual.

OK, let’s deal with the problem of choice. To date, our toolbox for handling this problem has consisted primarily of two things: menus and buttons. But what if we were to do something completely different? What I’m proposing instead is something I’ll call the Intention-based User Interface, which is based on gesture recognition and the use of attention data.

Let’s take a look at the dictionary definition of a gesture:

  1. A motion of the limbs or body made to express or help express thought or to emphasize speech.
  2. The act of moving the limbs or body as an expression of thought or emphasis.
  3. An act or a remark made as a formality or as a sign of intention or attitude.

In short, when a user makes a specific gesture it’s a sign of intention and thought. The job of the phone and the user interface is to interpret the gesture and respond accordingly. To make this possible the phone has to see the user and sense how the user is holding it - including rotation, yaw, pitch, pressure and so on. Let me give you a few examples of how this could work:

  • Taking a picture - we use our eyes to take pictures, and point if we want someone to see what we see. So to take a picture with the phone, I aim it, hold it steady and apply pressure on the top right of its frame. If I need to zoom, I just slide one of my fingers across the left side of the phone. In contrast, if the phone is lying stationary on the table with the camera covered, that indicates the user is not currently interested in taking a picture.
  • Answering the phone - we all know the gesture for this - hold it close to one of your ears.
  • Calling someone - other than the touch-sensitive frame of the phone, it has a full-sized touch-screen on each side. Underneath each semi-transparent screen there’s a wide-angle camera that’s used for taking pictures and performing gesture recognition. As you write on a blank area of the screen with your finger or the pen, the phone will interpret your input and present you with the options that match. For example, if I write “Martin” or “555 21”, it will display contacts that match, find notes containing that word, and so forth. I circle the contact item with Martin’s name, and hold the phone to my ear to make the call.
  • Writing notes and text messages - if I wanted to write a note or a text message, I would simply continue writing in the blank area of the screen. As the text gets longer, the logical choice is to present options for saving it as a note, or sending it as a text message to a contact.

What we’ve done is this: instead of always presenting the user with a multitude of buttons and menus, we let users do what comes naturally, and then update the user interface to present the choices that make the most sense in that situation.
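
To make the idea concrete, here’s a toy sketch of such an intention-based dispatcher in Python: sensor readings go in, the most plausible intent comes out. Every sensor name and threshold here is hypothetical - the post describes an idea, not an API - so treat this as an illustration of the dispatch logic only.

```python
def infer_intent(reading):
    """Map a dict of (hypothetical) sensor readings to a likely user intent."""
    if reading.get("held_to_ear") and reading.get("incoming_call"):
        return "answer_call"
    if reading.get("aimed_steady") and reading.get("top_right_pressure", 0) > 0.5:
        return "take_picture"
    if reading.get("handwriting"):
        # Digits suggest a phone number; letters suggest a contact name or note.
        text = reading["handwriting"].strip().replace(" ", "")
        return "match_number" if text.isdigit() else "match_text"
    # No signals, e.g. the phone lying face-down on the table.
    return "idle"

print(infer_intent({"held_to_ear": True, "incoming_call": True}))  # answer_call
print(infer_intent({"handwriting": "555 21"}))                     # match_number
print(infer_intent({"handwriting": "Martin"}))                     # match_text
```

The interesting part isn’t the rules themselves but the shape of the interface: one function from context to intent, with the UI only offering choices once an intent has been guessed.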

Where does this leave us with the second goal of no need for a manual? In pretty good shape, actually. My guess is five minutes of demonstration in the store, and you’re ready to go. For example, the salesclerk could tell you the following:

  1. Just aim it at something you want to take a picture of and press down.
  2. It responds to what you write - names, phone numbers, notes, you name it - and when you’re done, select what you want to do.
  3. Hold it to your ear to answer a call.

There you have it - interaction design really does benefit from attention. We’ve successfully removed choices that don’t make sense to the user and made better use of basic human behavior and skill sets.

PS: Someone please make this phone.

Resonating Technology

Monday August 21st, 2006

I’m sitting here looking at my new cell phone. It’s more elegant than my previous phones, the battery lasts longer, it’s lighter and has more colors, a better camera, more of everything really. But does any of this really make that much of a difference? The short answer for me is no.

Don’t get me wrong, I enjoy the longer battery life as much as the next guy, but when I really think about it, I use it pretty much the same way as my previous phones. Why is this? I recognize that I may be a creature of habit, but from a user’s point of view, the new phone works almost exactly like the old one. It still has digit keys for entering phone numbers I haven’t used before, the green button still answers a call and the red one still hangs up. It still rings when I’m at a restaurant, at the movies or somewhere else it’s considered impolite to answer the phone. It still runs out of battery at the worst possible time, when I’ve had a lot on my mind and didn’t think to recharge it.

What’s missing from my new phone is something that won’t be fixed by continuing the current feature arms race or by incrementally improving the specs by 10% with each new generation. What I need is a phone that pays attention to its user and its surroundings. The kind of phone that reacts intelligently to our behavior in any given situation to best serve the needs of its user. Although this kind of “intelligent and well-behaved” phone might sound far-fetched, let’s step back for a moment and think about it:

  • First of all, the phone should know where it is. One way to do this is using Assisted GPS combined with mapping services for that specific area or building.
  • It should be able to communicate with other nearby devices. Bluetooth and wireless networks to the rescue.
  • It should be aware of the environment. This includes sounds, noise-levels, vibrations, and temperature, all of which can be measured.
  • It shouldn’t run out of battery. Off the top of my head: body heat, kinetic energy from body movement, the sun. Anything without wires and an external charger would be a huge improvement.
  • The camera should recognize gestures. It’s complex but quite useful, so I’ll add it anyway.

What would life be like with this phone? Let me give you a few examples of what’s possible:

  • My phone now knows it shouldn’t ring when I’m at certain types of places, for example at the movies, in a restaurant or in a meeting.
  • If I need to find a place or meet with someone it can provide relevant information.
  • When I’m in a noisy environment it can adjust the volume of the ring tone. If it’s lying on my desk and I’m sitting next to it, the ring tone will start at low volume.
  • It synchronizes with my PC when it’s within range, allows me to use the PC keyboard for text messages and lets me view photos on the phone using my PC monitor.
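
The volume example in particular is simple enough to sketch. Here’s one purely illustrative way the mapping could work; the decibel thresholds are made-up round numbers, not calibrated values:

```python
def ring_volume(ambient_db, min_vol=0.1, max_vol=1.0):
    """Linearly map ambient noise level (roughly 40 dB for a quiet office,
    80 dB for a busy street) to a ring volume between min_vol and max_vol."""
    quiet, loud = 40.0, 80.0
    frac = (ambient_db - quiet) / (loud - quiet)
    frac = max(0.0, min(1.0, frac))  # clamp readings outside the range
    return min_vol + frac * (max_vol - min_vol)

print(ring_volume(35))  # lying on a quiet desk next to me: minimum volume
print(ring_volume(80))  # noisy street: full volume
```

A real phone would presumably smooth the microphone readings over time rather than react to a single sample, but the principle is the same: the environment is just another input.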

Now this is a phone that would make a difference. But what would such a phone look like? For starters, it would have no buttons. But how would you answer it then? By placing it close to your ear, of course.

The bottom line: it doesn’t matter all that much what the phone looks like, as long as it knows what my face, and specifically my ears, look like.

What is Attention Data?

Saturday August 19th, 2006

To answer this question, let’s start with attention. Simply put, attention is time spent interacting with someone or something. That sounds pretty abstract, I know, but that’s kind of the idea. Let me give you a few examples of attention:

  • Talking to someone
  • Reading a book, listening to music or watching a movie
  • Taking a picture

Not that many years ago we did these activities by sitting or standing across from each other, by holding a piece of dead tree, and by means of analog film that we would have to get developed.

Today, we can do pretty much the same things using video conferencing, webcams, e-books, iPods and digital cameras, to name but a few. The big difference is that when you go digital, it’s much easier - and very valuable - to keep track of what you pay attention to. This is, in essence, what attention data is all about:

Attention data is a digital record that describes the
time spent interacting with someone or something.

This means that attention data can pretty much cover anything we do that has some kind of digital footprint, including:

  • books - that I bought, read, recommended or wished for
  • movies and videos - that I saw at the cinema, own on DVD, streamed online etc.
  • music - CDs that I’ve bought and listened to, my playlists, radio etc.
  • games - that I own and play online
  • photos - that I took and how I browse my photo collection
  • websites and blogs - that I read, my bookmarks, blogrolls, RSS feeds and OPML files
  • events and places - calendar entries for concerts, meetups, vacations etc.
  • people in my social network - communities that I participate in, my contacts and friends lists, my subscriptions etc.
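
The definition above maps naturally onto a very small data structure: who or what received the attention, and the time spent. The field names below are illustrative, not any real attention-data format:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class AttentionRecord:
    subject: str   # what received attention: a song, a book, a contact...
    kind: str      # one of the categories above: "music", "book", "person", ...
    seconds: float # time spent interacting
    when: datetime = field(default_factory=datetime.now)

# A tiny attention log: a blog post read for 3 minutes, a 7-minute call.
log = [
    AttentionRecord("Resonating Technology", "blog post", 180.0),
    AttentionRecord("Martin", "person", 420.0),
]
total = sum(r.seconds for r in log)
print(total)  # 600.0 seconds of recorded attention
```

Everything in the list above - books, movies, playlists, bookmarks, contacts - reduces to streams of records like these, which is exactly why they are so easy to aggregate and so sensitive.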

As you probably noticed, keeping records of this kind of data has major implications for how we consume and discover products, for our social interactions online and, last but not least, for identity and privacy.

Using Attention to Create Value

How is value created using attention data? As I see it, there are at least six different flavors:

  • Personal value - improved user experience and productivity
  • Network value - use your network to learn and discover, build relationships and strengthen friendships
  • Enterprise value - optimize the use of available resources within the enterprise including people and their competencies
  • Community value - expand your social network and discover outside the limits of your existing social network
  • Consumer value - better customer service through personalization and recommendations
  • Global value - smarter search and better relevance in online search

As mentioned earlier in this post, attention data has a lot of implications that I will continue to write about - including a detailed look at the different kinds of value that attention data creates. In the meantime, your comments and thoughts are very much welcome. I’ll leave you with this to consider:

How are you creating value from your attention data?