Posts tagged as innovation

Designed Life

Sunday October 8th, 2006

As we continue to surround ourselves with technology, we live an increasingly designed life - you could even call it life as a user experience. Come to think of it, through the products I’ve used in my lifetime, I’ve tapped into the creativity and knowledge of thousands of people I may never meet. For example, there are thousands of parts in my car, each one carefully designed by another human being. Imagine for a moment, if you can, the faces of the people who created the products you have used in your lifetime. Your computer, cell phone, TV, stereo, car, all the things in your house or apartment, even your home itself. It would be an army of people the size of which you wouldn’t believe.


How Attention Shapes Interaction Design

Wednesday August 23rd, 2006

This is a follow-up to my recent post about Resonating Technology, so once again I’ll use a mobile phone as the case study. This time around the focus is on how attention affects interaction design and the overall user experience.

To start off, let’s set a few goals that a well-designed user interface should support:

  • Don’t present the user with choices that make no sense.
  • The manual should be no longer than five lines of text - in fact, there shouldn’t even be a need for a manual.

OK, let’s deal with the problem of choice. To date, our toolbox for handling this problem has consisted primarily of two things: menus and buttons. But what if we were to do something completely different? What I’m proposing instead is something I’ll call the Intention-based User Interface, built on gesture recognition and the use of attention data.

Let’s take a look at the dictionary definition of a gesture:

  1. A motion of the limbs or body made to express or help express thought or to emphasize speech.
  2. The act of moving the limbs or body as an expression of thought or emphasis.
  3. An act or a remark made as a formality or as a sign of intention or attitude.

In short, when a user makes a specific gesture it’s a sign of intention and thought. The job of the phone and its user interface is to interpret the gesture and respond accordingly. To make this possible, the phone has to see the user and feel how it is being held - rotation, yaw, pitch, pressure and so on.
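As a minimal sketch of what such an interpreter might look like - the sensor fields, names and thresholds here are my own assumptions, not a description of any real phone API:

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    """Hypothetical readings a phone like this might expose."""
    camera_covered: bool        # lens is dark, e.g. lying face down on a table
    held_steady: bool           # low accelerometer variance while aimed
    near_ear: bool              # proximity sensing plus ear/face recognition
    top_right_pressure: float   # 0.0-1.0, from the touch-sensitive frame

def infer_intention(s: SensorSnapshot) -> str:
    """Map raw sensor readings to the user's most likely intention."""
    if s.near_ear:
        return "answer or place a call"
    if s.camera_covered:
        return "idle"                  # clearly not taking pictures right now
    if s.held_steady and s.top_right_pressure > 0.5:
        return "take a picture"
    if s.held_steady:
        return "framing a picture"     # aiming, but not yet pressing
    return "idle"
```

With an interpreter along those lines in place, let me give you a few examples of how this could work: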

  • Taking a picture - we use our eyes to take pictures, and point if we want someone to see what we see. So to take a picture with the phone, I aim it, hold it steady and apply pressure to the top right of its frame. If I need to zoom I just slide a finger along the left side of the phone. In contrast, if the phone is lying stationary on the table with the camera covered, that indicates the user is not currently interested in taking a picture.
  • Answering the phone - we all know the gesture for this - hold it close to one of your ears.
  • Calling someone - in addition to the touch-sensitive frame, the phone has a full-sized touch screen on each side. Underneath each semi-transparent screen there’s a wide-angle camera that’s used for taking pictures and performing gesture recognition. As you write on a blank area of the screen with your finger or the pen, the phone interprets your input and presents you with the options that match. For example, if I write “Martin” or “555 21”, it will display contacts that match, find notes containing that text and so forth (see the sketch after this list). I circle the contact item with Martin’s name, and hold the phone to my ear to make the call.
  • Writing notes and text messages - if I wanted to write a note or text message, I would simply continue writing in the blank area of the screen. As the text gets longer, the logical choice is to present options for saving the note or sending it as a text message to a contact.
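To make the “write anything, then choose” behavior from the list above concrete, here’s a rough sketch of how the matching could work. The contact names, the note text and the 40-character threshold are all invented for illustration:

```python
def matching_options(text: str, contacts: list[str], notes: list[str]) -> list[str]:
    """Offer only the choices that fit whatever the user has written so far."""
    query = text.lower()
    options = []
    for name in contacts:
        if query in name.lower():
            options += [f"Call {name}", f"Send text message to {name}"]
    for note in notes:
        if query in note.lower():
            options.append(f"Open note: {note}")
    if len(text) > 40:              # long input reads like a note or a message
        options += ["Save as note", "Send as text message"]
    return options

# Writing "Martin" might yield:
#   ["Call Martin Berg", "Send text message to Martin Berg",
#    "Open note: Lunch with Martin on Friday"]
print(matching_options("Martin",
                       contacts=["Martin Berg", "Anna Berg"],
                       notes=["Lunch with Martin on Friday"]))
```

Circling one of those options - or simply lifting the phone to your ear - completes the action.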

What we’ve done, instead of always presenting the user with a multitude of buttons and menus, is let users do what comes naturally, and then update the user interface to present the choices that make the most sense in that situation.

Where does this leave us with the second goal, no need for a manual? In pretty good shape, actually. My guess is that a five-minute demonstration in the store is all it takes, and you’re ready to go. For example, the salesclerk could tell you the following:

  1. Just aim it at something you want to take a picture of and press down.
  2. It responds to what you write - names, phone numbers, notes, you name it - and when you’re done, you select what you want to do.
  3. Hold it to your ear to answer a call.

There you have it - interaction design really does benefit from attention. We’ve successfully removed choices that don’t make sense to the user and made better use of basic human behavior and skill sets.

PS: Someone please make this phone.

Resonating Technology

Monday August 21st, 2006

I’m sitting here looking at my new cell phone. It’s more elegant than my previous phones: the battery lasts longer, it’s lighter, it has more colors and a better camera - more of everything, really. But does any of this actually make that much of a difference? The short answer, for me, is no.

Don’t get me wrong, I enjoy the longer battery life as much as the next guy, but when I really think about it, I use it pretty much the same way as my previous phones. Why is this? I recognize that I may be a creature of habit, but from a user’s point of view, the new phone works almost exactly like the old one. It still has digit keys for entering phone numbers I haven’t dialed before, the green button answers a call and the red one hangs up. It still rings when I’m at a restaurant, at the movies, or somewhere else it’s considered impolite to answer the phone. It still runs out of battery at the worst possible time, when I’ve had a lot on my mind and didn’t think to recharge it.

What’s missing from my new phone is something that won’t be fixed by continuing the current feature arms race or incrementally improving the specs by 10% with each new generation. What I need is a phone that pays attention to its user and its surroundings - the kind of phone that reacts intelligently to our behavior and any given situation to best serve the needs of its user. Although this kind of “intelligent and well-behaved” phone might sound far-fetched, let’s step back for a moment and think about it:

  • First of all the phone should know where it is. One way to do this is using Assisted GPS and mapping services based on that specific area or building.
  • It should be able to communicate with other nearby devices. Bluetooth and wireless networks to the rescue.
  • It should be aware of the environment. This includes sounds, noise-levels, vibrations, and temperature, all of which can be measured.
  • It shouldn’t run out of battery. Off the top of my head: body heat, kinetic energy from body movement, the sun. Anything without wires and an external charger would be a huge improvement.
  • The camera should recognize gestures. It’s complex but quite useful, so I’ll add it anyway.
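Bundled together, those measurements amount to a context model that the phone keeps up to date. Here’s a rough sketch of what that state might look like - the field names and types are my guesses, not any real API:

```python
from dataclasses import dataclass, field

@dataclass
class PhoneContext:
    """Hypothetical context state assembled from the sensors listed above."""
    location_type: str = "unknown"     # e.g. "cinema", "restaurant", "office"
    nearby_devices: list[str] = field(default_factory=list)  # via Bluetooth/Wi-Fi
    ambient_noise_db: float = 0.0      # measured sound level
    temperature_c: float = 20.0
    battery_pct: float = 100.0         # topped up by harvested energy
    user_in_view: bool = False         # from the gesture-recognizing camera
```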

What would life be like with this phone? Let me give you a few examples of what’s possible:

  • My phone now knows it shouldn’t ring when I’m at certain types of places, for example at the movies, in a restaurant or in a meeting (see the sketch after this list).
  • If I need to find a place or meet with someone it can provide relevant information.
  • When I’m in a noisy environment it can adjust the volume of the ring tone. If it’s lying on my desk and I’m sitting next to it, the ring tone will start at low volume.
  • It synchronizes with my PC when it’s within range, allows me to use the PC keyboard for text messages and lets me view photos on the phone using my PC monitor.
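As one concrete example, the first and third behaviors on that list boil down to a small decision rule. A sketch, reusing the PhoneContext state from above, with all place labels and volume numbers invented:

```python
QUIET_PLACES = {"cinema", "restaurant", "meeting room"}   # assumed labels

def ring_volume(ctx: PhoneContext) -> int:
    """Pick a ring volume from 0 (silent/vibrate) to 10, based on context."""
    if ctx.location_type in QUIET_PLACES:
        return 0                       # don't ring at the movies or over dinner
    if ctx.user_in_view:
        return 2                       # lying on the desk next to me: start low
    # elsewhere, scale the volume with the ambient noise so the ring cuts through
    return min(10, max(3, int(ctx.ambient_noise_db / 10)))
```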

Now this is a phone that would make a difference. But what would such a phone look like? For starters, it would have no buttons. But how would you answer it then? Well, by placing it close to your ear, of course.

The bottom line: it doesn’t matter all that much what the phone looks like, as long as it knows what my face - and specifically my ears - look like.