Archive for August, 2006

The Attention Life Cycle

Wednesday August 30th, 2006

They say a picture is worth a thousand words, so I’ve created a quick sketch of how I see the life cycle of attention. Click the picture for the full-sized version.

The Attention Life Cycle

More details to follow. I’m sure there are many other ways to represent this, so feel free to voice your opinion.

It’s your attention - use it!

Sunday August 27th, 2006

One of the interesting aspects of dealing with attention, or specifically the recording of it as attention data, is the immediate and often strong reaction that people have towards it. I’ve seen reactions ranging from “wow - there’s so much potential in this” to “no way - it’s a huge invasion of privacy”. Both are natural human responses - we all have an inbuilt fear of the unknown, but at the same time we are both curious and adventurous.

Our privacy, or the lack thereof, is at the center of a raging battle that affects almost every aspect of our lives. Scott McNealy from Sun once said “You have zero privacy anyway - Get over it”, which sparked a huge debate. In some ways this is not far from the truth, seeing as experts estimate that information about the average working adult in the UK is stored on 700 databases. Interestingly, at the same time we’re seeing services such as MySpace and YouTube rake in millions of users looking to express themselves and who they are. In some strange way it seems that the majority of young Americans don’t want privacy - they want attention.

The fact is, as we go about our daily lives we’re leaving an abundance of footprints in the sands of this huge digital sandbox we call the Internet. Most of us let these grains of information slip right through our fingers. Why? For one, most users don’t realize the amount of information they leave behind in the webserver logs of the sites they visit. At the same time, the majority of users have not yet been given the incentive or strong enough reasons to collect and save their different kinds of attention data.

One way of looking at it is this: many companies and organizations already know a lot about you - why shouldn’t you know the same things, and hopefully more, about yourself? Plus, it’s not like you have to do any extra work: with the right tools, doing what you normally do is all that’s required to collect your attention data. For example, last.fm lets you keep your own fully-automated music journal and helps you discover new music simply by listening to music as you’ve always done. Last.fm shows us just a glimpse of the kind of value that can be found in your attention data. Some might think “I already know what kind of music I listen to”, but in my opinion we’re already way past the point where we can remember or keep track of the things we consume digitally - even on a daily basis.
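
To make the “no extra work” point concrete, here is a minimal, purely hypothetical sketch of what an automatically kept listening journal could look like. This is not last.fm’s actual submission protocol; the function name and file name are made up.

    import json
    import time

    LOG_FILE = "listening_journal.jsonl"  # hypothetical local journal file

    def record_listen(artist, track, duration_seconds):
        # Append one listening event as the track finishes playing.
        # The media player would call this automatically - no extra work for the listener.
        event = {
            "artist": artist,
            "track": track,
            "duration_seconds": duration_seconds,
            "listened_at": int(time.time()),
        }
        with open(LOG_FILE, "a") as journal:
            journal.write(json.dumps(event) + "\n")

    record_listen("Radiohead", "Idioteque", 309)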

In a way you can think of your own attention data as an extension of your mind and your memory. And who among us wouldn’t want a sharper memory? We might discover surprising and important things about ourselves.

Attention Brings Service Online

Scientists from the RAND Corporation have created this model to illustrate what a “home computer” could look like in the year 2004. However the needed technology will not be economically feasible for the average home. Also the scientists readily admit that the computer will require not yet invented technology to actually work, but 50 years from now scientific progress is expected to solve these problems. With the teletype interface and the Fortran language, the computer will be easy to use. - Popular Mechanics, 1954

In June 2006, Internet World Stats reported that more than one billion people use the Internet. That’s one billion people looking to connect, be entertained, discover something new, even learn something. As a result, the Internet has fundamentally changed how some of the most basic human needs are met. The problem is that many of the principles on which our offline society is built no longer apply online - the physical constraints of storage and distribution, and many of the established economic models. It’s an entirely different animal and it’s called The Attention Economy.

One important aspect of how this attention driven economy works is known as The Long Tail. Simply put, the long tail means that in terms of business, small is the new big, since storage, shelf-space and distribution no longer factor into the equation. When the product range is broadened, the sales generated by small names - in books and music, for example - start to add up, and the combined volume of low-popularity items can exceed the volume of the high-popularity hits. One thing remains constant though: it’s still all about giving people what they want, and herein lies the challenge.
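
To see how the tail can add up, here is a tiny back-of-the-envelope calculation with entirely made-up numbers; the exact figures don’t matter, only that once shelf space stops limiting the catalog, the many small sellers can collectively outweigh the few big ones.

    # Made-up numbers: a handful of hits versus a huge number of niche titles.
    hit_sales = 100 * 10000        # 100 blockbuster titles selling 10,000 copies each
    tail_sales = 1000000 * 2       # 1,000,000 niche titles selling just 2 copies each

    print(hit_sales, tail_sales)   # 1000000 vs 2000000 - the tail outsells the hits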

The challenge can be outlined as follows:

  1. Assume an almost infinitely broad selection of products
  2. Attention is the most precious resource that the user has - for this reason consider it extremely limited
  3. Present the user with the most relevant products in the shortest possible time, and with a minimal amount of work required on their end (a minimal sketch of this follows the list).
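
As a rough sketch of the shape of this problem - the catalog, the attention profile and the relevance scoring below are all made up - ranking an oversized catalog against a small attention budget could look like this:

    def relevance(product, attention_profile):
        # Toy scoring: how much past attention has gone to the product's tags.
        # A real system would be far richer than this.
        return sum(attention_profile.get(tag, 0) for tag in product["tags"])

    def recommend(catalog, attention_profile, attention_budget=5):
        # The catalog is effectively unbounded; the user's attention is not,
        # so only the top few results are worth showing.
        ranked = sorted(catalog, key=lambda p: relevance(p, attention_profile), reverse=True)
        return ranked[:attention_budget]

    # Hypothetical data: hours of past attention per tag, and a tiny catalog.
    profile = {"electronic": 12, "ambient": 4}
    catalog = [
        {"title": "Ambient Works", "tags": ["electronic", "ambient"]},
        {"title": "Jazz Standards", "tags": ["jazz"]},
        {"title": "Glitch Hop Vol. 1", "tags": ["electronic"]},
    ]
    print([p["title"] for p in recommend(catalog, profile)])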

I’ve been focusing on how we spend money on music and books, but variations of this challenge exist anywhere we spend attention. We choose the search engine that provides us with the best results, and subscribe to the feeds that have the best chance of keeping us up to speed. Or at least we like to think that’s what we’re doing. We can never really know what’s out there, or whether we missed that one important thing that would have made all the difference.

Hi, I’ll have the Usual / What’s Good?

We’ve established that we all spend attention. The challenge ahead is to maximize the “return on attention”. Ideally, I want relevant products and information at my fingertips: products that are a perfect match for me, information I can use and enjoy. How can we achieve this? Logic dictates that in order to do this better, the source of these products and information needs to know more about me - my likes and dislikes, what I’ve done previously. This is where attention data comes into play.

As a consumer you can think of attention data as the relationship you’ve established with the seller of the product in question. A prerequisite for this kind of relationship is trust, but once established, the experience becomes more enjoyable for you and more profitable for the seller. A few real-life examples of this include:

  • The staff at your favorite restaurant - they come to know how you like to be seated, the kind of food and drink you like. As a result they will be able to recommend new dishes you’re likely to enjoy.
  • The bartender at the place you usually hang with your friends - lift a finger and she’ll respond with an ice-cold beer of your favorite brand. Tell her what flavors you like and she’ll suggest new drinks for you to try.
  • The same thing goes for the staff at movie theatres, record stores, bookstores, you name it - basically anywhere they get to know you through your return visits.

Amazon understands this, and has been hugely successful as a result. Their technology essentially serves the same purpose as the bookstore clerks do in the real world. The big difference is that their servers know about all books and all other customers. They’ve taken some of the service that people enjoy in the real world, and made it work on a large scale online.
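
Amazon has described its recommendation technology as item-to-item collaborative filtering. The following is not their code - the titles and purchase histories are made up - just a minimal sketch of the underlying “customers who bought this also bought” idea:

    from collections import Counter
    from itertools import combinations

    # Made-up purchase histories: each set is one customer's bought books.
    orders = [
        {"Dune", "Foundation", "Hyperion"},
        {"Dune", "Foundation"},
        {"Dune", "Neuromancer"},
        {"Foundation", "Hyperion"},
    ]

    # Count how often each pair of books is bought together.
    co_purchases = Counter()
    for basket in orders:
        for a, b in combinations(sorted(basket), 2):
            co_purchases[(a, b)] += 1

    def also_bought(book, top_n=3):
        # Books most often bought together with the given one.
        scores = Counter()
        for (a, b), count in co_purchases.items():
            if a == book:
                scores[b] += count
            elif b == book:
                scores[a] += count
        return [title for title, _ in scores.most_common(top_n)]

    print(also_bought("Dune"))  # e.g. ['Foundation', 'Hyperion', 'Neuromancer']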

The bottom line is that we are all uniquely special, and we enjoy being treated as such, online or not. The use of attention data will play an important role here and shows great promise, but in order to succeed we need to strike the right balance between too little and too much data when it comes to privacy. One thing is for sure:

The fight for attention has begun…

How Attention Shapes Interaction Design

Wednesday August 23rd, 2006

This is a follow-up to my recent post about Resonating Technology, so again I’ll use a mobile phone as a case study. This time around the focus is on how attention affects interaction design and the overall user experience.

To start off, let’s set a few goals that will support a well-designed user interface:

  • Don’t present the user with choices that make no sense.
  • The manual should be no longer than five lines of text - in fact, there shouldn’t even be a need for a manual.

OK, let’s deal with the problem of choice. To date, our toolbox for handling this problem has primarily consisted of two things: menus and buttons. But what if we were to do something completely different? What I’m proposing instead is something I’ll call the Intention-based User Interface, which is based on gesture recognition and the use of attention data.

Let’s take a look at the definition for gestures:

  1. A motion of the limbs or body made to express or help express thought or to emphasize speech.
  2. The act of moving the limbs or body as an expression of thought or emphasis.
  3. An act or a remark made as a formality or as a sign of intention or attitude.

In short, when a user makes a specific gesture it’s a sign of intention and thought. The job of the phone and the user interface is to interpret the gesture and respond accordingly. To make this possible the phone has to see the user and feel how it is being held - including rotation, yaw, pitch, pressure etc. Let me give you a few examples of how this could work (a rough sketch in code follows the examples):

  • Taking a picture - we use our eyes to take pictures, and point if we want someone to see what we see. So to take a picture with the phone, I aim it, hold it steady and apply pressure to the top right of its frame. If I need to zoom I just slide one of my fingers along the left side of the phone. In contrast, if the phone is lying stationary on the table with the camera covered, that indicates that the user is not currently interested in taking a picture.
  • Answering the phone - we all know the gesture for this - hold it close to one of your ears.
  • Calling someone - other than the touch-sensitive frame, the phone has a full-sized touch-screen on each side. Underneath each semi-transparent screen there’s a wide-angle camera that’s used for taking pictures and performing gesture recognition. As you write on a blank area of the screen with your finger or the pen, the phone will interpret your input and present you with the options that match. For example, if I write “Martin” or “555 21”, it will display contacts that match, find notes containing that word and so forth. I circle the contact item with Martin’s name, and hold the phone to my ear to make the call.
  • Writing notes and text messages - Now if I wanted to write a note or text message I would simply continue to do so in the blank area of the screen. As the text gets longer the logical choice is to present options for saving the note, or sending it as a text message to a contact.
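
None of this hardware exists as described, so here is a purely hypothetical sketch of the software side - mapping a snapshot of imaginary sensor readings to the user’s most likely intention:

    def interpret(state):
        # Map a snapshot of hypothetical sensor readings to a likely intention.
        if state["held_to_ear"] and state["incoming_call"]:
            return "answer_call"
        if state["aimed_and_steady"] and state["pressure_top_right"]:
            return "take_picture"
        if state["writing_on_screen"]:
            # Short input that looks like a name or number: offer matching contacts.
            # Longer input: offer to save as a note or send as a message.
            return "show_matches" if len(state["written_text"]) < 20 else "offer_save_or_send"
        return "stay_idle"  # e.g. lying face-down on the table

    # Hypothetical snapshot: the user has scribbled "Martin" on the screen.
    snapshot = {
        "held_to_ear": False,
        "incoming_call": False,
        "aimed_and_steady": False,
        "pressure_top_right": False,
        "writing_on_screen": True,
        "written_text": "Martin",
    }
    print(interpret(snapshot))  # show_matches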

Instead of always presenting the user with a multitude of buttons and menus, we let users do what comes naturally, and then update the user interface to present the choices that make the most sense in the given situation.

Where does this leave us with the second goal of not needing a manual? In pretty good shape, actually. My guess is five minutes to get a demonstration in the store, and you’re ready to go. For example, the salesclerk could tell you the following:

  1. Just aim it at something you want to take a picture of and press down.
  2. It responds to what you write - names, phone numbers, notes, you name it - when you’re done, select what you want to do.
  3. Hold it to your ear to answer a call.

There you have it - interaction design really does benefit from attention. We’ve successfully removed choices that don’t make sense to the user and made better use of basic human behavior and skill sets.

PS: Someone please make this phone.

Resonating Technology

Monday August 21st, 2006

I’m sitting here looking at my new cell phone. It’s more elegant than my previous phones, the battery lasts longer, it’s lighter and has more colors, a better camera, more of everything really. But does any of this really make that much of a difference? The short answer for me is no.

Don’t get me wrong, I enjoy the longer battery life as much as the next guy, but when I really think about it, I pretty much use it the same way as my previous phones. Why is this? I recognize that I may be a creature of habit, but from a user’s point of view, the new phone works almost exactly like the old one. It still has digit keys for entering phone numbers I haven’t dialed before, the green button answers a call and the red one hangs up. It still rings when I’m at a restaurant, the movies or somewhere else it’s considered impolite to answer the phone. It still runs out of battery at the worst possible time, when I’ve had a lot on my mind and didn’t think to recharge it.

What’s missing from my new phone is something that won’t be fixed by continuing the current feature arms race or incrementally improving the specs by 10% for each new generation. What I need is a phone that pays attention to its user and its surroundings - the kind of phone that reacts intelligently to our behavior and to any given situation to best serve the needs of its user. Although this kind of “intelligent and well-behaved” phone might sound far-fetched, let’s step back for a moment and think about it:

  • First of all, the phone should know where it is. One way to do this is Assisted GPS combined with mapping services for the specific area or building.
  • It should be able to communicate with other nearby devices. Bluetooth and wireless networks to the rescue.
  • It should be aware of the environment. This includes sounds, noise-levels, vibrations, and temperature, all of which can be measured.
  • It shouldn’t run out of battery. Off the top of my head: body heat, kinetic energy from body movement, the sun. Anything without wires and an external charger would be a huge improvement.
  • The camera should recognize gestures. It’s complex but quite useful, so I’ll add it anyway.

What would life be like with this phone? Let me give you a few examples of what’s possible:

  • My phone now knows it shouldn’t ring when I’m at certain types of places, for example at the movies, in a restaurant or in a meeting (a rough sketch of how this could work follows this list).
  • If I need to find a place or meet with someone it can provide relevant information.
  • When I’m in a noisy environment it can adjust the volume of the ring tone. If it’s lying on my desk and I’m sitting next to it, the ring tone will start at low volume.
  • It synchronizes with my PC when it’s within range, allows me to use the PC keyboard for text messages and lets me view photos on the phone using my PC monitor.
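
As a rough sketch of how the ring-tone behavior could be driven by context: the place detection and noise measurement are assumed to exist, and the function and its thresholds are made up.

    def ring_volume(place_type, ambient_noise_db, owner_nearby):
        # Pick a ring volume (0-10) from the phone's current context.
        quiet_places = {"cinema", "restaurant", "meeting"}
        if place_type in quiet_places:
            return 0                      # vibrate only
        if owner_nearby and ambient_noise_db < 40:
            return 2                      # the owner is right next to it, keep it low
        if ambient_noise_db > 75:
            return 9                      # noisy street or bar
        return 5

    print(ring_volume("office", ambient_noise_db=35, owner_nearby=True))   # 2
    print(ring_volume("cinema", ambient_noise_db=50, owner_nearby=False))  # 0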

Now this is a phone that would make a difference. But what would such a phone look like? For starters it would have no buttons. But how would you answer the phone then? Well, by placing it close to my ear of course.

The bottom line: it doesn’t matter all that much what it looks like, as long as it knows what my face, and specifically my ears, look like.

What is Attention Data?

Saturday August 19th, 2006

To answer this question let’s start with attention. Simply put, attention is time spent interacting with someone or something. Sounds pretty abstract, I know, but that’s kind of the idea. Let me give you a few examples of attention:

  • Talking to someone
  • Reading a book, listening to music or watching a movie
  • Taking a picture

Not that many years ago we did these activities by sitting or standing across from each other, by holding a piece of dead tree, and finally by means of analog film that we would have to get developed.

Today, we can pretty much do the same things using video conferencing, webcams, e-books, iPods and digital cameras, to name but a few. The big difference is that when you go digital it’s much easier, and very valuable, to keep track of what you pay attention to - this is in essence what attention data is all about:

Attention data is a digital record that describes the
time spent interacting with someone or something.

This means that attention data can pretty much cover anything we do that has some kind of digital footprint, including:

  • books - that I bought, read, recommended or wished for
  • movies and videos - that I saw at the cinema, own on DVD, streamed online etc.
  • music - CDs that I’ve bought and listened to, my playlists, radio etc.
  • games - that I own and play online
  • photos - that I took and how I browse my photo collection
  • websites and blogs - that I read, my bookmarks, blogrolls, rss and opml
  • events and places - calendar entries for concerts, meetups, vacations etc.
  • people in my social network - communities that I participate in, my contacts and friends lists, my subscriptions etc.

As you probably noticed, keeping records with this kind of data has major implications for how we consume and discover products, for our social interactions online and, last but not least, for identity and privacy.
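
There is no single agreed-upon format for records like these, but as a purely hypothetical sketch with made-up field names, one attention event could be captured along these lines:

    from dataclasses import dataclass

    @dataclass
    class AttentionEvent:
        # One record of time spent interacting with someone or something (hypothetical schema).
        who: str          # the person the attention belongs to
        what: str         # the thing attended to: a book, track, URL, contact...
        kind: str         # "book", "music", "website", "person", ...
        action: str       # "read", "listened", "bookmarked", "recommended", ...
        when: str         # ISO 8601 timestamp
        duration_s: int   # seconds of attention spent

    event = AttentionEvent(
        who="me",
        what="http://example.com/some-article",
        kind="website",
        action="read",
        when="2006-08-19T14:03:00Z",
        duration_s=420,
    )
    print(event)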

Using Attention to Create Value

How is value created using attention data? As I see it, there are at least six different flavors:

  • Personal value - improved user experience and productivity
  • Network value - use your network to learn and discover, build relationships and strengthen friendships
  • Enterprise value - optimize the use of available resources within the enterprise including people and their competencies
  • Community value - expand your social network and discover outside the limits of your existing social network
  • Consumer value - better customer service through personalization and recommendations
  • Global value - smarter, more relevant online search

As mentioned earlier in this post, attention data has a lot of implications that I will continue to write about, including a detailed look at the different kinds of value that attention data creates. In the meantime, your comments and thoughts are very much welcome. I’ll leave you with this to consider:

How are you creating value from your attention data?