
Inf@Vis!

The digital magazine of InfoVis.net

It's child's play
by Juan C. Dürsteler [message nº 42]

Until now, computers have done what we told them to do, not what we wanted them to do. This has to change, but it isn't an easy task.

We used to say that something easy to do or easy to use was "child's play". Nowadays any child can program a video recorder or find their way around a mobile phone's functions better than an adult can. Perhaps we should create an "adapted for adult use" label, meaning that something is so easy to use that even an adult can manage it.

Initially, the idiosyncrasies of a particular piece of software had to be learned by reading the manual and trying out the commands. The advent of the desktop metaphor, with Apple's Macintosh and Microsoft Windows, brought some ease of use, since you no longer had to learn so many commands. We could (to some extent) discover a program's idiosyncrasies just by interacting with it and exploring its menus. The graphical interface let users see more easily what behaviour to expect from the program. The manual is still needed, but the approach is more direct.

Nevertheless, the idea is still that the user adapts to the philosophy (and the restrictions) created by the user interface designer, and not the opposite. This has hindered the introduction of computers in important sectors of society. The solution, it is sometimes argued, will come from new generations that have used these interfaces since childhood.

Not everybody agrees. Ideally, software should adapt to our idiosyncrasies and not the other way round. For this to happen, the software has to know many things about us: our preferences, what we use an application for and even our mood.

The set of techniques that attempts to solve this problem is called Affective Computing. Some of the most important research centres, like MIT's Media Lab (see its page devoted to Affective Computing) or IBM's Almaden Research Center with its BlueEyes initiative, have been working in this field over the last few years.

Preliminary work indicates that at least some emotional states can be detected unequivocally by analysing the facial expression captured with a video camera (basically the position of the eyebrows and of both corners of the mouth). Other studies relate cardiac rhythm, body temperature, the electrical conductivity of the skin and other physiological attributes to mood. This has led to the creation of the "Emotion Mouse".
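
As a rough illustration of the facial-expression idea, here is a minimal sketch in Python. It assumes some upstream vision system has already reduced each video frame to two numbers (eyebrow height and mouth-corner lift relative to a neutral face); the landmark names, thresholds and mood labels are illustrative, not taken from the MIT or IBM systems.

    # A minimal sketch, assuming an upstream vision system has already
    # reduced each video frame to two landmark measurements relative to
    # a neutral face. Thresholds and labels are illustrative.

    from dataclasses import dataclass

    @dataclass
    class FaceLandmarks:
        brow_raise: float         # eyebrow height vs. neutral (+ = raised)
        mouth_corner_lift: float  # mouth-corner height vs. neutral (+ = up)

    def classify_emotion(face: FaceLandmarks) -> str:
        """Very coarse rule-based mapping from two landmarks to a mood label."""
        if face.mouth_corner_lift > 0.2:
            return "happy"        # corners pulled up: smiling
        if face.mouth_corner_lift < -0.2:
            return "surprised" if face.brow_raise > 0.3 else "sad"
        if face.brow_raise < -0.2:
            return "angry"        # lowered, knitted brows
        return "neutral"

    print(classify_emotion(FaceLandmarks(brow_raise=0.1, mouth_corner_lift=0.5)))   # happy
    print(classify_emotion(FaceLandmarks(brow_raise=-0.4, mouth_corner_lift=0.0)))  # angry

Real systems track many more landmarks and use trained classifiers rather than fixed thresholds, but the principle is the same: the geometry of a few facial features maps to a coarse emotional label.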

Pupil detection allows the software to know, within a range of 5 m, where in a room you are and in which direction you are looking. One application is for the household appliances in a kitchen to "know" when the user focuses his or her attention on them, so that they can ask the user what action they should perform.
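
A hedged sketch of that kitchen scenario, assuming a gaze tracker that delivers periodic 2-D direction vectors (the sampling rate, dwell threshold and prompt are invented for illustration):

    # Hypothetical: an appliance prompts the user once their gaze has
    # rested on it for a moment. Gaze samples are assumed to arrive as
    # direction vectors from a pupil tracker.

    import math

    DWELL_SECONDS = 1.5   # how long the gaze must rest before prompting
    SAMPLE_PERIOD = 0.1   # seconds between gaze samples

    def looking_at(gaze_dir, target_dir, tolerance_deg=10.0):
        """True if the gaze direction is within tolerance_deg of the target."""
        dot = sum(g * t for g, t in zip(gaze_dir, target_dir))
        norm = math.hypot(*gaze_dir) * math.hypot(*target_dir)
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
        return angle <= tolerance_deg

    def watch_for_attention(gaze_samples, target_dir):
        """Return a prompt once the user has dwelt on the appliance long enough."""
        dwell = 0.0
        for gaze in gaze_samples:
            dwell = dwell + SAMPLE_PERIOD if looking_at(gaze, target_dir) else 0.0
            if dwell >= DWELL_SECONDS:
                return "What would you like me to do?"
        return None

    # Two seconds of a user staring straight at the oven:
    samples = [(0.0, 1.0)] * 20
    print(watch_for_attention(samples, target_dir=(0.0, 1.0)))

The design choice worth noting is the dwell time: prompting on a single glance would make every appliance chatter at the user, so the appliance waits until attention has rested on it for a while.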

Other developments attempt to learn the needs of the user by following the interaction between user and computer, in order to know what he or she is interested in at any given moment. For example, by remembering the type of web sites the user visits depending on mood and time of day, the computer could search related sites and suggest the results to the user.
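
One conceivable data model for that last example, sketched under assumed names (nothing here comes from IBM's work): remember which categories of web site the user visits at which time of day, then suggest the most frequent category for the current hour.

    # Illustrative model: time-of-day buckets mapped to counts of site
    # categories; the suggestion is simply the most frequent category
    # for the current bucket.

    from collections import Counter, defaultdict

    class BrowsingProfile:
        def __init__(self):
            # time-of-day bucket -> Counter of site categories
            self.visits = defaultdict(Counter)

        @staticmethod
        def bucket(hour):
            if 5 <= hour < 12:
                return "morning"
            if 12 <= hour < 18:
                return "afternoon"
            return "evening"

        def record(self, hour, category):
            self.visits[self.bucket(hour)][category] += 1

        def suggest(self, hour):
            counts = self.visits[self.bucket(hour)]
            return counts.most_common(1)[0][0] if counts else None

    profile = BrowsingProfile()
    profile.record(9, "news")
    profile.record(10, "news")
    profile.record(21, "music")
    print(profile.suggest(8))    # news: the computer could pre-search news sites
    print(profile.suggest(22))   # music

A real system would add the mood signals described above as a further dimension of the profile, but the mechanism is the same: observed interaction accumulates into a model the computer can act on.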

IBM estimates that within four years, 30% of new household appliances in Europe will interact with the user via body sensors and speech recognition.

In any case, the design of the user interface will continue to be a mixture of art and science, where talent and experience will continue to make the difference between information that is easy to use and understand and information that is not.

At least then computers will know that sometimes we hate them.


Two interesting books on the topic can be found in the last two links of this issue:

Links of this issue:

http://www.media.mit.edu/affect  
http://www.almaden.ibm.com/cs/blueeyes  
http://www.almaden.ibm.com/cs/blueeyes/HCII99_emouse.doc  
http://www.time.com/time/europe/magazine/2000/228/gadgets.html  
http://www.amazon.com/exec/obidos/ASIN/0262661152/infovisnet  
http://www.amazon.com/exec/obidos/ASIN/1558604448/infovisnet  
© Copyright InfoVis.net 2000-2014