
Inf@Vis!

The digital magazine of InfoVis.net

Common Sense: the Sequel
by Juan C. Dürsteler [message nº 69]

Scientific criteria are indispensable for distinguishing between what works on the web and what doesn't. But in emerging disciplines like Information Visualisation and usability, these criteria are still in their early stages.

Talking via e-mail with Jared M. Spool about last week's issue, Jared told me that he disagreed with the sentence "Common sense is the tool of choice". In his opinion, science is the tool of choice, and building scientific criteria requires a great deal of research.

Jared is right. It's a constant in the history of science that when experiments are refined and common-wisdom hypotheses are tested against scientific methodology, many things that appeared to be indisputable truths turn out to be inaccurate or plainly wrong. Some of them, however, are confirmed.

This problem is common to all emerging disciplines that lack a theoretical body of knowledge. Such knowledge would allow us to predict how a person behaves when interacting with a user interface, an information chart or a particular design feature of a web site.

It's worth taking a look at the September issue of the monthly newsletter published by the company Human Factors International (HFI), titled "How reliable is usability performance testing?". Written by Bob Bailey, chief scientist of HFI, it presents the results of three independent studies carried out from 1998 until now by different institutions. These studies show consistently that the conclusions of several teams of usability experts evaluating the same web site differed substantially. In particular:

  • No single problem was found by all of the teams. 

  • Some teams found few problems while other teams found many problems.

  • A large number of problems were found by only one of the teams.

In the different studies the number of teams ranged from 4 to 9. The tables with the results can be found in the above-mentioned article.

We could argue at length about the conditions that usability trials should meet and whether the consultants offering services in this field are adequately qualified. Nevertheless, I think the important thing is not to lose sight of how emerging disciplines evolve.

The natural history of this evolution suggests two parallel tracks. On one hand, a body of scientific knowledge, validated by universities and research centres, will develop until the (possible) establishment of a theory of the field that allows us to predict (and even compute) the best solution for a specific problem. On the other hand, designers and practitioners working in the field will build up a set of "best practices" based on the combination of experience and common sense.

Human beings vary widely. Defining a common pattern for human behaviour in a specific activity, such as, for example, walking, is very difficult. For this reason, scientific progress in the disciplines of our interest, which span aspects ranging from computer graphics to psychology by way of linguistics, will be slow.

In the meantime, we will have to weigh our decisions carefully, drawing on our experience and common sense, and keep a close watch on the scientific advances in this field.


Links of this issue:

http://www.humanfactors.com/library/sep01.asp  
© Copyright InfoVis.net 2000-2014