
History of usability and HCI

March 16, 2011 | admin | Theory

HCI began to develop as an independent branch of science in the late 70’s – early 80’s. It started with the recognition of the problems that users of complicated and powerful systems were experiencing. Usability draws on research in communication theory, graphic and industrial design, linguistics, the social sciences and cognitive psychology. This knowledge helps designers create systems that are both technically efficient (i.e., capable of solving the assigned tasks) and operable by users (i.e., users can actually solve their tasks with them). We are now witnessing a great improvement in user interface quality and usability. Formerly, visual interface design was left for later and carried out right before the end of a project. It was usually done by programmers, who are of course experts in using complicated systems but have only a vague impression, if any, of ordinary users’ abilities.

The cognitive sciences and their practical findings made it possible to study the abilities of computer system users. HCI emerged as a discipline that applies this knowledge to user interface design.

1940s – Ergonomics

It was known long before HCI appeared that many interface problems are rooted in human psychology. During the Second World War, British psychologists studied the attention of radar operators. They were able to work out recommendations for optimizing monitor design that helped operators stay focused on their screens and distinguish signals even when tired or distracted. By the late 1940s a body of empirical knowledge had formed about human activity in general and the psychological aspects of professional work in particular. This gave rise to a formal discipline concerned with designing the working environment: ergonomics. Ergonomists study the “joint” between people and their work: the better this joint is designed, the more efficient and safe the work is.

1950s–60s – the birth of computers

Within a decade computers became available, and they transformed the way people worked. Computers were big, expensive and very rare, but they nevertheless revolutionized how people worked. In 1960 the first computerized transaction-processing system was launched: the “Sabre” airline ticket booking system. Around the same time Jim Slagle at MIT created a system that could solve university-level problems in differential calculus well enough to earn an “A”. Unfortunately, systems like “Sabre” were large and complicated, and most people could not figure them out. The failures in bringing computers and people together in the workplace led J. C. R. Licklider to introduce the concept of “man-computer symbiosis”: the idea of coupling human minds and computer systems to revolutionize information processing. In ergonomics this connection eventually became known as the man-machine interface. MMI design was understood as the application of established ergonomic principles to interface design in order to achieve the best possible connection between a human and a computer.

Throughout the 60s the first seeds of the PC and Internet revolution of the 80s and 90s were planted. As computing capacity grew, researchers began to think about possible future ways of interacting with computer systems. Several well-known concepts appeared at this time: Doug Engelbart (1962) invented the mouse; Ted Nelson (1960) coined the term “hypertext”. In 1969 the US Department of Defense commissioned four leading universities to research computer networking. This is how ARPANET – the predecessor of today’s Internet – appeared.

1970s – Rise of the PC

During the 1970s computers gained processing power and became cheaper, and they started to appear on people’s desks. Soon after the PC was introduced, IBM surprised many by separating hardware from software and allowing third-party programs to run on its PCs.
Software producers profited from this separation of hardware and software, but at the same time they began to struggle with the technical support of their own products. Small companies realized that the cost of supporting their software systems was very high, and that the problem could be solved by designing effective interfaces that let users accomplish their tasks without turning to the support team.

1980s – GUI

In the late 70s, when PCs started to play a more important role in everyday work, many researchers began looking for an alternative to the command-line interface. The Xerox “Star” project, released in 1981, showed the first signs of what Ben Shneiderman (1982) would later call “direct manipulation”. Graphical interaction had several advantages over the command line: incremental actions with quick interface feedback, the ability to undo an action, and the use of visual operations instead of having to “learn” a new (command-line) language.
The Xerox “Star” used the desktop metaphor, which brought along something that is now considered a standard: the WIMP set (Windows, Icons, Menus, Pointers). It was the first PC system built on principles that soon became universal usability methods: prototyping and analysis, user testing, and iterative refinement (of requirements and interface elements). Unfortunately, the project failed, mainly because the primary PC users at that time were businesspeople and the “Star” lacked some basic spreadsheet functions. The “Star” thus violated one of the main principles now recognized by usability professionals: being convenient is not enough – the product must also be useful.
Within a decade the new graphical interface paradigm (GUI) was implemented by a number of companies (Apple Lisa and VisiOn in 1983, Apple Macintosh in 1984, Microsoft Windows and Commodore Amiga in 1985). However, the lack of programs using the new interface, the variety of competing platforms, and the shortage of processing power needed to exploit the paradigm’s full potential left people indifferent to GUIs throughout the 80s. GUIs began to spread only when Microsoft launched Windows 3.0 (with a better design and clickable buttons), coinciding with the growing number of home PCs.
Many innovations in graphical interfaces go back to the academic research of the 60s and 70s. By the 80s this research had diverged sharply from classic ergonomics: it largely ignored the physical aspects of working systems and focused on the cognitive aspects of human-computer interaction. The term “human-computer interaction”, or HCI, replaced the archaic (and politically incorrect) “man-machine interface”.

1990s – Internet and collaboration

By the mid-90s Windows had become a very widespread OS, which required developers to keep to its standards. While early GUI research was grounded in psychological studies, newer interfaces simply copied the Windows style; the interfaces of such products often failed to conform to the standards and were unclear to users. The growing number of home PCs and the Internet boom created great demand for usability professionals on software development teams. Networked computers allowed distant colleagues to work together on projects and communicate via chat and e-mail. The social aspects of computer work became crucial to a system’s success. Designers were no longer designing a single user-computer interaction; they began to design collaboration carried out with the help of a computer. Many designers ignored these “community elements”, which is why e-mail became popular while other types of groupware did not.
The HCI discipline (“usability” is now the more frequent term) is grounded in psychology, which enables designers to model the complicated social interactions surrounding new computer technologies. These models allow researchers such as Jakob Nielsen (2000) to develop guidelines and rules for creating web interfaces. Meanwhile, a revolution was taking place inside HCI itself. Traditionally founded on the information-processing models of cognitive psychology, HCI began to incorporate ethnography, linguistics and communication theory, as well as cultural studies and the humanities, all for one purpose: to better understand the person working with a computer in the Internet era.

2000s – mobile technologies and the near future

Obviously, the Internet boom could not last forever. Many people thought they knew their users’ requirements and needs, but they never considered user-centered design techniques. Poor business models resulted in large sums being wasted on designing products that no one could use or wanted to use.

Nevertheless, it was too early to conclude that the Internet was over. The companies that were able to provide useful and helpful services managed to survive, and usability techniques and HCI proved effective at preventing design risks. In the late 90s mobile devices broadened the ways we interact with computers and with each other. Cell phones, PDAs and wireless networks gave birth to a new concept of omnipresent digital devices: a world where technology is everywhere, but stays in the background. People stopped being single users of separate, isolated devices; they are now part of digital communities. Systems that influence our lives so strongly cannot be designed with outdated methods. Human-centered systems require human-centered design. That is the usability task for the next decade.
 
