Human–computer interaction (HCI) involves the study, planning, and design of the interaction between people (users) and computers. It is often regarded as the intersection of computer science, behavioral sciences, design and several other fields of study. The term was popularized by Card, Moran, and Newell in their seminal 1983 book, The Psychology of Human-Computer Interaction, although the authors first used the term in 1980,[1] and the first known use was in 1975.[2] The term connotes that, unlike other tools with only limited uses (such as a hammer, useful for driving nails, but not much else), a computer has many affordances for use and this takes place in an open-ended dialog between the user and the computer.

Attention to human-machine interaction is important because poorly designed human-machine interfaces can lead to many unexpected problems. A classic example of this is the Three Mile Island accident, a nuclear meltdown accident, where investigations concluded that the design of the human–machine interface was at least partially responsible for the disaster.[3][4][5] Similarly, accidents in aviation have resulted from manufacturers' decisions to use non-standard flight instrument and/or throttle quadrant layouts: even though the new designs were claimed to be superior with regard to basic human–machine interaction, pilots had already internalized the "standard" layout, and thus the conceptually sound idea actually had undesirable results.

A basic goal of HCI is to improve the interaction between users and computers by making computers more user-friendly and receptive to the user's needs. Specifically, HCI is concerned with

methodologies and processes for designing interfaces (i.e., given a task and a class of users, design the best possible interface within given constraints, optimizing for a desired property such as learnability or efficiency of use)

developing descriptive and predictive models and theories of interaction

A long term goal of HCI is to design systems that minimize the barrier between the human's cognitive model of what they want to accomplish and the computer's understanding of the user's task (see CSCW).

Professional practitioners in HCI are usually designers concerned with the practical application of design methodologies to real-world problems. Their work often revolves around designing graphical user interfaces and web interfaces.

Researchers in HCI are interested in developing new design methodologies, experimenting with new hardware devices, prototyping new software systems, exploring new paradigms for interaction, and developing models and theories of interaction.

HCI vs CHI. The acronym CHI (pronounced kai), for computer-human interaction, has been used to refer to this field, perhaps more frequently in the past than now. However, researchers and practitioners now refer to their field of study as HCI (pronounced as an initialism), which perhaps rose in popularity partly because of the notion that the human, and the human's needs and time, should be considered first, and are more important than the machine's. This notion became increasingly relevant towards the end of the 20th century as computers became increasingly inexpensive (as did CPU time), small, and powerful. Since the turn of the millennium, the field of human-centered computing has emerged as an even more pronounced focus on understanding human beings as actors within socio-technical systems.

Usability vs Usefulness. Design methodologies in HCI aim to create user interfaces that are usable, i.e. that can be operated with ease and efficiency. However, an even more basic requirement is that the user interface be useful, i.e. that it allow the user to complete relevant tasks.

Intuitive and Natural. Software products are often touted by marketers as being "intuitive" and "natural" to use, often simply because they have a graphical user interface. Many researchers in HCI view such claims as unfounded (e.g. a poorly designed GUI may be very hard to use), and some object to the words intuitive and natural as vague and/or misleading, since they are highly context-dependent terms.

Data Density and Information Absorption. The rapid growth in the density of computer screen real estate has created an opportunity to accelerate "information absorption" to much higher levels. Classic "data density" on a computer screen is 50–100 data points; recent advances in data visualization enable thousands of data points to be presented in forms that can be absorbed rapidly. Interfaces such as virtual reality may further increase the potential density of the information presented.

When evaluating a current user interface, or designing a new user interface, it is important to keep in mind the following experimental design principles:

Early focus on user(s) and task(s): Establish how many users are needed to perform the task(s) and determine who the appropriate users should be; someone who has never used the interface, and will not use the interface in the future, is most likely not a valid user. In addition, define the task(s) the users will be performing and how often the task(s) need to be performed.

Empirical measurement: Test the interface early on with real users who come in contact with the interface on an everyday basis. Keep in mind that results may vary with the performance level of the user and may not be an accurate depiction of typical human-computer interaction. Establish quantitative usability measures, such as: the number of users performing the task(s), the time required to complete the task(s), and the number of errors made during the task(s).
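The quantitative measures named above can be summarized with a short script. This is an illustrative sketch: the session records, field names, and `usability_metrics` helper are hypothetical, not part of any standard HCI toolkit.

```python
from statistics import mean

# Hypothetical usability-test records: (user_id, task_seconds, error_count, completed).
# The values below are made-up illustrative data.
sessions = [
    ("u1", 42.0, 1, True),
    ("u2", 65.5, 3, True),
    ("u3", 51.2, 0, True),
    ("u4", 80.0, 5, False),
]

def usability_metrics(sessions):
    """Summarize the quantitative measures from the text: number of users,
    mean time on task, mean error count, and task completion rate."""
    times = [t for _, t, _, _ in sessions]
    errors = [e for _, _, e, _ in sessions]
    completed = [c for _, _, _, c in sessions]
    return {
        "users": len(sessions),
        "mean_task_seconds": mean(times),
        "mean_errors": mean(errors),
        "completion_rate": sum(completed) / len(completed),
    }

print(usability_metrics(sessions))
```

Tracking these numbers across test rounds makes it possible to tell whether a redesign actually improved the interface, rather than relying on impressions.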

Iterative design: After determining the users, tasks, and empirical measurements to include, perform the following iterative design steps:

Design the user interface

Test

Analyze results

Repeat

Repeat the iterative design process until a sensible, user-friendly interface is created.[6]
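The design–test–analyze–repeat cycle above can be sketched as a simple loop. The `evaluate` and `revise` callables and the score threshold are illustrative placeholders, not part of any standard HCI methodology.

```python
# A minimal sketch of the iterative design cycle: design, test,
# analyze results, repeat until the interface is good enough.

def iterate_design(design, evaluate, revise, threshold=0.8, max_rounds=20):
    """Repeat the cycle until the measured usability score meets the
    threshold (or the round limit is hit, so the loop always terminates)."""
    for round_no in range(1, max_rounds + 1):
        score = evaluate(design)        # test with users, take measurements
        if score >= threshold:          # analysis: good enough, stop iterating
            return design, round_no
        design = revise(design, score)  # redesign based on the findings
    return design, max_rounds

# Toy usage: the "design" is just a count of fixes applied; each
# revision applies one more fix, and the score rises accordingly.
result, rounds = iterate_design(
    design=0,
    evaluate=lambda d: d / 10,
    revise=lambda d, s: d + 1,
)
```

In practice the stopping criterion would be a usability target established during empirical measurement, not a single scalar score.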

A number of diverse methodologies outlining techniques for human-computer interaction design have emerged since the rise of the field in the 1980s. Most design methodologies stem from a model for how users, designers, and technical systems interact. Early methodologies, for example, treated users' cognitive processes as predictable and quantifiable and encouraged design practitioners to look to cognitive science results in areas such as memory and attention when designing user interfaces. Modern models tend to focus on a constant feedback and conversation between users, designers, and engineers and push for technical systems to be wrapped around the types of experiences users want to have, rather than wrapping user experience around a completed system.

Activity theory: used in HCI to define and study the context in which human interactions with computers take place. Activity theory provides a framework to reason about actions in these contexts, analytical tools with the format of checklists of items that researchers should consider, and informs design of interactions from an activity-centric perspective.[7]

User-centered design: User-centered design (UCD) is a modern, widely practiced design philosophy rooted in the idea that users must take center-stage in the design of any computer system. Users, designers, and technical practitioners work together to articulate the wants, needs, and limitations of the user and create a system that addresses these elements. Often, user-centered design projects are informed by ethnographic studies of the environments in which users will be interacting with the system.

Contextual Usability: Contextual Usability (CU) is a framework that also arose from the 'ethnographic turn' in the human, social and computer sciences during the 1990s, although statistical direct-observation methods and system logging also play a role in its analysis. CU seeks to privilege neither users nor technology within a use or usage process. As such it links usability, ergonomics and user experience design to ideas emerging from social studies of science and technology, such as actor-networks and sociotechnical constituencies. It seeks to locate motivations, instances and circumstances of use against social, cognitive and cultural influences, which can promote or negate the formation of usage patterns and periodicities. It views usability as both a project (in design) and an experience (in use), one which is 'just outside' the boundaries of design effect and 'just inside' a potential or actual user's whole experience of an artifact or service. It generates data according to a quadrant comprising use, usability, usage, and usefulness. It is most associated with the work of Derek William Nicoll.

Principles of user interface design: these are seven principles of user interface design that may be considered at any time during the design of a user interface in any order: tolerance, simplicity, visibility, affordance, consistency, structure and feedback.[8]

One of the top academic conferences for new research in human-computer interaction, especially within computer science, is the annual ACM Conference on Human Factors in Computing Systems, usually referred to by its short name, CHI (pronounced kai, or khai). CHI is organized by ACM SIGCHI, the ACM Special Interest Group on Computer-Human Interaction. CHI is a large, highly competitive conference with thousands of attendees, and it is quite broad in scope.