Pieter Jan Stappers, Delft University of Technology; Froukje Sleeswijk Visser, Delft University of Technology; Corrie van der Lelie, Delft University of Technology

Abstract » Storyboards have become a popular technique for visualising human-product interaction, not only in design education but also in design practice. They can help the design team focus on the user's actions, understanding, and experience instead of the appliance's physical form; they can be used to highlight the context (e.g., place, situation, social setting) in which the appliance is used. Their form ranges from very sketchy to very detailed, depending on whether they are used to explore new ideas, report existing situations, or present design concepts for criticism and discussion.

In this workshop we will use examples of storyboards from product design, movies, and comics to show the possibilities of their visual language. In hands-on exercises, we develop storyboards in the context of appliance design, and explore how they can be used to support design discussions and presentations. We explore the relation between storyboards and other design techniques (role-playing, sketching, quick-and-dirty modelling, use scenarios). Special attention will be given to questions of how to suggest visually situations, social interactions, emotions, causal relations, and how to set up a story line by integrating situations.

The tutorial uses a mixture of forms. After an introductory lecture covering theory and examples, participants are given a practical photoboarding exercise. On the basis of this exercise, participants discuss with instructors how they can use the techniques in their work. After this discussion, a second presentation gives more background information and guidelines for further study.

Abstract » Logging user behavior is a crucial aspect of understanding how people actually use online systems. Yet a large number of HCI practitioners have not had a great deal of exposure to the issues involved in obtaining, cleaning, interpreting or analyzing large log file data sets. In this CHI 2011 course, five experts in very-large-scale logs analysis will present the ways and means for CHI attendees to gather and analyze log data.
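As a taste of what such analysis involves, here is a minimal sketch of the obtain-clean-analyze cycle the abstract mentions. The log format and sample lines below are invented (an Apache-style combined log is just one common case); the course is not tied to any particular format or tooling.

```python
import re
from collections import Counter

# Hypothetical Apache-style log format, used only for illustration.
LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse(lines):
    """Yield parsed records, silently dropping malformed lines -- a typical
    'cleaning' step when working with large, messy log files."""
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            yield m.groupdict()

sample = [
    '1.2.3.4 - - [10/May/2011:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326',
    'garbage line that does not parse',
    '5.6.7.8 - - [10/May/2011:13:55:40 -0700] "GET /search?q=chi HTTP/1.0" 200 512',
]

records = list(parse(sample))
top_paths = Counter(r["path"] for r in records).most_common()
print(top_paths)
```

At very large scale the same three steps (parse, drop malformed records, aggregate) survive, but the tooling changes; that gap between toy scripts and production log pipelines is exactly what the course addresses.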

Abstract » Participants in this course will benefit by learning how to conduct empirical research in human-computer interaction. As most attendees at CHI conferences will agree, a "user study" is the hallmark of good research in human-computer interaction. But, what constitutes a user study? By and large, a user study is an experiment conforming to the norms for empirical inquiry and the scientific method. It is founded on observation, measurement, and posing and answering testable research questions. This course delivers an A-to-Z tutorial on conducting an empirical experiment (aka user study) in human-computer interaction.

Abstract » We can't predict the future in detail, but by understanding the dynamics that shaped the present, we can more effectively direct our efforts. This course covers the history of human-computer interaction as it has been approached by psychologists, computer scientists, human factors engineers, researchers in information systems and information science, and others. HCI has changed dramatically, surprising experienced researchers and practitioners. Seeing how events unfolded in the past may prepare us to better handle the surprises that lie ahead.

This course was well-received at two previous CHI conferences. It draws on my articles and handbook chapters, and on the work of many people including those who contributed Timelines columns to Interactions magazine. It was not offered the past two years and has been significantly enhanced for 2011.

Features:

• How is HCI seen by different fields – CHI, Human Factors, Information Systems, and Information Science? Why do their views differ?

• How have technology innovation and behavior co-evolved?

• What is involved in bridging HCI-related disciplines?

• What major shifts of direction have occurred in human-computer interaction, and why?

• What do trajectories of change from past to present tell us about what may lie ahead?

• The course only touches on who did what when and on conceptual history. The focus is on forces that led to widespread shifts over time.

• The course now includes the history of library and information studies, suggesting why this discipline was distant from computer science, and why that is changing.

This lecture course has very few bullet points! It relies on timelines, graphics and quotations. There will be time for discussion.

Presenter

Jonathan Grudin is a Principal Researcher in the Microsoft Research Adaptive Systems and Interaction Group, prior to which he was Professor of Information and Computer Science at UC Irvine.

Keith Butler, University of Washington; Robert Jacob, Tufts University; David Kieras, University of Michigan

Abstract » The objective of this course is to provide newcomers to Human-Computer Interaction (HCI) with an introduction and overview of the field. The overview will also make their conference attendance more meaningful. In addition to introducing basic concepts, the course will provide enough structure to help participants understand how the advanced material in the CHI 2011 technical program fits into the overall field.

The material begins with a brief history and explanation of need. The main discussion considers the field from three perspectives: what it takes to build usable systems; the psychology of the needed technology; and the computer science of the needed technology. Specific topics include psychologically based data, design methods and tools, user interface media and tools, and introduction to user interface architecture. In each, we will cover research, technology under development, and current application. Sources for follow-on information will be given.

The intended audience is made up of professionals in computer-related fields who have not yet had a systematic exposure to the discipline of computer-human interaction, typically first-time attendees of the CHI conference. CHI professionals who wish to examine how their work relates to the field as a whole should also attend.

Abstract » At the core of all computer systems is a design—the one being used by your customer. The blueprint or foundation of that design is found in the interaction design. While numerous people are involved in designing systems, products, and services, many don’t have formal design training. Human factors specialists, user researchers, usability professionals, user experience designers, developers, and others often struggle when it comes to interaction design. Even with good design instincts, it can be hard to participate in interaction design conversations and evaluations when you don’t know the principles and underlying structure. Even those with formal design training (especially other design disciplines) can have difficulty articulating and communicating interaction design decisions—particularly when working with those who have no formal design training.

When creating or evaluating designs, people often get caught up in the surface (or UI) of the design, or they try to use the latest design trends without looking at the more important structure that underlies an interaction design. Focusing on the essential core concepts, this course provides a foundation to better understand interaction design and the importance of underlying structure. The basic materials and building blocks, key design principles, and structure of interaction design are illustrated by using familiar, real-world examples. The course also introduces interaction design patterns as a method for identifying structure. By learning how to use patterns to analyze structure and reveal new information, participants also learn how to evaluate designs in a more substantive and valuable way.

Abstract » This course will introduce participants to the area of evaluating with children. Using examples and experience from the field, the course organisers will cover key areas including ethical practice, what to do when things go wrong, methods for engaging with children (especially in schools), and ideas for how to interpret what children say and do during evaluation studies.

Abstract » If you don’t measure it, you can’t manage it. Usability analysis and user research are about more than rules of thumb, good design, and intuition: they’re about making better decisions with data. Is Product A faster than Product B? Will more users complete tasks on the new design? Learn how to conduct and interpret appropriate statistical tests on small and large sample usability data, then communicate your results in easy-to-understand terms to stakeholders.

Features

-- Get a visual introduction or refresher to the most important statistical concepts for applied use.

-- Be able to compare two interfaces or versions (A/B testing) by showing statistical significance (e.g., Product A takes 20% less time to complete a task than Product B, p < .05).

-- Clearly understand both the limits and data available from small sample usability data through use of confidence intervals.
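The two features above, significance testing and confidence intervals, can be sketched in a few lines. The task times below are invented for illustration, and the hand-rolled arithmetic is only one of the tests the course covers:

```python
import math
from statistics import mean, stdev

# Hypothetical task-completion times in seconds from two small-sample
# usability tests (n=8 each); the numbers are made up for illustration.
a = [42.1, 38.5, 45.0, 40.2, 39.8, 44.3, 41.7, 43.0]   # Product A (new design)
b = [50.4, 47.9, 55.2, 49.1, 52.6, 48.8, 51.3, 53.7]   # Product B (old design)

# Welch's t statistic for two independent samples -- what an A/B
# comparison of task times boils down to. In practice,
# scipy.stats.ttest_ind(a, b, equal_var=False) gives the same statistic
# along with an exact p-value.
se = math.sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
t = (mean(a) - mean(b)) / se
print("t =", round(t, 2))

# Rough 95% confidence intervals for each mean; 2.36 is the two-sided
# t critical value for 7 degrees of freedom (n=8), which is wider than
# the large-sample 1.96 -- the small-sample point the course stresses.
for name, xs in (("A", a), ("B", b)):
    half = 2.36 * stdev(xs) / math.sqrt(len(xs))
    print(name, round(mean(xs) - half, 1), "to", round(mean(xs) + half, 1))
```

When the two confidence intervals do not overlap, as here, the difference is significant well beyond p < .05; the course teaches how to read the subtler cases where they partially overlap.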

Audience

Open to anyone who’s interested in quantitative user research. Participants should be familiar with the process of conducting usability tests or research as well as basic descriptive statistics such as the mean, median and standard deviation and have access to Microsoft Excel.

The presentation will be a mix of enthusiastic instruction, with movie-clips, pictures, demonstrations and interactive exercises all aimed at helping make the abstract topic of statistics concrete, memorable and actionable.

Abstract » Qualitative research is essential to user-centered design. Unfortunately, qualitative data is often overwhelming in volume and ambiguity. Practitioners quickly discover that the challenges of making (valid) sense of a mountain of field notes, artifacts, photos, and audio and video recordings are immense. Without a disciplined process of analysis, researchers are subject to many kinds of errors of inference. In practice, conclusions are often impressionistic or anecdotal, with vague or even misleading implications for design. The difficulties of managing qualitative data, and following a disciplined process to extract valid meaning and practical design guidance from it, require specific, well thought-out strategies at each stage of the research.

The need for these strategies is growing as field research becomes increasingly common in HCI. Therefore, this tutorial will focus on teaching practical strategies to apply during data collection and analysis. Although inspired by the challenges of qualitative data from contextual field studies, many of the skills taught can be applied to other types of qualitative data, such as that from exploratory usability testing, open-ended interviews, etc.

Abstract » Every designer dreams of creating something more – something so great that people crave it, long for it, must have it. Marketers call it “a must have”, “compelling”, or “insanely great”. But most of the rest of us just call it Cool.

Over the past several decades, Cool has evolved into a marketing imperative: an overarching requirement for many designs. Companies spend billions organizing and reorganizing to cultivate “innovation” so they can reliably create it. And a new generation, with vastly different expectations on Cool design, is coming of age as consumers and workers. But Cool is hard to pin down – there’s no accepted way to define it, measure it, or design for it. Like glamour, it is an ineffable yet powerful quality that depends on a host of subtle factors.

This course presents a set of core attributes that make products and applications Cool. These design attributes emerged from an extensive cross-generational contextual research project examining how people aged 15 to 60 experience “cool” and its relationship to value and impact on their lives.

We first present core Cool concepts based on the research, using real product and service examples to illustrate and illuminate the material. We include the application of Cool concepts to productivity business applications. Attendees participate in an exercise to evaluate the products they use, own, and/or are designing, to see how Cool attributes apply to and affect them. We end with an analysis of what it takes for an organization to develop and ship game-changing products, revealing the real effort required to create Cool products. We look at the problems organizations face in creating Cool, and discuss the challenges inherent in large organizations as they attempt to move toward a more innovative culture.

Abstract » This course will help attendees understand the logic of research through design. It describes the process of conducting design-led research, including how projects are framed, the trajectory of a typical project, and what research outcomes can teach us. A number of design-led methods will be discussed, including Cultural Probes, the construction of design proposals, and the use of ethnography and cultural commentators during field trials of design prototypes.

At a more fundamental level, the course considers the logic of design research through a contrast with the scientific tradition that underpins most traditional CHI research. This will allow consideration of what can – and should – be known in the course of design, and when explicitly not knowing something can be a more useful position. The intention will be to build a view of research through design that can assist attendees in developing their own approaches and projects and in assessing and learning from design-led research at CHI.

Abstract » For over four billion people, the mobile phone (or “cellphone”) is an essential part of everyday life. It’s a business tool to clinch important deals; a “remote control” for the real world, helping us cope with daily travel delay frustrations; a “relationship appliance” to say goodnight to loved-ones when away from home; a community device to organize political demonstrations.

This course is about shifting the mobile design perspective away from “smart” phones, to people who are smart, creative, busy, or plain bored. The course will both introduce interesting and empowering mobile design philosophies, principles and methods as well as giving specific guidance on key emerging consumer application areas such as pedestrian navigation and location aware services.

Our aim is to inspire attendees to strive for breathtakingly effective services. We want attendees to leave the course with a fresh perspective on their current projects and an eagerness to build a long-term better future for mobile users.

This course will appeal to a broad audience. Some will be novices to the field of mobile interactive service and device design; others will be developers and researchers with experience in producing innovative work themselves. The material will also be accessible to those involved in non-technical roles such as mobile analysts and strategists.

Abstract » Design researchers often follow a research through design (RtD) approach, where they employ a design practice process of making things as a method of research; however, this research approach is not well known to HCI researchers and is not always appropriate to the cares and concerns of this community. To address this issue of fit, we have crafted a specific approach for RtD to better situate it for HCI.

In our approach, design researchers iteratively reframe a problematic situation. As they engage in the process of making things, design researchers integrate the latest technology from engineers, the theories and models from behavioral scientists, and the descriptions of the real world from anthropologists. Each artifact produced offers a glimpse of a possible preferred future state.

In this course we will describe the practice of RtD, we will describe some example cases, and we will discuss the criteria for evaluating research contributions made following this approach. We will also offer a critique of this approach and discuss how it fits within the HCI research community.

Abstract » One of the most persistent factors limiting the impact of user research in business is that projects often stop with a catalog of findings and implications rather than generating opportunities that directly build on those findings. We’ve long heard the lament “Well, we got this report and it just sat there. We didn’t know what to do with it.” But design research (or ethnography, or user research, or whatever the term du jour may be) has also become standard practice, as opposed to something exceptional or innovative. That means that designers are increasingly involved in using contextual research to inform their design work. Courses at CHI and elsewhere have increased the ranks of designers and others who feel comfortable conducting user research. But analysis and synthesis is a more slippery skill set, and we see how easy it is for teams to ignore (more out of frustration than anything malicious) data that doesn’t immediately seem actionable. This course gives people the tools to take control over synthesis and ideation themselves by breaking it down into a manageable framework and process.

Cosmin Munteanu, National Research Council Canada; Gerald Penn, University of Toronto

Abstract » Speech remains the "holy grail" of interaction, as it is the most natural form of communication that humans employ. Unfortunately, it is also one of the most difficult modalities for machines to understand -- despite, and perhaps because, it is the highest-bandwidth communication channel we possess. While significant research efforts, from engineering to linguistics and the cognitive sciences, have been spent on improving machines' ability to understand speech, the HCI community has been relatively timid in embracing this modality as a central focus of research. This can be attributed in part to the relatively discouraging levels of accuracy in understanding speech, in contrast with often-unfounded claims of success from industry, but also to the intrinsic difficulty of designing and especially evaluating speech and natural language interfaces. While the accuracy of understanding speech input is still discouraging, several interesting areas are yet to be explored that could make speech-based interaction truly hands-free. The goal of this course is to inform the HCI community of the current state of speech and natural language research, to dispel some of the myths surrounding speech-based interaction, and to provide an opportunity for HCI researchers and practitioners to learn more about how speech recognition works, what its limitations are, and how it could be used to enhance current interaction paradigms.

Abstract » People are constantly making small choices and larger decisions about their use of computing technology, such as: "Shall I use this new application as a replacement for my current one?" "Should I bother to configure the new application to suit my preferences?" "Shall I make a contribution to this on-line community?" "If so, which of the two available methods should I use?" The processes by which users arrive at these choices and decisions can take many different forms and depend on a wide range of factors, such as previous learning and habit formation, mental models, values and goals, the current context, social influence, and details of interface design. This course offers a synthesis of relevant research in psychology and HCI that will enable you to analyse systematically the choices made by the users that you study. This type of analysis will be useful in the design and interpretation of studies that involve users' choices and in the generation of strategies for enabling users to make better choices.

Abstract » Social television and social communications for the home constitute a fundamental shift in how people interact and socialize. Web sites are combining streaming video with social media such as Facebook and Twitter, Boxee has already launched its set-top box, and Google TV has been announced. According to CISCO, Internet video and video communications will be the two major generators of data traffic in the future, and MIT lists social TV as one of the 10 emerging technologies for 2010. The purpose of this course is to provide an overview of current developments and their implications for the CHI community. By exploring in detail the design and evaluation of existing social communication applications for the home, attendees will come to a deeper understanding of their strengths and weaknesses.

Abstract » In this class, you will learn how to plan for and carry out studies of users in the field. Rather than teaching a single way to do field research, we provide you with the tools to think critically about the many planning and methodological choices you will have to make. You will watch videos of a variety of user research sessions where we have used a variety of techniques including Contextual Inquiry and Artifact Walkthrough.

This is a significant update of a highly rated tutorial from many past CHI conferences.

In this class, you will:
• Learn how field research complements other User-Centered Design (UCD) techniques
• Learn what it takes to make fieldwork more than just "anecdote collecting"
• Learn fine points of four data-gathering techniques:
  o Naturalistic Observation
  o Contextual Inquiry
  o Artifact Walkthroughs
  o Naturalistic Usability Evaluation
• Identify next steps for data analysis
• Learn when and how to apply these tools to user-centered design

This hands-on session is aimed at practitioners doing, planning, and leading field research, including developers, designers, and managers who are responsible for user experience or user requirements identification. This is an introductory to intermediate level tutorial. It will be useful for beginners in fieldwork, as well as those with some experience who want to broaden their knowledge of approaches.

Features: An important early step when designing a software user interface is to design a coherent, task-focused conceptual model. Unfortunately, many designers start sketching and prototyping the UI before they understand the application at a conceptual level. The result is incoherent, overly-complex applications that expose concepts unrelated to users’ tasks. This course covers:

Instructor background: Jeff Johnson is Principal Consultant at UI Wizards, a product usability consulting firm. He has worked in HCI since 1978. After earning B.A. and Ph.D. degrees in cognitive psychology from Yale and Stanford, he worked as a UI designer/implementer, usability tester, manager, and researcher at Cromemco, Xerox, US West, Hewlett-Packard, and Sun. Since 1996 he has been a consultant and an author. He has published numerous articles and chapters on HCI. He wrote the books GUI Bloopers, Web Bloopers, and GUI Bloopers 2.0. His new book, Designing with the Mind in Mind, introduces perceptual and cognitive psychology to software developers.

If you don’t measure it, you can’t manage it. Usability analysis and user research are about more than rules of thumb, good design, and intuition: they’re about making better decisions with data. Did we meet our goal of a 75% completion rate? What sample size should we plan on for a survey, or for comparing products? Will five users really find 85% of all problems?

Learn how to conduct and interpret appropriate statistical tests on usability data, compute sample sizes, and communicate your results in easy-to-understand terms to stakeholders.

Features

-- Determine your sample size for comparing two designs, a benchmarking study, survey analysis or finding problems in an interface.

-- Determine if a usability test has met or exceeded a goal (e.g., users can complete the transaction in less than 2 minutes).

-- Get practice knowing what statistical test to perform and how to interpret the results (p-values and confidence intervals).
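Calculations of this kind can be sketched in a few lines of Python. The goal rate, the 18-of-20 sample, and the default z-values below are illustrative assumptions for this sketch, not figures from the course materials:

```python
from math import comb, ceil

def binom_p_at_least(successes, n, p0):
    """Exact one-sided binomial p-value: the chance of seeing at least
    `successes` completions out of n if the true rate were only p0."""
    return sum(comb(n, k) * p0**k * (1 - p0)**(n - k)
               for k in range(successes, n + 1))

def n_per_group(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Normal-approximation sample size per group for detecting a
    difference between two completion rates (defaults correspond to a
    two-sided alpha of .05 and 80% power)."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Did 18 of 20 completions beat a 70% goal? (hypothetical data)
print(f"p = {binom_p_at_least(18, 20, 0.70):.4f}")  # p = 0.0355

# Users needed per group to tell a 70% design from an 85% design
print(n_per_group(0.70, 0.85))  # 118
```

The exact binomial test avoids the normal approximation for the small samples typical of usability studies; the sample-size formula shows why detecting modest differences between designs takes far more users than a problem-discovery test.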

Audience Open to anyone who’s interested in quantitative usability tests. Participants should be familiar with the process of conducting usability tests and with major statistical topics such as normal theory, confidence intervals, and t-tests. Participants should also have access to Microsoft Excel to use the provided calculators.

Presentation The presentation will be a mix of enthusiastic instruction, with movie-clips, pictures, demonstrations and interactive exercises all aimed at helping make the abstract topic of statistics concrete, memorable and actionable.

Abstract » For over four billion people, the mobile phone (or “cellphone”) is an essential part of everyday life. It’s a business tool to clinch important deals; a “remote control” for the real world, helping us cope with daily travel delay frustrations; a “relationship appliance” to say goodnight to loved-ones when away from home; a community device to organize political demonstrations.

This course is about shifting the mobile design perspective away from “smart” phones, to people who are smart, creative, busy, or plain bored. The course will introduce interesting and empowering mobile design philosophies, principles, and methods, as well as give specific guidance on key emerging consumer application areas such as pedestrian navigation and location-aware services.

Our aim is to inspire attendees to strive for breathtakingly effective services. We want attendees to leave the course with a fresh perspective on their current projects and an eagerness to build a better long-term future for mobile users.

This course will appeal to a broad audience. Some will be novices to the field of mobile interactive service and device design; others will be developers and researchers with experience in producing innovative work themselves. The material will also be accessible to those involved in non-technical roles such as mobile analysts and strategists.

Abstract » Agile methods, Scrum in particular, are now widely used in the development community. UX professionals who work with Agile teams find that Agile approaches create roadblocks to their participation. Minimal up-front planning means there’s no time for user research or UX design; short sprints leave little time for considered interface design; and sprint reviews leave no place for usability testing or other validation of the sprint’s work. UX designers find that their old role relationships and procedures no longer work, that their skills and techniques are devalued, and that there’s no clear guidance on how to contribute.

But, looking at their base principles, Agile methods should be friendly to UX participation. Continuous user feedback is core to Agile—and who better to supply it than UX designers? But many Agile values and attitudes work against the needs of UX design. Agile methods were created by developers, for developers, without much consideration for user interaction.

In this tutorial, we arm UX designers with concepts and techniques enabling them to participate effectively in Agile projects. We show why Agile methods make sense from the developers’ point of view—and how principles driving Agile methods can be used to support UX involvement. We also show where Agile methods work against the UX goal of a coherent, consistent interface and provide strategies to accomplish a coherent design anyway. We describe proven Agile/UX best practices for integrating the two perspectives.

Finally, we step back and look at project scope. Agile methods address small-scale projects—how to scale them up is debated in the Agile community. We show how to plan a user-centered Agile project of any scale, from iterative fixes to whole systems.

Abstract » This course provides attendees with the knowledge of why it’s worthwhile to conduct mobile international ethnographic research, how to set up and execute this type of research, how to produce exciting deliverables from it, and ways to ensure ethnographic research findings have a positive effect on final products.

Today’s young people are information-active, socially aware, and highly mobile. Designing new technologies for the iChild necessitates new design strategies. Attendees will be given a historical overview of co-designing with children and be introduced to co-design methods that have been specifically adapted for mobility, distributed sociability, and ubiquitous information. Attendees will participate in hands-on activities that employ the actual techniques for design. Each design technique will be given a context by presenting technologies that have been developed with that technique. Attendees will leave the course having been introduced to or updated on co-design methods that can lead to new technologies for the independent, interactive, and information-active iChild.

Features:

• Hands-on experience using new methods in designing for children’s mobile, social, and Internet technologies

• Historical overview of co-designing with children

• Examples of technologies that have been developed with children using co-design methods

Audience:

The audience for this course requires no special background. We view design as most effective when it is interdisciplinary; therefore, we welcome and encourage attendance by industry professionals, academics, and students from a wide variety of communities (e.g., design, computer science, information studies, and psychology).

Presentation:

Hands-on design activities, small and whole-group discussion, short presentations with multimedia slides.

Abstract » Prototyping tools are making it easier to explore a design space so that many different ideas can be generated and discussed, but evaluating those ideas to understand whether they are better, as opposed to just different, is still an intensely human task. User testing, concept validation, focus groups, and design walkthroughs are all expensive in both people’s time and real dollars.

Just as crash dummies in the automotive industry save lives by testing the physical safety of automobiles before they are brought to market, cognitive crash dummies save time, money, and potentially even lives, by allowing designers to automatically test their design ideas before implementing them. Cognitive crash dummies are models of human performance that make quantitative predictions of human behavior on proposed systems without the expense of empirical studies on running prototypes.

When cognitive crash dummies are built into prototyping tools, design ideas can be rapidly expressed and easily evaluated.

This course reviews the state of the art of predictive modeling and presents a tool, CogTool, that integrates rapid prototyping with modeling. Participants will use their own laptops to mock-up an interactive system and create a model of skilled performance on that mock-up. The course ends with a review of other tools and a look to the future of predictive modeling.

Participants in this course will:
• Understand the state of the art and the future of predictive human performance modeling.
• Learn to prototype using CogTool, a free software tool, so that human performance models can be created.
• Learn to make quantitative predictions of skilled execution time and how to use these predictions for benchmarking, competitive analysis, and requirements setting.
• Walk away with the skills to use CogTool on their company's projects.
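The kind of prediction such tools automate can be illustrated with a tiny Keystroke-Level Model (KLM) calculator. The operator times are the classic published KLM estimates; the task sequence is a made-up example, and CogTool's actual models are far richer than this sketch:

```python
# Classic Keystroke-Level Model operator times (seconds):
# K = keystroke, P = point with mouse, B = mouse button press/release,
# H = home hands between keyboard and mouse, M = mental preparation.
KLM_TIMES = {"K": 0.2, "P": 1.1, "B": 0.1, "H": 0.4, "M": 1.35}

def predict_seconds(ops):
    """Predict skilled execution time as the sum of operator times."""
    return sum(KLM_TIMES[op] for op in ops)

# Hypothetical task: think, point at a field, click (press + release),
# then type a four-character entry.
task = "MPBB" + "KKKK"
print(f"{predict_seconds(task):.2f} s")  # 3.45 s
```

A "cognitive crash dummy" in this spirit lets a designer compare two candidate interaction sequences in seconds, long before any prototype exists to test with users.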

Abstract » Participants will learn how, with the support of the online Usability Planner tool, they can select user-centered methods that are most effective in reducing risk and maximizing cost benefits in a particular project, as an integral part of systems development.

• A logical basis for selection of usability methods, rather than just relying on professional skills or textbook recommendations.

• How to apply the principles of value-based software engineering to the use of UCD activities as part of systems development.

• An extensive checklist of criteria for selecting the most appropriate UCD methods.

• How to use a public domain tool that embodies these principles.

The course will be of value to:

• Usability practitioners wishing to improve their skills.

• Usability practitioners who need to prioritize and justify their activities.

• Project managers responsible for activities that include the use of user-centered methods.

• Researchers concerned with developing criteria for the selection of methods, and with the integration of usability engineering and systems engineering.

• Educators and trainers who help students decide which methods to use.

Audience:

• Usability specialists and project managers who need to justify UCD activities, or who want to broaden the range of methods that they use, or who want advice on which methods to use in a new situation.

• Educators and trainers who help students decide which methods to use.

• Anyone who is interested in a more systematic approach to user-centered design.

Some familiarity with usability and user-centered design is assumed, but no specific prior knowledge is needed.

Abstract » After taking this course, you will know the best practices in these frequently controversial areas of web forms design:
- Where to put the labels
- Where to put the buttons
- Whether to put a colon at the end of a label
- Whether to use sentence or title case for labels
- How to indicate required fields

You will also know the underlying research results that inform these best practices. Although all of these points might seem somewhat trivial, they can create heated discussions that consume a disproportionate amount of development time. Cut through the discussion with specific, well-informed advice.

The course instructor, Caroline Jarrett, is co-author of “Forms that work: Designing web forms for usability” (Morgan Kaufmann, 2008). She has been working on making forms easier to fill in since 1992.

Abstract » Many in the agile and user experience (UX) communities have identified the need to integrate user experience into agile organizations, but there have been problems integrating the two. Differing mindsets, experiences, practices, and goals have led to an "us vs. them" mentality between UX people and agile developers. This course will explore ways for practitioners and researchers to bridge these divides through guided activities that frame UX contributions in terms organizations will value.

We at Virginia Tech and Meridium Inc. have partnered with the support of an NSF STTR grant to find ways to integrate UX into agile teams. Our approach, extreme scenario-based design (XSBD), combines elements from several agile and usability processes including XP, Scrum and scenario-based usability engineering. In this course, we will present our approach to integrating UX into agile, exemplified with activities that can be taken directly to an organizational setting. The course will also provide a bibliography of related methods and approaches, aiming to encourage critical thinking and active discussions of the issues surrounding agile usability.

This course is geared towards user experience/usability professionals working in companies that are transitioning to or have adopted agile methods. We also welcome researchers interested in UX and/or agile methods and any others who are interested.