
Monthly Archives: March 2014

This week, I would like to discuss the floor plan light switch. This switch answers the problem of never knowing which switch belongs to which light or fan. Rather than a regular toggle switch, the switch is actually a layout of the home it is used in. The switch gives its user a bird's-eye view of their home right on the wall, showing every room that has a light that can be manipulated. This lets the user know which switch belongs to which room, and which lights are on and off in the house.
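The idea behind the switch can be sketched as a simple room-to-light mapping. This is purely a hypothetical illustration of the concept, not the actual product's implementation:

```python
class FloorPlanSwitch:
    """Hypothetical model of a floor-plan light switch: each room on
    the layout maps to the on/off state of that room's light."""

    def __init__(self, rooms):
        # All lights start off.
        self.lights = {room: False for room in rooms}

    def toggle(self, room):
        """Tapping a room on the floor plan toggles that room's light."""
        self.lights[room] = not self.lights[room]
        return self.lights[room]

    def lit_rooms(self):
        """The bird's-eye view: which rooms currently have lights on."""
        return [room for room, on in self.lights.items() if on]

switch = FloorPlanSwitch(["kitchen", "bedroom", "living room"])
switch.toggle("kitchen")
print(switch.lit_rooms())  # → ['kitchen']
```

The point of the design is exactly what `lit_rooms` shows: the user sees the whole house's light state at a glance instead of guessing which toggle controls what.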

This new device is relevant to HCC because it is physically manipulated by its user. The device gives the user feedback and helps them better understand which switch does what, and it also shows which lights are already on.

Skills used to create this device:

Electrical engineering

Graphic design

Architectural design

This design will change the way people interact with the lights in their homes. This new interaction will hopefully save energy and help users spend less time on a simple task such as turning off a light. I would be very interested in working on a project similar to this once I graduate from the HCC program.

This week's reading discusses various types of virtual enhancements for pointing facilitation. One idea discussed in the paper is to bring potential targets closer to the cursor, lessening the time it takes to move the pointer to an icon. The issue with this idea is knowing which icons should move toward the cursor, and how many: once too many items move, the area around the cursor becomes cluttered with unwanted icons. Another idea discussed in the paper is to reduce the distance between icons by setting them next to one another. This creates a space with more purpose, lessening the number of unused pixels between icons. The paper also discusses making the cursor larger; by widening the cursor's activation area, the cursor becomes a better tool for determining which icon a user would like to select. Lastly, the paper discusses expanding targets. On a newer Macintosh, the icons in the Dock at the bottom of the screen grow larger as the cursor passes over them. The widgets and icons expand their width, which can in some cases lessen the distance between the icon and the cursor.
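All of these techniques can be read as ways of lowering the index of difficulty in Fitts's law, which predicts pointing time from the distance D to a target and its width W. A quick sketch of the effect (the constants a and b here are made-up illustrative values, not measured ones):

```python
import math

def fitts_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time (seconds) under Fitts's law:
    MT = a + b * log2(D/W + 1).
    a and b are illustrative constants, not empirical fits."""
    return a + b * math.log2(distance / width + 1)

baseline = fitts_time(distance=800, width=32)   # far, small target
closer   = fitts_time(distance=200, width=32)   # target brought toward the cursor
expanded = fitts_time(distance=800, width=64)   # target expands on approach

# Both "move the target closer" and "expand the target" beat the baseline.
assert closer < baseline
assert expanded < baseline
```

Shrinking D (dragging targets toward the cursor, packing icons tighter) and growing W (bigger icons, a wider cursor activation area) both reduce the log term, which is why the paper's techniques all speed up pointing.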

This paper contained lots of beneficial information for students in HCI. Understanding how icons and cursors can be manipulated to create an easier workflow matters to anyone who uses a touchscreen interface or a computer. Better-sized icons, more usable space, and smarter cursors will all lead to a more efficient workflow within human-computer interaction.

This week's reading discusses accessibility on touchscreen devices (tablets and smartphones). The research identified touchscreen accessibility issues such as the timing required for certain actions and the multi-finger operations these devices rely on. The 16 individuals interviewed in the research all had some level of dexterity impairment, including motion limitations and difficulty isolating movement to just one muscle group. The researchers observed accidental touches and incorrect movements when the participants used their touchscreen devices. This research shows how unique the users of a given touchscreen device can be. One example of these differences is font-size preference: although all participants preferred larger text, not all of them had the same visual capacity, so no single font size worked best for the whole group. Just as font-size preference varied by participant, touchscreen techniques such as the pinch gesture challenged the participants in different ways. Some participants succeeded at the pinching task with no problem, some were unable to use two fingers at once, and others found the task somewhat difficult.

Making touchscreen interfaces such as smartphones and tablets accessible requires understanding how much one individual differs from the next. Accessibility affects my user experience with a touchscreen device very differently than it does for other people I know. Some applications and touchscreen actions that I use may not work efficiently for the next person. Recognizing this can help make these devices more versatile while still maintaining their simplicity.

This week's reading discusses the design of interfaces as well as the different types of interfaces. Don Norman's execution-evaluation cycle is based on seven stages and helps explain what happens when a person interacts with a user interface. The reading explains the cycle in a linear fashion. First the user forms a goal. That goal is then made more precise by turning it into an intention, which helps the user identify what actions need to be taken to meet the goal. After the action or actions have taken place, the user must perceive the new system state and interpret it. If the system is in a state that matches the user's goal, the task has been completed.
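The cycle can be traced step by step on a concrete example. This is a hypothetical walkthrough of the stages for "turn on a light", not anything from Norman's own text:

```python
# Illustrative walkthrough of Norman's seven-stage
# execution-evaluation cycle, applied to turning on a light.
light_on = False

def flip_switch():
    global light_on
    light_on = not light_on

# 1. Form the goal: have light to read by.
goal_met = lambda: light_on
# 2. Form the intention: turn the lamp on.
# 3. Specify the action sequence that should achieve it.
actions = [flip_switch]
# 4. Execute the actions.
for action in actions:
    action()
# 5-7. Perceive, interpret, and evaluate the new state against the goal.
if goal_met():
    print("Goal achieved: the light is on.")
else:
    print("Goal not met: form a new intention and try again.")
```

The evaluation step at the end is what closes the loop: if the perceived state does not match the goal, the user starts the cycle over with a revised intention.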

After reading this chapter, I realized how often I do this in my day-to-day activity. Don Norman explains that even turning on a light switch while reading a book requires all of these steps in the execution-evaluation cycle. This gives more insight into how goals, intentions, and actions must work together in order to complete any task.

One of the interfaces I found most interesting to read about was the WIMP interface, the interface used on PCs and Macs. WIMP interfaces consist of windows, icons, menus, and pointers, along with toolbars and much more. Of all these WIMP attributes, I find menus the most interesting. When a menu is clicked, it becomes its own small interface, and it contains information cues that help the user distinguish between the meaningful options it offers.

The self-sterilizing door handle is exactly what it sounds like: when the door handle is not being used, it cleans itself off. The handle contains a UV light inside its metal casing and can sense when it is being used. When the handle isn't being used, the UV light turns on, killing any germs left behind by the handle's past users. Once the handle detects movement, the UV light turns off, only to turn back on when movement stops.
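The control logic described above amounts to a tiny two-state machine. A minimal sketch, assuming a motion sensor and a UV relay (all names here are illustrative, not from the actual product):

```python
class SelfSterilizingHandle:
    """Hypothetical sketch of the handle's control logic:
    the UV lamp runs only while no motion is detected."""

    def __init__(self):
        self.uv_on = True  # an idle handle sterilizes itself

    def motion_detected(self):
        # A hand grips the handle: switch the UV lamp off for safety.
        self.uv_on = False

    def motion_stopped(self):
        # The handle is idle again: resume sterilizing.
        self.uv_on = True

handle = SelfSterilizingHandle()
handle.motion_detected()
print(handle.uv_on)   # → False
handle.motion_stopped()
print(handle.uv_on)   # → True
```

The design choice worth noting is the default: the safe idle state is "UV on", and any detected use immediately overrides it.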

The door handle is relevant to HCI because it requires human interaction to work as intended. The entire reason the design was created was to help people lessen their chances of becoming sick. The handle is also a part of HCI because it is perceived by its user through sight and touch.

People needed to create this design:

Ergonomics specialist (for a comfortable, easy-to-use handle)

Construction engineer

Electrician

Lighting expert

I enjoyed reading about this design because it is simple, yet it can have a huge impact on people's lives. I would definitely be interested in working on a project such as the UV handle after I have completed the HCC program.