Drs. Kelly Caine (of guest post fame) and Dennis Morrison will be presenting on human factors considerations for the design and use of electronic health records. Audience participation is welcome as they discuss this important topic. See abstract below.

In this conversation hour we will discuss the use of electronic health records in clinical practice. Specifically, we will focus on how, when designed using human factors methods, electronic health records may be used to support evidence-based practice in clinical settings. We will begin by giving a brief overview of the current state of electronic health records in use in behavioral health settings, as well as outline the potential future uses of such records. Next, we will provide an opportunity for audience members to ask questions, thus allowing members to guide the discussion to the issues most relevant to them. At the conclusion of the session, participants will have a broader understanding of the role of electronic health records in clinical practice as well as a deeper understanding of the specific issues they face in their practice. In addition, we hope to use this conversation hour as a starting point to generate additional discussions and collaborations on the use of electronic health records in clinical practice, potentially resulting in an agenda for future research in the area of electronic health records in clinical behavioral health practice.

Kelly Caine is the Principal Research Scientist in the Center for Law, Ethics, and Applied Research (CLEAR) in Health Information.

I don’t feel that I am in control of the information I share on Facebook, and of the information my friends share… FB has total control of (some of) my information, and I don’t like that.

It’s not that Yohann didn’t like Facebook; he did. He liked being able to see his friends’ latest photos and keep up with status updates. The problem was that Yohann (who is, by the way, a very smart, tech-savvy guy) felt unable to use the Facebook user interface to effectively maintain control of his information.

The root of this problem could be one of two things. It could be that Facebook has adopted the “evil interface” strategy (discussed by Rich previously on the human factors blog), where an interface is not designed to help a user accomplish their goals easily (a key tenet of human factors), but is instead designed to encourage (or trick) a user into behaving the way the interface designer wants the user to behave (even if it’s not what the user really wants). Clearly, this strategy is problematic for a number of reasons, not least (from Facebook’s perspective) that users will stop using Facebook altogether if they feel tricked or not in control.

A more optimistic perspective is that the problem of privacy on Facebook is a human factors one: the privacy settings on Facebook need to be redesigned because they are currently not easy to use. Here are a few human factors issues I’ve noticed.

Changes to Privacy Policy Violate Users’ Expectations

Users, especially expert users, had likely already developed expectations about what profile information would be shared with whom. Each time Facebook changed the privacy policy (historically, always in the direction of sharing more), users had to exert effort to reformulate their understanding of what was shared by default, and work to understand how to keep certain information from being made more widely available.

Lack of Feedback

In general, there is very little feedback provided to users about the privacy level of different pieces of information on their Facebook profile. For example, by default, Facebook now considers your name, profile picture, gender, current city, networks, friend list, and Pages to all be public information. However, no feedback is given to users as they enter or change this information to indicate that this is considered public information.

It is unclear which information is public and which is not

While Facebook did introduce a preview function that shows what information a Facebook friend would see should they visit your profile (which is a great idea!), the preview function does not provide feedback to a user about what information they are sharing publicly or with apps. For example, you can’t type “Yelp” into the preview window to see what information Facebook would share with Yelp through Facebook Connect.

You cannot preview what information Facebook shares with sites and apps

No Training (Instructions)

Finally, Facebook does not provide any training and only minimal instructions for users on how to manage privacy settings.

Solutions

Fortunately, there are some relatively simple human factors solutions that could help users manage their privacy without writing their own Dear John letter to Facebook.

In terms of user expectations, given the most recent changes, it’s hard to imagine how much more Facebook’s privacy policy could change. So, from an expectations standpoint, I guess that could be considered good?

In terms of interface changes to increase feedback to users, Facebook could, for example, notify users when they are entering information that Facebook considers public by placing an icon beside the text box. That way, users would be given immediate feedback about which information would be shared publicly.

Globe icon indicates shared information

Finally, in terms of training, it’s fortunate that a number of people outside of Facebook have already stepped up to provide users with instructions on how to use Facebook’s privacy settings. For example, in a post that dominated the NYT “most emailed” list for over a month, Sarah Perez explained the 3 Facebook settings she thought every user should know after Facebook made sweeping changes to their privacy policy that dramatically increased the amount of profile information that is shared publicly. Then, after the most recent changes (in April 2010), Gina Trapani at Fast Company provided easy-to-use instructions complete with screenshots.

Perhaps if Facebook decides to take a human factors approach to privacy in the future, Yohann will re-friend Facebook.


Excellent post at the EFF describing “evil interfaces”, or interfaces that may be deliberately designed to make you do things you did not intend to do:

As Conti describes it, a good interface is meant to help users achieve their goals as easily as possible. But an “evil” interface is meant to trick users into doing things they don’t want to. Conti’s examples include aggressive pop-up ads, malware that masquerades as anti-virus software, and pre-checked checkboxes for unwanted “special offers”.

Facebook and other social networking sites are used as prime examples. Any others?


The Consumerist recently posted on something we haven’t tackled in our posts on electronic medical records: patient trust and privacy.

The California HealthCare Foundation recently released the results of a survey on electronic medical records and consumer behavior. The survey found that 15% of people would hide things from their doctor if the medical record system shared anonymous data with other organizations. Another 33% weren’t sure, but would consider hiding something.

When people were asked about accessing their personal health records (PHRs) online, here is what they said (from the report):

PHR Users Pay More Attention. More than half of PHR users have learned more about their health as a result of their PHR and one third of those say they used the PHR to take a specific action to improve their health.

Low-Income, Chronically Ill Benefit More from PHRs. Nearly 60% of PHR users with incomes below $50,000 feel more connected to their doctor as a result of their PHR, compared to 31% of higher income users. And four out of ten PHR users with multiple chronic conditions did something to improve their health, compared to 24% of others interviewed.

Doctors Are Most Trusted. About half of all survey respondents say they want to use PHRs provided by their physicians (58%) or insurers (50%). Just one in four (25%) reports wanting to use PHRs developed and marketed by private technology companies.

Privacy Remains a Concern. Sixty-eight percent of respondents are very or somewhat concerned about the privacy of their medical records, about the same number who were concerned in a 2005 CHCF survey. PHR users are less worried about the privacy of the information in their PHR.