March 2013 - Posts

The University of Mississippi Medical Center (UMMC) is notifying patients who visited UMMC between 2008 and January 2013 that their health information may have been stored on a laptop computer that's "missing." Apparently, the device was not protected with laptop encryption like AlertBoot, which may have been a result of the laptop being "a shared device, used by UMMC clinicians."

Giving Access to Shared but Encrypted Resources

One of the problems with encryption software is that, depending on the solution, there may be no way to allow multiple logins to the same computer (or, in some cases, there is a way, but it's so complicated as to be useless. As an aside, AlertBoot does not suffer from this limitation; indeed, we make it very easy to host multiple IDs and passwords on the same computer).

This hindrance is very problematic in a hospital setting because (a) resources are shared and (b) HIPAA Security Rules generally forbid the sharing of computer passwords and such.

The obvious answer, then, is to pick a solution that allows multiple IDs and passwords for the same computer. However, sometimes people opt for a different kind of solution: not using encryption. Since most computer operating systems come with the ability to support multiple users, one "solution" is to use only password-protection without encryption.

The problem with this approach is that, while you're able to comply with a certain aspect of the HIPAA Security Rules, you're also exposing patients to a risk that could easily be avoided.

Is this what UMMC decided to do? It could very well be so, and it would be within their rights. After all, HIPAA doesn't require the use of encryption. If a covered entity's risk assessment shows that the odds of a data breach are low, and comparable security measures can be used – UMMC's laptop was kept in a non-public area, meaning the odds of the device being stolen were low – then encryption is just one of the ways to lower the risk of an ePHI breach.

On the other hand, these other methods are not as useful in the event that something does go awry.
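For solutions that do support multiple IDs on one encrypted computer, the approach mentioned earlier usually works by keeping a single master key for the disk and wrapping it separately under each user's credentials – the "key slot" design used by schemes such as LUKS. Here is a minimal, purely illustrative Python sketch of that idea (all names are hypothetical, and XOR stands in for real key-wrapping):

```python
# Illustrative "key slot" model: one master key encrypts the disk;
# each user's passphrase independently unwraps that same master key.
# XOR is a stand-in here -- real schemes use authenticated key-wrapping.
import hashlib
import os

MASTER_KEY = os.urandom(32)  # the single key that actually encrypts the disk

def make_slot(passphrase, salt=None):
    """Wrap the master key under a key derived from one user's passphrase."""
    salt = salt or os.urandom(16)
    derived = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)
    wrapped = bytes(a ^ b for a, b in zip(MASTER_KEY, derived))
    return {"salt": salt, "wrapped_key": wrapped}

def unlock(slot, passphrase):
    """Re-derive the user's key; a correct passphrase yields the master key."""
    derived = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), slot["salt"], 100_000)
    return bytes(a ^ b for a, b in zip(slot["wrapped_key"], derived))

# Each clinician gets a slot; every slot unwraps the same master key.
slots = {user: make_slot(pw) for user, pw in
         [("dr_smith", "correct horse"), ("dr_jones", "battery staple")]}

assert unlock(slots["dr_smith"], "correct horse") == MASTER_KEY
assert unlock(slots["dr_jones"], "battery staple") == MASTER_KEY
assert unlock(slots["dr_jones"], "wrong guess") != MASTER_KEY
```

The point of the design is that no password is ever shared: revoking one clinician means deleting one slot, and the disk itself is never re-encrypted.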

UMMC: Insufficient Contact Information

Generally, a data breach results in the breached medical entity sending out breach notification letters (via first class mail, as specified by HIPAA and HITECH rules). However, the University of Mississippi Medical Center opted to make a public announcement only (the "only" part is implied) because it didn't have a complete notification list:

Federal and state laws require health-care institutions to notify patients potentially affected by such incidents. In this case, due to insufficient contact information for those who may be affected, individual notifications are not possible. [phiprivacy.net]

As I pointed out before, the implication is that no one is getting a personal breach notification letter. Again, UMMC is within its legal rights to do so; however, honestly, what are the chances that all of the affected parties will be informed of this notice, be it via word of mouth, a segment in the local news, or some other method?

Perhaps that's the wrong question. My guess is that the odds of reaching all affected parties are close to 0%. Rather, the question ought to be: what percentage of the affected parties will be informed? Is it closer to 90%, 40%, or 10%? Higher is obviously better, but the honest truth is that we have absolutely no way of knowing.

When you consider that the purpose behind breach notifications is to give those who are affected a chance to do something about any potential risks, it feels like UMMC is following the letter of the law but falling very short when it comes to the spirit of things.

Perhaps a better method would have been to send individual notification letters to patients whose current addresses were on file, in addition to making a public announcement.

The site techcrunch.com reported last week that the US Department of Defense was dumping BlackBerry devices and picking up 650,000 Apple devices, including iPads, iPad minis, iPod touches, and iPhones. When you consider that 470,000 BB devices are currently being used by the DOD – and that the DOD has 600,000 mobile device users, according to defense.gov – it sounds like a wholesale dumping of BlackBerrys.

However, this might not quite be the case. Indeed, when you consider that the DOD has been pushing for "device agnosticism" when it comes to mobile devices (and, thus, has been requesting MDM and security technologies that can handle Apple, Android, BlackBerry, and other mobile devices), it only makes sense that BlackBerry will be around in the Pentagon and other Defense Department buildings for a while.

BlackBerry: Not Going Anywhere

One day later, techcrunch.com reported that BlackBerry "devices and services are in the so-called Security Requirement Guide approval state right now, as are others, and BlackBerry will be the first to come out of it." A BlackBerry spokesperson revealed that work with the DOD was "going well" and its Z10 would receive approval in April.

The site also noted that the DOD denied the original rumors about dumping BlackBerry. This was later followed by an official press release confirming BlackBerry's position within the DOD as part of the "Commercial Mobile Device Implementation Plan."

For a very long time, the BB has been accused of "going nowhere." Sometimes, going nowhere is a good thing. Of course, even better is to go forward.

There is a saying that technology is not evil in and of itself; it's just a matter of how people employ it. I was reminded of this when I ran across an article at mcclatchydc.com. For those who are engaged in BYOD and computer data security, the apparent ease of evading IT security is a headache that won't go away. I'm sure there are quite a number of people out there who wonder why things were designed with such obvious, glaring problems.

For others, like the Cubans covered in the mcclatchydc.com story, the same "problem" is what allows them to get around government censorship.

A Different Kind of Information Superhighway

"Underground blogs, digital portals and illicit e-magazines" are spreading in Cuba, the last bastion of communism in the western hemisphere, via memory sticks, according to a dissident interviewed by mcclatchydc.com:

"Information circulates hand to hand through this wonderful gadget known as the memory stick," [Cuban blogger Yoani] Sanchez said, "and it is difficult for the government to intercept them. I can't imagine that they can put a police officer on every corner to see who has a flash drive and who doesn't."

I have no doubt Ms. Sanchez knows what she's talking about. On the other hand, if the Cuban government really wanted to do something about it, couldn't they do something?

I mean, look at North Korea: there's so little information leakage in that country that a former Deputy Assistant Secretary of State (Col. Stephen Ganyard, USMC, retired) noted, "there is nobody at the CIA who can tell you more, personally, about [current North Korean dictator] Kim Jong Un than Dennis Rodman [the former NBA basketball player]." Dictatorships generally don't work out, but when they do... oh, boy.

On the other hand, I can tell you that, as I scan the South Korean media, news about North Korea, even if it's not about Dear Leader, does make it out of its borders in one way or another, including video. It might not be on USB sticks. On the other hand, why not?

I currently have in front of me a 2-GB micro SD card, taken from an old cellphone (the "stupid" type). It's about the size of my thumb's fingernail and just as thick. I could hide this pretty much anywhere.

It might be an understatement to say that 2012 was the year of BYOD: it seemed like everyone debated the pros and cons of allowing employees to bring their own devices to the workplace. Many companies that had gone the way of BYOD reported that the positives outweighed the negatives, and that there were substantial advantages to engaging in Bring Your Own Device. Others publicly declared that they were jumping on the bandwagon or were in the process of doing so.

BYOD, however, did not start in 2012. Indeed, BYOD's underpinnings are as old as the floppy disk. While it might be dubious to call 5.25-inch or 3.5-inch plastic squares "devices," they certainly posed the same problem many companies struggle with today: how can the company's data be secured? How can we prevent data breaches? How do we keep company information from walking out the office's front doors?

In the past 25 years, it appears that nobody has really found an ideal answer. At AlertBoot, we're able to control the leaking of data via the use of encryption software: plug a USB stick into an already-encrypted computer and the portable flash drive will also be encrypted, automatically. Not only does the encryption protect the USB drive's contents if it ends up lost; the end user is also unable to read the contents of the flash drive on unauthorized computers (this allows one to use the USB drive within the office as a data transfer medium).

But this is only possible because the USB drive is truly a device – it has electronics in it. A CD or DVD disc, by contrast, would still allow someone to take data out of the company; preventing data leaks via disc media would require a separate solution. The heterogeneous nature of computing environments (makes, models, components, operating systems, solutions, etc.) makes it all but impossible to cover every potential breach in data security.
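The "unreadable on unauthorized computers" property described above comes down to one thing: the files on the stick are enciphered under a key that only managed machines hold, so the same bytes decrypt correctly in the office and come out as garbage anywhere else. A hypothetical Python sketch of that property (a keyed SHA-256 keystream stands in for a real cipher such as AES):

```python
# Sketch: removable-media contents are only readable on machines that
# hold the organization's key. NOT real crypto -- a hash-based keystream
# stands in for a proper cipher, purely to show the property.
import hashlib
import itertools
import os

def keystream(key, nonce):
    """Yield an endless stream of pseudo-random bytes derived from key+nonce."""
    for counter in itertools.count():
        yield from hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()

def xcrypt(key, nonce, data):
    """XOR data against the keystream; applying it twice round-trips."""
    return bytes(b ^ k for b, k in zip(data, keystream(key, nonce)))

office_key = os.urandom(32)   # present on every managed, encrypted computer
nonce = os.urandom(16)
plaintext = b"patient record: do not leak"

on_usb = xcrypt(office_key, nonce, plaintext)                  # what the stick stores
assert xcrypt(office_key, nonce, on_usb) == plaintext          # managed machine: readable
assert xcrypt(os.urandom(32), nonce, on_usb) != plaintext      # any other machine: garbage
```

That last line is the whole trick: an unmanaged computer has no office_key, so the stick's contents never resolve back into plaintext.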

Earlier this month, the United Kingdom's Information Commissioner's Office released guidance for BYOD. The 13-page document, which is available for free as a PDF (see link at the bottom of the post), defines BYOD (bring your own device), its risks and benefits, and how to go about protecting any sensitive data that may be stored on mobile devices. (If I may note, many of the recommended practices and solutions to security risks can be fulfilled via the use of AlertBoot's managed MDM service, a completely web-based solution.)

BYOD: Complying with the Data Protection Act of 1998

Despite criticism to the contrary, the ICO is not a dinosaur that has outlived its usefulness. You can find proof in its recognition that BYOD – short for "bring your own device," where individuals supply their own devices for work-related purposes – can be of great benefit to companies. Indeed, it has come out with a guideline to ease the process of setting up a data security policy rooted in the DPA, the Data Protection Act.

The ICO notes that the seventh principle of the UK Data Protection Act states:

appropriate technical and organisational measures shall be taken against accidental loss or destruction of, or damage to, personal data.

The ICO's guide goes on to further note that "if personal data is being processed on devices which you [a company, agency, or other organization] may not have direct control over," you must still prevent any personal data from being breached.

Impossible! you may claim. How am I supposed to control what's happening on other people's devices? But the DPA is about data – particularly personal data collected by an organization, regardless of whether it is sensitive in nature – and thus all of its guidelines and recommendations are data-centric. Indeed, it's spelled out in black and white in the guideline:

It is important to remember that the data controller must remain in control of the personal data for which he is responsible, regardless of the ownership of the device used to carry out the processing.

In other words, if it turns out that it was your organization's data that got compromised, the organization is at fault. Sure, so is the employee, but the ICO doesn't deal with individuals unless, say, they've set up a one-man consultancy.

Long Story Short: Encrypt, Educate, and Monitor

Yep, it's that easy: encrypt your data, educate your employees (on what is allowed regarding the copying and transferring of data), and monitor the situation so that employees are doing what they're supposed to be doing. While that's not all you need to do, it probably covers a good 80% of it. But, easier said than done, right?

While there are different ways of going about it, the use of an MDM (mobile device management) solution like AlertBoot makes it easy. The fully web-based AlertBoot Mobile Security allows an administrator – without going through the process of setting up any extra services such as physical servers or signing up for cloud storage – to push encryption or passwords on users' smartphones and tablets. And, the integrated and customizable reporting engine allows one to easily see whose device is in compliance and, more importantly, whose isn't.

You'll still have to write up your own company security policies, though.
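As a rough illustration of the monitoring piece – a toy model, not AlertBoot's actual reporting engine – a compliance report boils down to checking each enrolled device against the policy and listing the stragglers:

```python
# Toy compliance report: flag devices that fail the policy
# (encryption enabled AND passcode set). All data is hypothetical.
devices = [
    {"user": "alice", "encrypted": True,  "passcode": True},
    {"user": "bob",   "encrypted": False, "passcode": True},   # no encryption
    {"user": "carol", "encrypted": True,  "passcode": False},  # no passcode
]

def compliant(device):
    return device["encrypted"] and device["passcode"]

report = {d["user"]: compliant(d) for d in devices}
out_of_compliance = sorted(user for user, ok in report.items() if not ok)

print(out_of_compliance)  # ['bob', 'carol'] -- these users need attention
```

Trivial as it looks, this is the 20% that's hard in practice: the inventory has to be complete and current, which is exactly what enrollment-based MDM provides.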

You have a smartphone or a tablet (or both...and then some) and it's full of data you want to protect. So, you turn on device encryption and secure it with a password. But, you have too many passwords in your life and wonder whether there isn't a better method to control access to your device. There might be, but there are limitations, as the following story shows.

Punching In for Colleagues at Emergency Medical Unit

According to bbc.co.uk, a doctor in the town of Ferraz de Vasconcelos, just outside of São Paulo, Brazil, was caught clocking in for colleagues. Of course, it's illegal, but it sounds like one of those everyday occurrences that happen once in a while all over the world; after all, we've all seen those instances in movies or in real life where a guy says "hey, I have to pick up my kid at school; I'm gonna leave 15 minutes early, could you clock out for me?"

The story in Brazil has grown into a national scandal, however, because:

The doctor was caught with silicone finger tips for six colleagues;

She's claiming that she was instructed to do so from higher ups; and,

This is an emergency medical unit (the BBC says it's a hospital, but it's more than that. This Brazilian site shows it to be a "Serviço de Atendimento Móvel de Urgência," which roughly translates to "Mobile Emergency Care Service"). In other words, employees are expected to be there regardless of what happens.

One of the reasons why fingerprints are looked at as a practical, easy alternative to passwords is that they're unique to each person and they're always "there" (assuming you don't have a physical accident). The idea is that nobody else can use your fingerprints; hence, it's the perfect password.

On the other hand, the above story shows how easy it is to defeat fingerprint biometrics. It's not just silicone. There have been successful hacks using photocopies, gelatin, "left-over fingerprints" (i.e., the print left behind by the last person who used the scanner), and the classic trick of entering the premises right after someone else opens the door.

The trick to successfully using any type of security measure is to double-up on it. For example, if you're employing biometric scanners, also require the use of passwords. Lifting someone's fingerprints and getting them to spill their password is harder than either one by itself.
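The "double up" principle can be sketched in a few lines: authentication succeeds only when both factors check out, so a silicone fingertip alone (or a leaked password alone) gets an attacker nowhere. A hypothetical Python sketch, with hashes standing in for a real biometric template store:

```python
# Two-factor check: BOTH a fingerprint match and a correct password are
# required. Hashes stand in for a real biometric matcher and credential
# store; all names and values are hypothetical.
import hashlib
import hmac

def h(s):
    return hashlib.sha256(s.encode()).digest()

# What was stored at enrollment time.
enrolled = {"template": h("alice-fingerprint"), "pw_hash": h("s3cret")}

def authenticate(fingerprint, password):
    # compare_digest does constant-time comparison, resisting timing attacks
    fp_ok = hmac.compare_digest(h(fingerprint), enrolled["template"])
    pw_ok = hmac.compare_digest(h(password), enrolled["pw_hash"])
    return fp_ok and pw_ok  # a cloned finger alone is not enough

assert authenticate("alice-fingerprint", "s3cret")          # both factors: in
assert not authenticate("alice-fingerprint", "wrong")       # stolen print, no password
assert not authenticate("fake-silicone", "s3cret")          # password, wrong print
```

Real biometric matching is fuzzy rather than an exact hash comparison, but the gating logic – require every factor, fail closed – is the same.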

We live in an era where BYOD – Bring Your Own Device – is transitioning from niche technical jargon to everyday reality, and people are beginning to use MDM and other mobile security solutions to counter the pitfalls of BYOD. But there are places where mobile security is not welcome. For example, the use of data security tools like encryption software is enough to raise suspicion and delay you (or stop you) at the US border. The implication is that you're a suspicious individual because a password is necessary to access your device. What are you hiding there buddy, hm? appears to be the central question by the Department of Homeland Security.

From now on, the answer could very well be "that's none of your business... unless you have reasonable suspicion," thanks to a watershed decision by the 9th U.S. Circuit Court of Appeals in San Francisco, California. This is the end of the border search exception doctrine – the rule that the Fourth Amendment applies more loosely at US borders – as we've known it for the last ten years. From now on, US Customs and Border Protection (CBP) agents can't dig too deep into your digital possessions without reasonable suspicion.

Kiddie Porn at the Center of the Case

Last Friday, the 9th U.S. Circuit Court of Appeals ruled, according to wired.com, "that U.S. border agents do not have carte blanche authority to search the cellphones, tablets and laptops of travelers entering the country." The key word there is "carte blanche." US border agents can still go through your laptop. If they want to do more than a cursory examination, however, they must have a tenable reason.

The ruling was divided, although not controversially so: of the 11 judges, 3 dissented from the majority opinion (which is 82 pages long. Happy reading).

The ruling was a result of an arrest at the US-Mexico border. In a nutshell, a man by the name of Cotterman was singled out for inspection based on a "fifteen-year-old conviction for child molestation." Although there was nothing incriminating on his, and his family's, two laptop computers and three digital cameras (and presumably their non-digital belongings), CBP sent the laptop to a forensic examination facility. Deleted child pornography in a variety of media was discovered, including 23 password-protected files that were cracked open to reveal images of Cotterman molesting a girl. The court ruled that this kind of in-depth forensic examination requires reasonable suspicion, and that it was met in Cotterman's case (the dissenting opinion, in my opinion, makes a pretty strong case that it was not).

Password-Protection and Encryption is NOT Grounds for Suspicion

In the summary to US v. Cotterman, the court noted the following (my emphasis):

The en banc court wrote that password protection of files, which is ubiquitous among many law-abiding citizens, will not in isolation give rise to reasonable suspicion, but that password protection may be considered in the totality of the circumstances where, as here, there are other indicia of criminal activity. The en banc court wrote that the existence of password-protected files is also relevant to assessing the reasonableness of the scope and duration of the search of the defendant's computer.

Within the body of the opinion, the court commented on the presence of password protection as a suspicious factor:

the government adds another [reasonable suspicion] – the existence of password-protected files on Cotterman's computer. We are reluctant to place much weight on this factor because it is commonplace for business travelers, casual computer users, students and others to password protect their files. Law enforcement "cannot rely solely on factors that would apply to many law-abiding citizens," Berber-Tinoco, 510 F.3d at 1087, and password protection is ubiquitous. National standards require that users of mobile electronic devices password protect their files.... Computer users are routinely advised – and in some cases, required by employers – to protect their files when traveling overseas.

The majority opinion goes on to note that password protection alone, in isolation, "will not give rise to reasonable suspicion" and that "to contribute to reasonable suspicion, encryption or password protection of files must have some relationship to the suspected criminal activity."

We do not suggest that password protecting an entire device – as opposed to files within a device – can be a factor supporting a reasonable suspicion determination. Using a password on a device is a basic means of ensuring that the device cannot be accessed by another in the event it is lost or stolen.

And, last but not least, the use of passwords has been described as a "basic privacy right," although I've got to wonder whether I'm quoting out of context. In the dissenting opinion (my emphasis):

Perhaps the most concerning aspect of the majority's opinion, especially given its stated stance on privacy rights at the border, is its readiness to strip former sex offenders and others convicted of past crimes (and who are, theoretically, entitled to the presumption of innocence) of even the most basic of privacy rights, such as the right to password-protect their electronic devices.... Indeed, as the majority acknowledges, making legal files difficult to access makes "perfect sense" for anyone.

Who'd have thunk it? Encryption is a type of basic right.

The case is interesting in many ways. You can find thoughtful, intelligent coverage at arstechnica.com, wired.com, and techdirt.com, among other online media.