Posted
by
Unknown Lamer
on Monday January 07, 2013 @08:14PM
from the stern-talking-to dept.

netbuzz writes "Losing a single laptop containing sensitive personal information about 441 patients will cost a non-profit Idaho hospice center $50,000, marking the first such HIPAA-related penalty involving fewer than 500 data-breach victims. Yes, the data was not encrypted. 'This action sends a strong message to the health care industry that, regardless of size, covered entities must take action and will be held accountable for safeguarding their patients' health information,' says the Department of Health and Human Services."

Encryption is slow. If you have ever worked with healthcare data, you know there is so much of it that encryption can add hours to your tasks. Combine that with the fact that health care organizations are only just starting to invest in skilled workers; they are still dominated by people who were in accounting or billing and got transferred to IT 20 years ago. These self-taught folks have worked in the institution so long they really don't know how to think in terms of security. They bitch

but you forgot to send the memo to management to double IT workers' salaries so they can attract competent people. Around here medical IT staff get less than $21 an hour. You need to be paying $26-$31 to start to attract the competent people.

Encryption is slow. If you have ever worked with healthcare data, there is just so much data that encryption can add hours to your tasks.

No excuse. I deal with tons of data and by federal directive my laptop must have full disk encryption. It took 5 days to completely encrypt my laptop but now that it is encrypted, it has not added "hours" to my tasks. In fact it is barely noticeable. You do not have to have the data stored on your laptop either, you could have remote access to patient records.

For every organization that hires competent IT people at a fair wage, there are 2 that try to scrape the bottom of the barrel by paying an insulting wage and 2 that are clueless and have a nephew or the old secretary running IT. This applies inside healthcare and out. Hospitals usually hire good people in my experience.

This is just not true. I had a three-year-old laptop converted to full hard disk encryption and the change was not noticeable. Most CPUs now have hardware encryption acceleration, and those that don't have it already have fast enough math processors to handle the encryption.
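You can check this yourself on Linux; a quick sketch (output is machine-dependent, and `cryptsetup` may need installing):

```shell
# Check whether the CPU advertises the AES-NI instruction set
grep -m1 -o 'aes' /proc/cpuinfo

# Benchmark the kernel's dm-crypt ciphers (the ones LUKS full-disk
# encryption uses); on AES-NI hardware, aes-xts typically runs at
# multiple GB/s -- far faster than any laptop disk, so the encryption
# layer is not the bottleneck
cryptsetup benchmark
```

If the first command prints `aes`, the disk, not the cipher, is what limits throughput.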

I should mention that in the federal space there are new "data at rest" security requirements and many of the databases in use today are already encrypted on disk.

Encryption is slow. If you have ever worked with healthcare data, there is just so much data that encryption can add hours to your tasks.

Yeah, I work with plenty of healthcare data, and every computer in my organization uses full-disk encryption, plus all our communications channels that handle healthcare data moving to or from the outside world are also encrypted. The encryption doesn't add noticeable time to tasks.

While not free, a much simpler option for the end-user would be to purchase a laptop with drive encryption available out of the box: Windows 7 Ultimate/Enterprise and Mac OS X, respectively. Both can provide end-user support over the phone in the event of needing to recover data (OEM and Apple support). That phone call could make this the most important decision ever made. And to go a step further, you can use an online backup solution such as Mozy and back up to the cloud (both the client connection and the back-end storage reside in an encrypted state).

Now, you may say this is expensive. But the cost of paying the fine is much higher. It's also more expensive to society as a whole when sensitive information gets shat all over the internet. I can't speak for everyone, but I know I don't want my stuff out there.

This is exactly the point. Whoever you are, if you deal with medical data, it must be more expensive for you to mess up than to do things right.

Last time I looked into FIPS 140, it was the case that only certain software versions were validated by NIST

As a standard, it must do this, because it's possible for a version of software to have fatal bugs in it. Like, say, the fatal OpenSSL bug in Debian, where a patch meant to quiet valgrind warnings gutted the random number generator. That would mean that one cannot certify those versions, but ones that were fixed can then be submitted for certification.

And it's possible that TrueCrypt may be certified, but someone makes an error and version +1 now doesn't m

Are there? Last time I looked into FIPS 140, it was the case that only certain software versions were validated by NIST, and none of the validated incarnations were either free-beer or free-libre.

Well, first off, FIPS 140-2 is only specified as part of the requirement for data to be considered "secured" for data in motion under HIPAA (not data at rest, which is where FDE comes into play.)
Second, where FIPS 140-2 is relevant (data in motion

There are exactly zero FIPS 140-2 software encryption products, as this level requires hardware. Even FIPS 140-1 is problematic, as it only applies to the specific software version you certified. Need a security update? Too bad, the certification is gone.

HIPAA doesn't require a FIPS 140-2 validated product; it requires that, for data in motion, the encryption method is consistent with FIPS 140-2, and it specifically includes anything consistent with NIST SPs 800-52, 800-77, and 800-113.
For data at rest -- which is what the issue is here with, e.g., Full Disk Encryption -- FIPS 140-2 isn't even discussed; the requirement is that the method be consistent with NIS

More specific, but not necessarily accurate. FIPS 140-2 is the requirement for data "in motion" (being transmitted via some communication channel.)
The requirements for encryption to be sufficient to not leave the data covered by it "unsecured" under HIPAA are methods consistent with NIST Special Publication 800–111, Guide to Storage Encryption Technologies for End User Devices [nist.gov].

When it comes to technology, they always refer to NIST standards as being tested and compliant. Read NIST special publication 800-111 and its references to the FIPS 140-2 standard at http://csrc.nist.gov/ [nist.gov] (Publications / Special Publica

Yet, HIPAA doesn't mandate the use of any specific technology, at all. FIPS is not mandated for use for HIPAA, the AC is dead wrong.

The HITECH Act, under which the guidance referred to was issued, specifies that the guidance issued under the act controls whether data is considered "secured" or "unsecured"; the various penalties and breach notification requirements in HIPAA apply to breaches of unsecured PHI. So, the guidance specifying particular methods is a mandate as to which methods of securing data

This exactly, much like SarbOx it's mostly a minimum framework for organizations to write their own policies (in fact HIPPA doesn't specify ANY technologies, only policies). Specific auditors might require specific standards in order to make their jobs easier (checkbox auditing) but the law is much more vague. In reality if you put in a good-faith effort to protect patient information and followed your organization's published guidelines it's highly unlikely that you or your organization will be fined unless

This exactly, much like SarbOx it's mostly a minimum framework for organizations to write their own policies (in fact HIPPA doesn't specify ANY technologies, only policies).

First, it's HIPAA, not HIPPA.
Second, the "no technologies, only policies" statement used to be true, but hasn't been really true since the HITECH Act and related guidance/regulation modified the HIPAA Security Rule; there are specific technical requirements for data to be considered "secured". It's not required to actually meet those r

Can you point me to the part of HITECH that requires FIPS certification, because the NIST checklist [nist.gov] still has the standard HIPAA style policy driven directives, not prescribed technical solutions. (section 164.312(a)(2)(iv))

Can you point me to the part of HITECH that requires FIPS certification

It doesn't. What it does (at Section 13402) is require the Secretary of HHS to publish guidance on appropriate methods of securing data, and specifies that PHI not secured by technology consistent with the most-current issued guidance is considered "unsecured", and specifies a number of things that have to be done if "unsecured" PHI is exposed.
The guidance HHS has issued under the HITECH Act requires that encryption methods for data

Thanks, that's the first time I've seen actual guidance on specific technologies as it relates to HIPAA. The lack of guidance on actual implementable solutions was one of the biggest frustrations when the enforcement piece was coming online for us, as recommending specific solutions was considered dangerous territory; it seemed like the law was written in such a manner as to give you enough rope to hang yourself with (or to allow bureaucrats to target anyone they wanted).

It doesn't say it in HIPAA (which is a statute). It says it in the guidance issued by HHS under the HITECH Act, which sets standards for whether data is considered "unsecured" or "secured" under the HIPAA Security Rule (a regulation adopted to implement HIPAA under the regulatory authority granted to HHS by HIPAA).
And the "consistent with FIPS 140-2" is for data in motion, not data at rest, so it doesn't actually apply here; the data at rest st

While I do not know the legal angle, TrueCrypt is effective in so far that any reasonably competent expert will testify to it being so. ROT13 can be broken in a fully automatic way even if you do not know it is ROT13. That disqualifies it from being "effective", again to be demonstrated by expert testimony.
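The "fully automatic" claim is easy to demonstrate: simple letter-frequency scoring recovers a Caesar/ROT13 shift with no prior knowledge of the key. A toy sketch in Python (the frequency table and sample text are illustrative only):

```python
from collections import Counter
import string

# Approximate relative frequencies of common letters in English text
ENGLISH_FREQ = {
    'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7,
    's': 6.3, 'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'u': 2.8,
}

def caesar_shift(text, shift):
    """Shift each letter by `shift` positions, preserving case."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)
    return ''.join(result)

def crack_caesar(ciphertext):
    """Try all 26 shifts; return the one whose output looks most like English."""
    def score(text):
        counts = Counter(c for c in text.lower() if c in string.ascii_lowercase)
        total = sum(counts.values()) or 1
        return sum(counts.get(c, 0) / total * w for c, w in ENGLISH_FREQ.items())
    return max(range(26), key=lambda s: score(caesar_shift(ciphertext, s)))

# "Encrypt" a sample sentence with ROT13, then recover the shift blindly
ciphertext = caesar_shift("patient records must be protected at rest and in motion", 13)
shift = crack_caesar(ciphertext)
print(shift)
print(caesar_shift(ciphertext, shift))
```

No expert testimony required: the scorer never needs to be told the cipher was ROT13, which is exactly why ROT13 fails any "effective encryption" test.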

I doubt HIPAA can require specific encryption. I rather think that they have to show whatever you use is ineffective when you contest the fine. Of course, with "no encryption", they do not have to show an

...what govt penalizers do best: pick on those least capable of defending themselves... in other words go after the low hanging fruit and don't bother with the really hard stuff like rich, for-profit hospitals and clinics that routinely violate HIPAA... because those have armies of high-dollar lawyers who'll make life hard on the govt if they attempt to go after them.

Any large hospital would have fought this out in court and prevailed. Banks, state agencies, the military, doctors and clinics all over the country have data losses all the time, but nobody gets fined. Because they all have insurance and lawyers. But find one little agency, whose patients never live long enough to sue them and who therefore don't need to retain a huge legal staff, and BAM, sue them into the ground.

I fully agree that we need more funding for enforcement of HIPAA violations, however the likelihood of securing such funding now is fairly low, and even if the money could be scrounged up there are other things that need the money more.

Nice rant. Too bad you're mostly wrong. HIPAA actually does manage to get data protection pushed far and wide in an industry that fights tooth and nail against any change. It's hardly perfect but it's not terribly onerous and most of the edge cases and implementation problems have been sorted out.

I'm not sure why they chose to beat up on some rural hospice provider - they've had plenty of chances to hit some big boys and girls, but this will send out a signal that you shouldn't fuck around and skip the simple things. It isn't much of an expense to encrypt laptops. It's not hard to put locks on doors. And HIPAA has made it easier to transfer data back and forth between providers because everyone is working off the same set of rules.

Maybe you should bash your head with your copy of Atlas Shrugged a few more times until things are clearer.

I'm fighting it tooth and nail because I AM part of the industry and have an inside view on what it means. Armchair idiots love looking at the abstracts but never have a clue about the reality of the issues. Data breaches are rare, incidents of actual patient harm as a result are virtually nil and enforcement should be focused on the people perpetrating the harm. If you wonder why health care costs keep rising look in the mirror because support for this sort of unproductive regulation is the primary ca

Banks, State Agencies, Military, Doctors and Clinics all over the country have data losses all the time, but nobody gets fined. Because they all have insurance and lawyers.

Nobody gets fined? Are you kidding? Large organizations get fined all [hhs.gov] the [hhs.gov] time [hhs.gov], often for amounts of money that aren't measured in "K". It is, by the way, the reason that said organizations have insurance. And lawyers. This one is making the news precisely because it's a small organization and a small data breach.

You mean, like the $1 million settlement Massachusetts General made in 2011 for HIPAA violations?

Banks, State Agencies, Military, Doctors and Clinics all over the country have data losses all the time, but
nobody gets fined.

Banks aren't covered by HIPAA. Most doctors and clinics are small-entities, and this case was noted as being the first significant penalty for a small entity under HIPAA. Cignet -- a big insurer -- paid a $4.3 mil

BS - They have already gone after Blue Cross/Blue Shield and many large practices. There have been multi-million dollar settlements. This was a warning shot to smaller providers that they have to keep their patients' data safe too because many are too lazy to do so.

It took years before there were any fines. The BCBS fine of $1.5m was for 1m records. The only message that sends is that it is cheaper to ignore the regulations than to do anything about them. Also, if you're going to lose records then lose big and you get a discount. It cost the hospice over $100 per record and BCBS $1.50. There does appear to be something to the statement that larger agencies have less to worry about.

...what govt penalizers do best: pick on those least capable of defending themselves...

Why, that's a brilliant example of high moral values: why waste the citizens' tax money on those that can do more than defend themselves (like: call someone to just... you know? incidentally mention... they'll deduct the fine from the next electoral donation round)?!

At a university where I work, there is a requirement that any project involving storing personal data must go through several periodic reviews and has to meet some strict requirements - encryption is a must (without it, the project won't even get off the ground). I'd be very surprised if there are no regulations dictating how hospitals must store and protect data.

I read TFA, but I couldn't see whether such requirements are a must for hospices. Did they just go ahead and ignore the requirements? In which case, the fine is too small. Or are there no regulations for healthcare industry (I'd find that very surprising)? Can someone more knowledgeable tell me if this was negligence or outright violation of protocol?

In Washington, many health providers are barely regulated (see the Seattle Times' [seattletimes.com] report Seniors for Sale [nwsource.com]). The state regulator, DSHS, is notoriously incompetent and hasn't been nationally accredited since at least 2001 [seattletimes.com]. I imagine most of the oversight comes from the feds, who are pretty overworked.
Skylar

Every time I see one of these stories I wonder about the same thing. Why is sensitive patient information on a laptop in the first place, and why is that laptop leaving the hospital?

If you are a business executive, I can understand that you would be carrying a laptop which contains emails and other documents. But I cannot think of a single good reason (GOOD REASON) why a hospital's patient information would ever need to be stored on a laptop. Seriously, if you have employees carrying around laptops loaded with patient information, you're doing it wrong.

Low-ranking managers, nurses/doctors/etc. who only make "rounds" every other day or so to see patients, remote coders who stopped in the office for some reason, IT support persons with access to shared drives holding spreadsheets/data containing patient information, etc., etc.

There are a number of reasons why laptops leave facilities.

Yes, there are many reasons why laptops leave facilities and all the ones you cited are perfectly valid, except for one thing. Why is there patient information on the laptop? That makes absolutely no sense.

You come into the facility for your once a week visit, you connect your laptop to the network and you access the patient information. There simply is no legitimate reason for the patient information to be on any laptop, let alone one that is going to leave the facility.

That requires a network connection. Not every home has an Internet connection, and many that do have one do not have easy facilities to connect a visitor's computer to the Internet. And as this is set in the US, I wouldn't consider mobile (3G, 4G data) coverage a given either. So VPN is not an option.

Seriously, get a computer in front of your hospital infrastructure which has Internet access (or a modem) on one side and runs ssh or something. Then you simply log in via your portable computer. Nothing will be cached, nothing will be local; just use your portable computer like you would use any terminal.

That's not rocket science, it already worked in the 1980s, just go and watch "Wargames" and you will even learn about much of the security involved.
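As a sketch of that terminal-style setup, assuming an SSH gateway on the hospital side (the hostnames here are made up for illustration):

```shell
# Log in to a gateway host inside the hospital network; records are
# viewed in the remote session and nothing is written to the laptop
ssh clinician@records-gateway.example.org

# Or forward an internal records web app to the laptop over an
# encrypted tunnel, browsed at https://localhost:8443
ssh -N -L 8443:records-app.internal:443 clinician@records-gateway.example.org
```

Either way the patient data stays on the server; the stolen laptop holds a login prompt, not a database.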

Having worked on many projects involving various levels of government regulation and compliance, and seeing all the different facets of security and what-not, I can state for a fact that a case like this will be looked at like "It was only a $50k fine? This security hardening project is costing us well over $200k and we still might have a breach that would lead to such a fine. Why are we even bothering?"

We had a project that was basically just a fuzzy match for numbers that looked like credit card or social security numbers, deleting them if it found them, just in case they got into a part of the database they shouldn't (like a customer sticking their social security number into their address, and yes, it's happened before). That project cost us $22,000. It ended up being a single line of SQL that ran as part of a service every hour. $50k is laughable. Security breaches like this should nearly bankrupt a company; there is no other way they'll be taken seriously.

I'm involved in 5 different projects right now, each of them billing out at over $100k, and 3 of them revolve around privacy issues and government compliance. The fines issued for such breaches aren't even in our paperwork as a concern. The cost of a breach in regards to public image, however, has a very specific, very large number near the top of the chart. But we're in a business where people are paying attention to such things. These fines should START in the millions, because preventing them costs in the hundreds of thousands of dollars.
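As a rough illustration of the kind of scrub job described above (the original was reportedly one line of SQL; this sketch uses Python with SQLite instead, and the table and column names are made up):

```python
import re
import sqlite3

# Matches SSN-shaped strings: 123-45-6789 or 9 bare digits
SSN_PATTERN = re.compile(r'\b\d{3}-\d{2}-\d{4}\b|\b\d{9}\b')

def scrub_ssn_like(text):
    """Blank out anything that looks like a social security number."""
    return SSN_PATTERN.sub('[REDACTED]', text)

# Toy in-memory database standing in for the real customer table
conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, address TEXT)")
conn.execute("INSERT INTO customers (address) VALUES (?)",
             ("12 Elm St, SSN 123-45-6789, Boise ID",))

# The hourly scrub pass: rewrite any field containing an SSN-shaped value
rows = conn.execute("SELECT id, address FROM customers").fetchall()
for row_id, address in rows:
    cleaned = scrub_ssn_like(address)
    if cleaned != address:
        conn.execute("UPDATE customers SET address = ? WHERE id = ?",
                     (cleaned, row_id))
conn.commit()

print(conn.execute("SELECT address FROM customers").fetchone()[0])
```

The matching logic itself really is trivial; the $22k, as the follow-up comments note, is in the risk discovery and sign-off around it, not the code.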

We don't encrypt laptops. We don't allow sensitive data on laptops... or desktops for that matter. If you want access to that sort of thing, you need to VPN in and log onto a Virtual machine... that virtual machine is then wiped as soon as you log off. We don't have to worry about the user end of the session at all.

1. Recognizing the risk
2. Spending 22k
3. And ending up with 1 line of code for it.

I mean, at what point in that expenditure was that line of code developed? That 1 line of code obviously includes a search string for the databases and a command to delete the matches. How was that not obvious to implement?

You vastly underestimate the size and complexity of our systems. It does not start with "Recognize risk" it starts with "Risk discovery" which is a very complex process. We're talking rooms full of people with very boring flowcharts. If you just wait for risk to "pop up" before you fix it, you've already got a breach.

Perhaps the fine was sized to cause pain to the organization and not kill it. Everyone makes mistakes and there are consequences but those consequences should not be fatal. Now if it happened a second time the fines should be much larger. A third time should bankrupt the company.

If the information in any laptop (or desktop) could be worth tens of thousands in fines we might just see an increase in health care thefts and blackmail. Cheaper to pay to get the laptop back than to pay the fine if the data goes public.

I love all the immediate "encrypt it" comments. Yes, that would be helpful, but the bigger question to ask is:

"Why would such data be copied onto a laptop in the first place?"

We keep hearing stuff like lost laptops and flash drives over and over. The reality is that sensitive data like this shouldn't be on those devices in the first place. One would think it would be accessed only on secure servers through approved clients and methods. Most facilities' HIPAA guidelines specifically forbid copying such information off the servers in the first place (except by I.T. for backup), regardless of whether it is encrypted or not. Seems like employees in the organizations just ignore that.

How do you propose we handle this? If it's a web application, it's reasonable to assume that browser caching would cache certain data on the hard drive. Even "clearing cache" would only delete the headers and not securely delete all of the data. With IE, you can enforce a GPO that tells the browser not to cache data retrieved over HTTPS; but this is assuming that HTTPS is used for internally connected systems (often times they're not), and it assumes the user is using Windows in an Active Directory environme

This is just a case of following the good old, tried and true tax department/RIAA solution. You go after the small, weak, vulnerable targets. The big ones are likely to defend themselves with armies of lawyers and keep your sorry ass in court for the next hundred years.

If there is a definition of cloud computing, it's the abstraction of administration. Managers at a hospice in Idaho are not qualified to make IT decisions about encryption. Even Microsoft's cloud is more secure than what they can put together : ) Combine bio-authentication with a website white list and you eliminate all passive/opportunistic attacks.

Cloud computing needs a good data plan and coverage. Depending on needs and how the cloud is set up (something live-like), you will need a lot more than a 5GB cap, and at, say, $10 a gig after 5GB, overages can add up very fast.

If there is a definition of cloud computing, it's the abstraction of administration. Managers at a hospice in Idaho are not qualified to make IT decisions about encryption. Even Microsoft's cloud is more secure than what they can put together : ) Combine bio-authentication with a website white list and you eliminate all passive/opportunistic attacks.

But until Microsoft, Google, Apple, or any other cloud provider encrypts all information transfer and signs HIPAA agreement forms, nobody can put their information on them. There are such services out there, but AFAIK the easy solutions are not signing those forms, which every management team in a HIPAA-covered enterprise should know.

I am going to assume the hospice is in a similar boat to us, and I will explain how it's not as simple as the wand-wavers above try to make it sound. I'm essentially the brat mentioned above: a small practice with about 7 providers and about 50 machines, probably 50/50 desktops and laptops. We use a shitbox EHR that was shoved down our throats because our old vendor sold the code to the highest bidder to acquire clients. Me and 3,000 other clients are stuck with a "new" shit product, $100,000 in debt, and India to call for "support". We don't have $22k for one line of SQL code.

The EHR requires local users to be admins. Mind blowing. A GPO restriction against saving data locally renders the box useless. No matter how many learning moments, hand slaps, and write-ups you have, users will never understand the difference between My Documents and the shared network drive where stuff is supposed to go. Ironically, doctors are the worst.

I wrote hundreds of pages of HIPAA policy and then tried to figure out how to encrypt and secure 50 XP machines running on aging Dell 2350's/3000's and D510's. The state HIPAA auditor says we need essentially another $100,000 worth of new stuff and encryption. There is zero IT budget. I just yanked all the drives and am PXE-booting Thinstation to a terminal session. In the follow-up, the auditor agreed it satisfies the encryption issue 100%; she had never heard of that or seen it done, but applauded me.

There are thousands of offices just like mine with no budget, already drowning in debt from the non-free software rapists. The number one argument you will get from the business owners is no budget. Dwindling reimbursements coupled with exponentially expensive responsibilities like this article make for a rough combo. I feel bad for the chaps in bumblefuck Idaho. They are probably barely scraping by, then this...
I'd pitch the same solution I used that passed the HIPAA audit to any of these other offices out there you might find who need help but can't afford anything else. Pass it on. /$.02

About time people faced some real consequences for this sort of action. It's a shame (but not unexpected) that they picked on a hospice to make the example, rather than, say, a large corporation, but the principle stands. If you don't encrypt private, confidential data you should be held accountable. No more plain-text passwords in database tables, no more unencrypted personally identifiable information on removable/portable devices (or in database files for that matter). No excuses.

The health care sector loses information all the time. Over the last 15 years, two hospitals have managed to lose 5 MRI tests and 1 EEG test, digital and paper copies. I really don't trust the "security" in place in the health care sector at all.

No it doesn't. For starters: such a fine is a good thing, but it should be payable to the victims of the data breach (as in: the people whose sensitive data was dumped on the street). One way or another, they suffer damage from a data breach, they should be compensated.

Secondly, it won't prevent further breaches like the ones that happen so often these days. Maybe if fines are stiff enough, and handed out often enough, over time it will produce an effect. I wouldn't hold my breath though. When it comes to keeping data private, a new idiot is born every day. Sometimes an idiot in charge, but that's not always necessary.

No it doesn't. For starters: such a fine is a good thing, but it should be payable to the victims of the data breach (as in: the people whose sensitive data was dumped on the street).

You did read the article, right?

Of course not.

Nobody's data was abused. They didn't suffer any damages from the data breach. (You do know what a hospice is, right? You understand that their clients could not possibly care less about a data breach?)

Be that as it may, fines are NEVER payable to individuals. The government simply pockets the money. Nobody is taught any lessons, other than to raise their prices to pay for even more insurance.

Yes, and the next time some Hospice official thinks about not encrypting their data, they're going to remember this event and think better of it.

HIPAA violations are serious. People have likely lost their jobs over this. Even though I'm not in a position to routinely work with patient data, my employer requires that my laptop is encrypted - in the case of my Linux laptop I was able to convince them that using encrypted LVM was sufficient.

I'm happy HIPAA is being enforced. We have already had way too many breaches, either tapes left in unsecured locations, or laptops "going missing".

We already have had a decade of businesses giving security the hind teat, since it is viewed as a cost center, and the belief that "calling Geek Squad" after the fact can fix things. Having it made public that if laws/regs are broken, that fines will be levied might get places to zip their flies.

Encryption of laptops is not hard, especially on the Windows laptops with TPM chips that are the mainstay in business. With any Windows version newer than Vista, BitLocker is very easy to enable on an enterprise level. For most things, just forcing BitLocker via GPO on laptops, even if the user is a full admin, is more than good enough for security.

For laptops without a TPM, Windows 8 and Windows Server 2012 allow for a password to be set before boot.

Almost all new major operating systems have some form of DAR/WDE encryption ready to go. Linux has LUKS, BSD has gbde, AIX has EFS, Solaris has encrypt(1), OS X has FileVault II. Enabling this may not be trivial, but it is doable.
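For the Linux/LUKS case, the basic sequence really is short; a hedged sketch, assuming an empty placeholder partition at /dev/sdb1 (these commands destroy whatever is on that partition):

```shell
# Format the partition as a LUKS container (prompts for a passphrase)
cryptsetup luksFormat /dev/sdb1

# Unlock it, creating the mapping /dev/mapper/phi_data
cryptsetup open /dev/sdb1 phi_data

# Put a filesystem on the unlocked mapping and mount it
mkfs.ext4 /dev/mapper/phi_data
mount /dev/mapper/phi_data /mnt/phi_data
```

Everything written under the mount point is encrypted at rest; lose the laptop and the data is just ciphertext without the passphrase.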

Of course, almost all new backup programs have encryption, usually create/import a key, set a button to encrypt, and let fly. Netbackup has the Media Server Encryption Option, but even better, if one uses LTO-4 or newer media, NBU can just use the tape drive native AES encryption directly.

Okay, so what exactly does TPM have to do with encrypting your data? Okay, sure, it gives you an off-disk location to store a key, but as far as I can tell everything else it does is related to creating a unique encryption key for that machine so that DRM can bind a software license to it. Useless for encryption of data that will move between machines, and it adds no further security against someone stealing a laptop. The only case in which it seems to offer any potential benefits is against someone stea

Windows, particularly Windows 7 Enterprise, requires a TPM chip for the included encryption software, BitLocker, to work out of the box. The previous poster is saying that since business laptops come with TPM anyway, and the Windows editions most large enterprises use come with encryption anyway, the only reason business laptops are not encrypted is that somebody is too lazy to click a button.

In reality, it's a bit more complicated than that. There are many older laptops out there running WinXP, which has no built-in encryption, and

Encrypting laptops is *expensive* in time and effort.
The problem is encrypting the system drive. Without this, our malefactor just tampers with the unencrypted system drive, boots the OS to capture the password, and reads the encrypted data. Or if you do encrypt the system drive, some sort of pre-boot authentication is required. Welcome to a patching and support nightmare. This is why mobile device encryption isn't as widely employed as some would like.

Yeah let's fire the IT guy who suggested all data to be encrypted and not the managers who overruled the IT guy because encryption is annoying.

That pretty much jibes with my experience. I had an experience where I warned my management over and over about a single point of failure in our customer-supporting infrastructure. They didn't want to hear it. Didn't want to fix it. Didn't want to spend any money on it. Then one day it blew up. Cost the company a small fortune in blown SLAs. Of course their knee-jerk reaction was to blame the IT person. The only reason I kept my job was by pulling out the documentation, which I purposely saved, of me telli

(You do know what a Hospice is, right? You understand that their clients could not possibly care less about a data breach?).

I'm sure the thing you want to be dealing with when closing down a loved one's estate is finding out that someone's opened up a bunch of credit cards and gone to town.

Be that as it may, fines are NEVER payable to individuals.

What about the $2.4 billion that the National Fish and Wildlife Foundation received from BP as part of the Deepwater Horizon oil spill? That will have a direct, tangible benefit to the Gulf States.
Skylar

Hospice prices can't just arbitrarily go up for the 99.9% of people who use insurance or Medicaid. They work on prenegotiated rates. They can charge all they want; insurance is only going to pay them what they agreed to.

If I recall correctly from the HIPAA rules, it isn't the hospice that is required to pay the fine; I'm pretty sure it is the employees responsible.

Both covered entities and individuals working in them can face criminal charges for certain HIPAA violations (which can include fines and jail time); there are also civil fines possible against covered entities for a wider range of violations. TFA clearly indicates that this is a fine as part of a negotiated settlement between the government and the hospice, and t

The key word there being "should". Sadly, there are many reasons this doesn't occur.

1. Overloaded tech guy who was told to crank out new laptops ASAP, damn the torpedoes.
2. Tech guy that knows better but was overruled by his boss.
3. Tech guy told them it needed done, but CEO said "screw it, it costs too much and is too complicated for us to learn".
4. Plain ol' oversight.

My money would probably be on 3. "How's 70-year-old Beloved Bill, who's been with our company forever, ever suppose to r

Dude, it's a small nonprofit hospice; it's doubtful they HAVE an IT guy, more likely a consultant they bring in to fix something every few years. I know because I worked consulting in a practice focused largely on SMB medical, and only our largest and/or most profitable customers ever engaged us for anything more than break/fix. I got out just as HIPAA enforcement was coming online, and almost none of our clients was prepared, despite the fact that we had sent along information for several years pointing them to organizations that could help them write their policies (we got nothing directly out of this, though given the state of many of their IT systems they would have needed services to become compliant with legal minimum practices).