At the rate technology has changed everything else in our lives, by now we should have the equivalent of tricorders in our smartphones—instant access to our health statistics collected by sensors in our clothes and pulled into our individual health history in the cloud. We should be able to Skype our physician, text our pharmacist, and get both a blood sugar measurement and an MRI at Starbucks while waiting for a grande latte.

Except for the MRI part, all of that is doable today. Thanks to the big stick provided by the Affordable Care Act in the US, some healthcare organizations are pushing more aggressive use of network bandwidth and cloud technology:

Monitoring patients’ health more proactively with networked devices, ranging from wirelessly networked medicine bottle lids to worn or embedded sensors that report back on vital signs;

Coordinating care with the help of analytic tools in the cloud and a wealth of individual and collective patient data; and

Connecting physicians directly with patients over PCs or mobile devices for between-appointment follow-ups.

Those things can’t be pulled off without cloud technology, whether it’s hosted internally in a health organization’s data center or elsewhere. But ask any random sampling of physicians, technologists, and health industry observers. They’ll tell you technology isn't restraining the next big paradigm shift in health care. The bandwidth is willing.

“It’s less about the technology holding the industry back, and more about the reimbursement model for healthcare,” says Kenneth Kleinberg, senior director of research and insights at The Advisory Board Company, a global healthcare research, technology, and consulting firm. “Quality hasn't been rewarded, physicians don’t have incentives to share data, and patients are freaked out about privacy. Healthcare isn't a system—it’s a bunch of individual entities looking out for themselves. Just adding more bandwidth to a broken system doesn't work.”

Real technological change in health care requires changing the “work culture” of health organizations and people’s confidence in health IT systems, says Harry Kim, senior director of Hewlett-Packard’s healthcare group. But other complex (and heavily regulated) businesses embraced change long ago. Citing ATMs, Kim says, “If we can trust our money to a machine, we can do it with healthcare.”

That’s why health organizations are looking outside of their industry for inspiration. “The companies bringing the biggest changes to medicine today are companies like Cisco, EMC, Apple, and Microsoft,” says Dr. Elliot Fishman, Director of Diagnostic Imaging and Body CT at Johns Hopkins Medical Center. Technology from the consumer sector (such as mobile devices and apps, cloud computing, and even gaming) is seeping into the healthcare field and being seized upon by care providers to improve the connection between physician, patient, and data.

To get an idea of how bandwidth can change medicine, we talked to people on the front lines of medical technology at two of the most well-known hospital systems in the US: Johns Hopkins and the University of Maryland Medical Center. We also caught up with technology and digital health service providers. What we got was a snapshot of organizations that are already working to transform medical care with networked technology, while trying to overcome organizational inertia to make it happen.

Driven by data

The first wave of change that healthcare organizations have dealt with (or are still dealing with) is what Kim calls the “digitization of sick care.” Nearly 80 percent of healthcare is dealing with chronic illness. To improve care for patients with chronic health problems, health providers need to be able to effectively monitor and capture the right data from them, pull it back into electronic medical records, and make it available to both patients and physicians to act upon.

The problem is that many health record systems weren't built to handle those tasks. Healthcare systems have had electronic health records for decades; the problem is the systems lack standardization. These carry with them the sorts of software and schema hangovers that plague every data integration project.

“At Hopkins, it started a long time ago with a longitudinal patient record that pulled in from all our systems,” says Stephanie L. Reel, Vice Provost of Johns Hopkins University, Vice President of Information Services at Johns Hopkins Medicine, and CIO for both the university and hospital. The system acts as a repository for information from all of the hospital system’s various health IT systems.

“But in spite of the fact that I think we've done a good job over the last 25 years, we've now realized we didn't,” Reel says. The effort required to get all of the data normalized from each of the systems was “too expensive, cumbersome, and not always possible.”

So Hopkins is replacing its homegrown system with one from Epic, a hosted system with a single, patient-centric database. Reel says that when it’s implemented, the system will “give each patient control over his or her own records.” Patients finally gain complete access. Since it’s a single integrated system, all of an individual's data is there for each caregiver—their allergies, test results, medications, etc. Epic's portal can even be accessed through mobile apps for Apple iOS and Android devices.

But on top of that, the data will also be used to mine information on how well different courses of care worked for patients. This should help tailor care based on patients’ own conditions and the outcomes of people with similar cases. “You can look at a population base that has benefited from treatment,” Reel said. “We can learn from our own cases, but also, if done appropriately, can learn from interventions elsewhere. This gives us the opportunity to do personalized medicine—based on previous cases, we may be able to predict when a patient will benefit from one type of intervention or another—or, from their genetic makeup, might be able to decide if treatment won’t help.”

BYOD medicine

Physicians aren’t waiting for their central IT departments to achieve the nirvana of centralized healthcare data. They’re finding their own ways to get access to the information they need, when they need it—pushing health providers to build Web portals and other applications that give them access to medical records anywhere. One of the most visible signs of change is the adoption of the iPad and other mobile devices by physicians.

Thanks to more reliable and more widely available wireless bandwidth, the iPad has become an essential tool for clinicians. Last October, the Department of Veterans Affairs moved to open up its network so that doctors could use their own mobile devices. While other health systems have been slow to officially adopt the iPad and other devices, John Kornak, Director of Telehealth at the University of Maryland Medical Center, says, “A BYOD (bring your own device) mentality is starting to take shape among physicians, and more mobile apps are starting to find their way into use.”

Kornak says that there is a strong push from doctors to find mobile apps that make it easier and more seamless for them to connect to health data such as charts and radiology images. “Physicians are telling us if we don’t have [the apps they need], we need to have a development partner and build it ourselves. They're really urging us to not focus on what the standards are—we need to be open to any devices on the market, and keep them in mind when building solutions.”

One of the most obvious applications for the high-resolution screen of the latest iPad is displaying medical imagery. Hopkins’ Dr. Fishman says surgeons now pull up images from CT and MRI scans on their iPads to explain procedures to patients more effectively. “Doctors can look at their cases in real time. Now my clinicians are looking at the information I generate as it’s created. They can pull down CT slices in 2 seconds. It’s very fast and interactive. They can bring the image to the bedside or in the office.”

That mobility and ease of access pays off in another way: time. “When you speak to surgeons at Hopkins,” says Fishman, “they say that they save about an hour of time each day from using the iPad. And that’s a big deal—instead of going home when their kids are asleep, they get home when their kids are awake.” Fishman says he’s been at the beach and on airplanes and has been able to look at radiology images for consults.

That power doesn’t just come from the digitization of raw information, though. It only works, Fishman says, when the networking piece becomes transparent. “The end-user experience has to be that it just happens,” he says, “not typing 20 codes in for access and hoping that it works.”

Sean Gallagher
Sean is Ars Technica's IT and National Security Editor. A former Navy officer, systems administrator, and network systems integrator with 20 years of IT journalism experience, he lives and works in Baltimore, Maryland. Email: sean.gallagher@arstechnica.com / Twitter: @thepacketrat

65 Reader Comments

The current problem with using an iPad for X-ray imaging is that even with the 'retina' display, it doesn't have a high enough pixel density for diagnostic medical imaging. This came up recently as my wife's veterinary clinic is looking at a new dental X-ray system to replace their current film-based one. As a selling point, the company provides viewing software for the iPad. It is for use only as a secondary display, for showing images to clients, but not as a primary diagnostic display.

The other problem with LCD screens is you really need very high contrast ratio to properly imitate an actual x-ray image. The ipad screen is very good but it's no replacement for film or a really high end screen.

In response to Zonk3r, the Retina has the highest pixel density available today. And it is IPS so the colors are more accurate compared to a TN display that would really cause problems looking at an X-Ray since it is all gray scale.

I can quite see this taking off as a grass roots thing... a lot of people will happily take privacy risks if it means having all their medical data collated, up-to-date, and able to be made immediately available to medical staff if that improves their quality of care... and really things like continuous heart/bp monitoring should be pretty simple to achieve by now.

The other problem with LCD screens is you really need very high contrast ratio to properly imitate an actual x-ray image. The ipad screen is very good but it's no replacement for film or a really high end screen.

I would have imagined that some variable 'sliders' in software to adjust contrast filters / zoom could more than compensate for the limited screen contrast/dpi when compared to static film.

Of course that would mean some re-training of the people doing the viewing...

My biggest concern is not so much privacy as it is hacking of the wireless sensors and devices. My perception is that many of the technologies in use were developed with functionality in mind. I would hate for my physician to mis-prescribe my medication because someone intercepted the wireless communication from my device.

I work with a group of surgeons and the larger academic organization. Our health system is also bringing in Epic, which will largely replace the custom-built software we currently enjoy. Having all the information together is a great idea and a milestone which I hope we reach, but the drawback for us here in our specialty is this: we believe we'll have less ability to innovate our own software to fit our clinical interests, and access/control over data is an issue as well.

I can quite see this taking off as a grass roots thing... a lot of people will happily take privacy risks if it means having all their medical data collated, up-to-date, and able to be made immediately available to medical staff if that improves their quality of care... and really things like continuous heart/bp monitoring should be pretty simple to achieve by now.

Sadly society almost always lags technical capability

I'm cool with that since early adopters are quite often beta testers when it comes to technology.

You could have summed up the article with those two quotes. The problem isn't technological, it's cultural (and legal, and sociological, and, er, business-ological).

That's why the Windows phone study and other similar gadget trials are unimpressive. They proved that technology can improve healthcare. But that wasn't really in question. Those trials ignore the real problem: can technology advocates overcome the red tape?

Firstly, that article references the iPad2 (132 dpi) NOT the 3rd iteration iPad at 264 dpi -and its much improved color gamut. At that resolution -and at a viewing distance of around 18", the human eye would be hard pressed to discern any greater detail.
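
The 18-inch claim is easy to sanity-check. Assuming the common rule of thumb that 20/20 vision resolves about 1 arcminute of visual angle (an assumption, not a figure from this thread), a quick back-of-the-envelope calculation gives the pixel density beyond which individual pixels can no longer be distinguished:

```python
import math

def resolvable_ppi(viewing_distance_in, acuity_arcmin=1.0):
    """Pixel density (pixels per inch) at the limit of a viewer's acuity.

    acuity_arcmin ~1.0 corresponds to 20/20 vision; pixels smaller than
    one arcminute of visual angle blur together at the given distance.
    """
    # Physical size of one arcminute of visual angle at this distance
    pixel_size = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / pixel_size

limit = resolvable_ppi(18)  # typical tablet viewing distance
print(f"20/20 resolution limit at 18 inches: ~{limit:.0f} ppi")  # ~191 ppi
```

At roughly 191 ppi, the 3rd-generation iPad's 264 dpi is indeed past the 20/20 limit at that distance, which is consistent with the comment above.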

The other monitors in their product line are even less dense in terms of dpi. And these are being sold as 'professional grade' monitors.

Yep, these monitors rely more on high contrast than high DPI. The hospital where I work uses black & white LCD displays for physicians to view X-ray imaging. It's all electronic, and these displays aren't high-DPI.

Also note the 30-bit color space specs on some of those models. Medical imaging is a very different class than gaming or video watching. For examination of images, large size and high DPI are both essential. To take an example outside of medicine, it is not uncommon to sample high-quality 35mm film at 20-50 megapixels on a drum scanner. Some fields have very different graphics requirements than you would think. Color scanning for professional use is frequently at 36 or 48 bits per pixel because of the need to reproduce color accurately in print. The NTSC color gamut used to measure the iPad screen is only a fraction of the visible spectrum.

As for the arguments that Retina DPI at 18" viewing distance is an end-all be-all of displays, consider the use case. Ever watched a doctor or radiologist looking over images on film or a large monitor? They constantly move their heads closer and farther and pan around. Sure you can emulate this with pinch-to-zoom and panning the screen, but inherently, part of the examiner's mental effort is now distracted with coordinating hand movements to accomplish these tasks. Previously, they were very naturally moving their head.

Now, an iPad, or any tablet for that matter, is fine for explaining images and similar to patients. The flip side of that is HIPAA compliance. While flash storage on an iPad is encrypted by default, a password is not necessarily implemented by every doctor. In the BYOD paradigm, this represents several data-loss vectors. Lost devices, malicious website scripts, and similar could very easily grab at cached images and patient data.

One of my doctors and his entire staff use Mac laptops and carry them around to the rooms. I asked him why they didn't use iPads and he said they tried them out but they just couldn't get enough information onto the small screen, regardless of the DPI of the screen. At some point, the text just gets too small to read and zoom/pan is not a good solution. Plus, data entry was a problem. So, they use laptops.

I do wonder what they will make of something like the Lenovo Yoga. Still, medical is one sector where the existing TabletPC form factor found a home.

...As for the arguments that Retina DPI at 18" viewing distance is an end-all be-all of displays, consider the use case. Ever watch a doctor or radiologist looking over images on film or a large monitor. They constantly move their heads closer and farther and pan around. Sure you can emulate this with pinch-to-zoom...

Emulate, schmulate! With the new iPad, one can take any x-ray at any scanned resolution and view area details by swiping a finger to center and zooming by expanding a thumb and index finger to the limit of the viewer's vision. Try THAT with film -which is typically viewed on a wall viewer at a distance of 3 feet.

Are you saying Medical Professionals prefer to be "constantly zooming their heads closer and farther and panning around?" Ah, I don't think so. x-rays are static media. They have no choice.

With an iPad, they DO have a choice: They can either slap static film on a wall viewer -or be able to zoom and rotate images to their hearts' content AND carry multiple stored images in a small package under their arm to the patient's room for a consultation. (Or view them remotely at the bedside from MRI or X-ray storage media via WiFi.)

A lot of the arguments are relatively specious. Mammograms (for example) require high pixel density, high contrast ratio, 10 bits/pixel grayscale, linear calibration and so on - but mammography isn't all of radiology. A CT and a high-resolution MR are typically 512x512 in-plane. PET is usually 128x128. You don't need a 'retina' display for that. Moreover, high-resolution radiology is useful for looking for microcalcifications, but you're not looking for microcalcifications if you're looking for a femoral blockage on a CTA. There's a lot more to radiology than planar X-ray.

The iPad is not going to replace dedicated primary reading stations, but it will sit nicely in a niche, just like the Sectra visualization table or the various Kinect experiments for intervention. You don't need a 3MP Barco to show a patient their broken bone or to ask a colleague for an opinion on a finding. You certainly don't need one to read a patient's EMR.

As for HIPAA compliance: a whole-body CT scan can run you 200MB. You're not going to be keeping the data on the iPad. You will be logging into a hospital teleradiology system, which will require authentication and use secured connections.

Not to step into it, but I think the whole discussion regarding iPads, resolution and dpi is a bit off. While as a physician, I certainly love the iPad-ification of health information as a concept (although I own a Transformer Prime secondary to the keyboard), it will require a massive disruption (in a + sense) of healthcare to effect this change. Realize that due to reimbursement models there is minimal to no incentive to proceed with this, and in the short term it only adds costs. I work at a Harvard-affiliated hospital, and let me tell you, the main topic of conversation is all about cutting costs, not making new capital investments regardless of promised benefits.

Long term, the tech industry, a few physicians, scientists and most importantly massive numbers of patients (consumers?) have to team up and provide incentive from the outside, preferably through patient advocacy and "voting with their feet".

Certainly the healthcare industry, at 2.5 trillion dollars/yr in the US alone, is a fantastic candidate for disruption, efficiency, the creation of novel startups and, most importantly, innovation platforms. The main challenges are 1) the patient-payor-hospital-physician model - which is broken, 2) resistance to change by institutions, and hell, I'll say it, physicians (who, at 60-80h work weeks, desperately desire to constrain new things they have to learn and worry about), 3) gov't/regulation (HIPAA, consumer healthcare information models ~ just look at the fight over sequencing), 4) insurance companies (their very existence is predicated on excessive intermediation and the lockout of physicians, hospitals, and patients from each other).

My final (minor) point is that the discussion about the iPads is actually completely irrelevant in the bigger picture. I know the tech perspective is that a gadget will solve all problems, but the issues here are platforms and regulation, not whether iPads are appropriate for radiology image assessment. We need all sorts of screens (from iPads for individual docs & patients to 50inch radiology monitors capable of showing 4-6 megapixels), but that's all a sideshow compared to the systemic chaos currently reigning in medicine.

Not all work done with digital radiography is diagnostic; there are plenty of workflows that can use lower-quality screens.

I have seen some amazing new zero footprint AJAX and HTML5 viewers from the likes of AGFA and J4CARE that allow practically the full range of PACS workstation like DICOM viewing capabilities on a browser including iPads and Android devices.

Having worked with medical records before (as a records tech), an iPad (or other high-res, high-contrast tablet) would definitely help in some areas, but might be disastrous in others.

Doctors are notorious for incomplete and unreadable records. Typing out commands and symbols will slow down a doctor too. Thus, readability (if a stylus is used) won't change, and a tablet won't guarantee any changes in the completeness of records (or whether a record is completed on time).

Access to records is problematic. In most jurisdictions, any number of parties can have access, outside of the doctor and the patient. The police may request access to the records, the x-ray tech will need access, the dietitian, the social worker... the list goes on. Either everything gets networked to servers (which introduces unlawful access concerns), or everything gets saved to the tablet and thus, nothing can be entered or accessed by other parties should the tablet be in the hands of any one party.

The records tech/management will have to have access to the files at some point, if only for storage. Currently, files are stored in folders and labelled with bright stickers with numbers (terminal digit filing, paired with serial numbering). How will tablets be stored? There's no way a tablet is going home with the doctor, nor is there any way a tablet will be left around.

Archiving records will be easier though. Digitized records are currently and typically scanned and stored on optical drives (laserdiscs, essentially). Tablets should be able to do this easier, as all of the files are digital. The problem is connection. Can tablets connect to that server, in a secure manner? What will be the policy for connection, or access?

This is where solutions like XDSi and PACS/VNAs with zero-footprint WADO viewers come in: your images are centrally indexed and can be requested and stored on a VNA to be accessible from any recent-generation web-enabled device, with access audited and controlled, and the DICOM tags can be respected by the VNA to allow access control.

DICOM encapsulation and direct storage of other image and document types by these systems helps with non radiology images.

The zero footprint viewers ensure the files never end up on the endpoint.
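
For context on how a zero-footprint viewer stays off the endpoint: it typically fetches each rendered image with a plain HTTP GET against the archive's WADO interface (DICOM PS3.18), so only pixels arrive at the browser. A minimal sketch of building such a request, where the endpoint host and the UIDs are made-up placeholders:

```python
from urllib.parse import urlencode

# Hypothetical PACS gateway -- a placeholder, not a real server.
WADO_ENDPOINT = "https://pacs.example.org/wado"

def wado_uri(study_uid, series_uid, object_uid, content_type="image/jpeg"):
    """Build a WADO-URI request (DICOM PS3.18) for one stored object.

    Pass content_type="application/dicom" to fetch the raw DICOM object
    instead of a server-rendered image.
    """
    params = {
        "requestType": "WADO",
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": content_type,
    }
    return WADO_ENDPOINT + "?" + urlencode(params)

# Placeholder UIDs for illustration only
url = wado_uri("1.2.840.99.1", "1.2.840.99.1.2", "1.2.840.99.1.2.3")
print(url)
```

The viewer renders what comes back over HTTPS and persists nothing locally, which is the property the comment above is describing.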

That's a change of mindset. You're in charge of your own data. Not an array of medical specialists, not legislators, not financially oriented organisations... You get all the data (if you choose), you get to store it (the way you choose), you get to understand it (as much or as little as you choose), you get to release it to medical specialists (as much as you choose)...

If your prevention is working well, there may in fact not be a lot of data to store.

There are a bunch of storage solutions out there. Amazon, Azure, Rackspace... and it's not hard to use. Getting easier. Storage can be sorted out and it's already feasible to do it cheaply, especially if your account is directly with the root provider.

Finding a way to get that data is a mixed bag. I find that sometimes it's dead easy, other times it's hard. If that mindset is changed, those who want to can take charge of their own lives to a greater degree. (You might not get all the doctor's personal notes about you, that's fair.)

Those who don't want that much control, should also have their options of course.

Viewing your PACS images on the iPad is fantastic. Logging in and using Epic can be a soul-crushing, miserable experience. It is common to take a considerable amount of time logging in and waiting for pages to load. Lag even with spectacular ping times and upload speeds. It often feels like running recent versions of programs on Windows XP with 125MB of RAM. Feature requests can take years, even something as simple as an accurate IO or graph for vitals. Unfortunately, Epic is one of the better systems. I think there are some vast opportunities if anyone can break down the barriers with a more customization-friendly and better-curated system.

Emulate, schmulate! With the new iPad, one can take any x-ray at any scanned resolution and view area details by swiping a finger to center and zooming by expanding a thumb and index finger to the limit of the viewer's vision. Try THAT with film -which is typically viewed on a wall viewer at a distance of 3 feet.

Are you saying Medical Professionals prefer to be "constantly zooming their heads closer and farther and panning around?" Ah, I don't think so. x-rays are static media. They have no choice.

With an iPad, they DO have a choice: They can either slap static film on a wall viewer -or be able to zoom and rotate images to their hearts' content AND carry multiple stored images in a small package under their arm to the patient's room for a consultation. (Or view them remotely at the bedside from MRI or X-ray storage media via WiFi.)

There are specific ergonomic reasons those x-ray light boxes are at head height and mounted vertically in most instances. If you can't wrap your head around that, then you probably shouldn't be commenting. Holding an iPad in front of your face or tilting your head down to look at it has negative consequences for the amount of mental focus placed on viewing the subject matter, just as throwing your fingers around does.

roman wrote:

aaronb1138 wrote:

..... The NTSC color gamut used to measure the iPad screen is only a fraction of the visible spectrum.

NTSC color gamut??? iPads have nothing to do with analog televisions. They're RGB devices from the start.

NTSC doesn't imply analog televisions when it comes to color gamuts. Apple touted the new iPad (3) as having 44% more color gamut than the iPad 2. That percentage is based on the NTSC color gamut. It displays 110% of NTSC, 94% of sRGB, and 64% of Adobe RGB color spaces. In sRGB, the iPad (3) only has about 26% more color gamut because it is a different scale.
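
The "different scale" point can be made concrete with the figures quoted above: backing the iPad 2's coverage out of the two "percent more" claims shows the same pair of panels yields 44% or 26% purely depending on which reference gamut the coverage is divided by. A sketch using only the numbers from this thread:

```python
# Coverage figures quoted above for the 3rd-gen iPad
ipad3_ntsc = 110.0  # percent of NTSC gamut
ipad3_srgb = 94.0   # percent of sRGB gamut

# Back out the iPad 2's coverage from the "44% more" (NTSC terms)
# and "~26% more" (sRGB terms) claims for the same hardware:
ipad2_ntsc = ipad3_ntsc / 1.44  # ~76% of NTSC
ipad2_srgb = ipad3_srgb / 1.26  # ~75% of sRGB

print(f"iPad 2 coverage: ~{ipad2_ntsc:.0f}% NTSC, ~{ipad2_srgb:.0f}% sRGB")
```

Both references put the older panel around three-quarters coverage; the headline "improvement" shrinks from 44% to 26% only because sRGB is a smaller reference gamut that the new panel nearly fills.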

If you want to talk brightness and contrast production, the Asus Transformer Prime vastly beats the iPad as a medical image reproduction device. To understand, it is like light pollution and star gazing. In a metropolitan area, you can see what, maybe a couple hundred stars (stars and galaxies that is) under the best conditions. You go to the middle of the New Mexico desert, and you can distinguish 1,000s of stars. That is the need of contrast and brightness in medical imaging.

I don't know how to more clearly explain to people that they don't necessarily have a clue what they are talking about. Next time, Google first, then talk.

I've been working in the PACS/Medical IT industry for the last ten years. One thing that I feel is missing from the discussion here are the options in the DICOM standard for displays.

GSPS, or gray scale presentation state and GSDF or gray scale display function. These are additions to the DICOM standard to provide a consistent level of image quality and a way to accurately measure and manage the accuracy of the display. This is especially critical for mammography and FDA certifications.

To be honest, if I knew my radiologist was reading my images on an iPad, I would want a second opinion. It's fine for a wet read, but I don't want my treatment based on the impression they get from an iPad.

BTW, the PACS company I work for is actively developing a viewer for the iPad.

I don't think many people are aware just how important the gray scales are. It may be a software limitation but you simply can not see some problems on a regular monitor because of it. I have had some luck showing people problems on an iPad that were simply not visible on a regular monitor such as bleeds with diffuse axonal injury.

Might I suggest marketing a regular screen with a full grayscale screen to improve workflow? The full screens are useless for charting.

We need nationally standardized charting more than iPads. We also need Doctors to frigging type rather than write. You want to talk improving healthcare? Implement these things first, worry about iPads later.

My biggest concern is not so much privacy as it is hacking of the wireless sensors and devices. My perception is that many of the technologies in use were developed with functionality in mind. I would hate for my physician to mis-prescribe my medication because someone intercepted the wireless communication from my device.

Eh... you're more likely to have the wrong drug (or dosage) prescribed, or the wrong directions associated with the right drug when it comes to electronic prescribing. Seriously, this is sort of like worrying about dying by lightning strike when you're morbidly obese.

It should be mentioned that the FDA has in fact certified the iPad for "mobile" diagnostic reading. A workstation is still the preferred method, but after putting the iPad through some rigorous testing, it does pass muster, at least for CT, MRI, and Nuc Med/PET. Mammography and "plain films" appear to be excluded here, but then those do require not only a very high pixel density to more closely mimic film, but also a wider gray scale than the typical monitor can accommodate. I don't recall the exact figures, but I think the 10 bits/pixel mentioned by another commenter seems right. As opposed to 8 bits/pixel that typical computer monitors support (and I can only assume that the iPad is "typical" in this manner), the higher scale allows for thousands more shades of gray to be seen, leading to greater potential for seeing subtle details that would be missed on a "typical" monitor.
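
The grayscale arithmetic behind that comment is worth spelling out: 10 bits/pixel gives 1,024 gray levels versus 256 at 8 bits, so "thousands more shades" really only appears at the 12-bit depths that DICOM pipelines also support. A quick check:

```python
# Gray levels available at the bit depths mentioned in this thread
shades = {bits: 2**bits for bits in (8, 10, 12)}
print(shades)  # {8: 256, 10: 1024, 12: 4096}

# Going from a typical 8-bit consumer panel to a 10-bit medical
# display adds 768 levels; the jump into the thousands comes at 12 bits.
extra_10bit = shades[10] - shades[8]   # 768
extra_12bit = shades[12] - shades[8]   # 3840
```

Either way, the point stands that a display limited to 8 bits per channel discards a large fraction of the dynamic range a dedicated grayscale monitor can show.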

tl;dr, but my wife did, and she's a pediatrician. They just switched to Epic from another electronic health records system, and dozens of professionals (nurses, mostly) have been working extra hours TYPING patient records in from the old system, using two computers. This is apparently the recommended way of doing this, per Epic.

My wife's comment is that Epic "takes the art out of medicine." I believe most visit types (e.g. "two-year-old well child check") have a template in Epic which the doctor just runs down, like a list of checkboxes. If they don't work straight off the in-room screen, they'll be doing hours of work later filling in the template, so many doctors (I suspect) just run through the list rather than having an engaged conversation with the patient (or parent). My wife is fighting this, and she's one of the ones taking her family time to "do charts" so that her patient care doesn't suffer.

Epic IS really good for administrators. There's all kinds of data and charts and metrics you can get (as noted in the article). The health care administrators can measure the "performance" of the doctors in many ways, none of which have anything to do with how compassionate, alert, successful (etc.), they are, or how healthy their patients are.

My wife commented that the ER mobile app (on page 2) was really cool. Often you'll see your doctor, then they'll disappear for an hour or more. This app would let you see that they're doing something other than having coffee with the nurses. Also, especially in teaching hospitals, there are always lots of extra people (students, residents, etc.) and the app identifying them with pictures was neat, especially since it looks like it will update as employees come on and off shift.

Finally, she commented on the telemedicine and how it would be great to have increased access to specialists, especially as she works in a rural office.

I'm glad she is embracing the checklists even if she feels like some of the social feedback side is lost in practice. Every medical study of flowchart and checklist use shows very significant improvements in care, diagnostic time, and accuracy.

The NTSC color gamut used to measure the iPad screen is only a fraction of the visible spectrum.

NTSC color gamut??? iPads have nothing to do with analog televisions. They're RGB devices from the start.

NTSC doesn't imply analog televisions when it comes to color gamuts. Apple touted the new iPad (3) as having 44% more color gamut than the iPad 2. That percentage is based on the NTSC color gamut. It displays 110% of NTSC, 94% of sRGB, and 64% of Adobe RGB color spaces. In sRGB, the iPad (3) only has about 26% more color gamut because it is a different scale.

If you want to talk brightness and contrast production, the Asus Transformer Prime vastly beats the iPad as a medical image reproduction device. To understand, it is like light pollution and star gazing. In a metropolitan area, you can see what, maybe a couple hundred stars (stars and galaxies, that is) under the best conditions. You go to the middle of the New Mexico desert, and you can distinguish thousands of stars. That is why contrast and brightness matter in medical imaging.

I don't know how to more clearly explain to people that they don't necessarily have a clue what they are talking about. Next time, Google first, then talk.

Oh, you're talking about absolute display brightness, which is pretty meaningless since the human eye is pretty good at adjusting to the dynamic range of any display device, especially when you turn the lights off in the room, which is something already done for film x-rays.

Not to mention that both the ASUS and the iPad are 8-bits-per-color devices, so there's already a scale limitation (especially for grayscale) when it comes to luminosity range. But at least it's one that the iPad can tackle in software, since it can easily simulate higher bit color by applying slight dithering techniques on Retina-class displays, which at such resolution would be virtually unnoticeable.
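For the curious, here's roughly what that dithering trick looks like. This is a toy sketch using a 2x2 Bayer matrix to render a 10-bit gray value on an 8-bit panel; real implementations use larger matrices or error diffusion, and the function names are mine:

```python
# 2x2 Bayer ordered dithering: spatially vary the rounding direction so
# that neighboring pixels average out to the intended 10-bit gray level.
BAYER_2X2 = [[0, 2],
             [3, 1]]

def dither_10bit_to_8bit(gray10: int, width: int, height: int):
    """Fill a width x height region with 8-bit values approximating gray10."""
    base, remainder = gray10 >> 2, gray10 & 3
    return [[min(255, base + (1 if remainder > BAYER_2X2[y % 2][x % 2] else 0))
             for x in range(width)]
            for y in range(height)]

# 10-bit value 513 has no exact 8-bit equivalent (513 / 4 = 128.25),
# but a 2x2 tile of dithered pixels averages to exactly that:
tile = dither_10bit_to_8bit(513, 2, 2)
print(sum(sum(row) for row in tile) / 4)  # 128.25
```

At Retina-class pixel densities the eye does the averaging for you, which is why the pattern itself is essentially invisible.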

But regardless of the ASUS's abilities, I highly doubt the medical diagnostic industry is going to spend millions of dollars of engineering effort on an Android tablet app... In fact, I could see them developing a Win8 Metro app before thinking about Android, but that's an aside that has nothing to do with the hardware and more to do with platform establishment and perceived future relevance.

Is no one concerned about the privacy and security of all this information? There needs to be real economic and possibly legal incentives to take security seriously or it won't be long till we see massive leaks of medical information.

The thing is, though, I can't see any real *need* to report a plain x-ray on an iPad, so I can't see the business case for going through the effort of making an FDA-approved iPad for diagnostic reporting of plain x-ray. The reality is, if a study needs to be looked at "mobile", it means it's most likely an urgent ER patient and there is no radiologist on site, and it will almost always be a CT. If they're taking a plain x-ray, they're looking for something very obvious, obvious enough that the ER physician will be able to see it in the hospital without requiring a radiologist's look. If they're suspecting something like an abdominal aortic aneurysm, they'll be doing a CT, not a plain. Perhaps the situation is different in other countries, but I've never even heard of radiologists looking at plain x-ray for "on call". On-call work is pretty much all CT, with the odd ultrasound here and there, both of which are perfectly fine to look at on an iPad.

I would trust the cloud more than a human doctor when it comes to managing dosages. Is a human doctor who has hundreds of patients really more reliable than a database? Nothing beats a database.

Having worked with medical records before (as a records tech), an iPad (or other high-res, high-contrast tablet) would definitely help in some areas, but might be disastrous in others.

Doctors are notorious for incomplete and unreadable records. Typing out commands and symbols will slow down a doctor too. Thus, readability (if a stylus is used) won't change, and a tablet won't guarantee any changes in the completeness of records (or whether a record is completed on time).

Access to records is problematic. In most jurisdictions, any number of parties can have access, outside of the doctor and the patient. The police may request access to the records, the x-ray tech will need access, the dietitian, the social worker... the list goes on. Either everything gets networked to servers (which introduces unlawful access concerns), or everything gets saved to the tablet and thus, nothing can be entered or accessed by other parties should the tablet be in the hands of any one party.

The records tech/management will have to have access to the files at some point, if only for storage. Currently, files are stored in folders and labelled with bright stickers bearing numbers (terminal digit filing, paired with serial numbering). How will tablets be stored? There's no way a tablet is going home with the doctor, nor is there any way a tablet will be left lying around.
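For anyone unfamiliar with terminal digit filing: a record number is read in two-digit groups and filed by the *last* group first, which spreads new charts evenly across the shelves. A quick sketch in Python (the six-digit format is an assumption; numbering schemes vary between facilities):

```python
# Terminal digit filing: "123456" is read as groups 12-34-56 and filed
# by the terminal group (56) first, then the middle (34), then the first (12).
def terminal_digit_key(record_number: str):
    n = record_number.zfill(6)
    primary, secondary, tertiary = n[0:2], n[2:4], n[4:6]
    return (tertiary, secondary, primary)

records = ["123456", "999956", "000157", "123455"]
print(sorted(records, key=terminal_digit_key))
# ['123455', '123456', '999956', '000157']
```

Note that 999956 files right next to 123456 (same terminal group) even though their serial numbers are far apart; that's the point of the scheme.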

Archiving records will be easier, though. Digitized records are currently and typically scanned and stored on optical discs (laserdiscs, essentially). Tablets should be able to do this more easily, as all of the files are digital. The problem is connection. Can tablets connect to that server in a secure manner? What will be the policy for connection, or access?

All data can be encrypted and stored in the 'cloud', nothing need remain on any individual tablet. Tablets and other access points can be set up with biometric (or other) security procedures. The problems you describe are all policy and security related, quite solvable even with today's technology.
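As a sketch of the encrypt-before-upload part: any symmetric cipher does the job, and nothing readable ever sits on the tablet. The toy example below uses a SHA-256 counter-mode keystream purely for illustration (stdlib only); a real system would use a vetted authenticated cipher like AES-GCM, and the key-derivation input is hypothetical:

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom byte stream from key + nonce + counter."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)  # fresh nonce per record; stored with ciphertext
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))

key = hashlib.sha256(b"hypothetical tablet enrollment secret").digest()
record = b"patient 12-34-56: two-year well child check, normal"
blob = encrypt(key, record)
assert decrypt(key, blob) == record  # round-trips
assert record not in blob            # ciphertext reveals nothing readable
```

The hard problems are exactly the ones you'd expect: key management, access policy, and audit logging, not the cipher itself.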

Sure, but you're missing the big picture. Once the data is uploaded to the cloud, it can be analyzed by anyone (the radiologist could be in New Delhi, or Hong Kong), including artificial intelligence programs. This is the beauty of the 'network effect'. The power and efficiency of the system strengthens the larger it becomes. This will solve the coming healthcare crisis in the western world.