In a post two weeks ago, we were critical of some aspects of the JASON Task Force’s (JTF) Final Report on healthcare interoperability. Two members of the JTF reached out to us in order to clarify the intent of the report as it relates to EHRs and the use of a “public” API to help make healthcare applications more interoperable. During a long conversation, we had a chance to discuss the issues in detail. Following that discussion we took some time to reconsider our opinions.

We now have to agree that the JTF itself was not EHR-vendor dominated and have corrected the previous post. The Task Force comprised a wide range of stakeholders, including several providers. Unfortunately, however, the testimony the JTF received came overwhelmingly from HIT vendors, consultants, or their proxies. We doubt this was intentional; more likely the vendor community simply had a more vested interest in influencing the JTF. But it does lead us to the conclusion that it is incumbent upon the JTF to proactively solicit provider testimony before policymakers act on the recommendations of its report.

Despite our long conversation with these members of the JTF, we still have a difference of opinion on one key issue: The central importance of the EHR with regards to public APIs and interoperability.

The original JASON Report pointed squarely at EHRs as the source of interoperability ills. It also called for EHRs to adopt the public API. By our count, the JTF Final Report uses three different terms to describe where the public API should sit: a “Data Sharing Network,” “CEHRT,” or “clinical and financial systems.” In our follow-up discussion, JTF representatives maintained that the intent to include EHRs is clear and that the task force struggled with the question of how broad its mandate should be.

The JTF decided to cast a broader net than just the JASON Report’s initial focus on EHRs. But they did not clarify an already complicated issue, nor did they unequivocally single out EHRs as where the need for a public API should begin. We think that their intention to include EHRs is sincere but maintain the position that the JTF should explicitly recommend that EHRs expose services and data with the public API. Without such clarity, the fuzzy language used in the JTF report could end up being adopted in future rule-making or legislation, creating the potential for uncertain outcomes.

Our Thesis:
Good, bad, or otherwise, the EHR is the dominant application supporting clinical workflows and the source of most patient healthcare data.

Every provider we have ever talked to says that improved patient care and more effective care coordination would be possible with better access to other providers’ EHRs. On the other hand, we have not talked to many providers who say that better patient care and better care coordination would be possible if only there was better access to other providers’ financial systems. The majority of providers have never heard of a Data Sharing Network (and no, we do not believe Direct can fit that bill), so the public API is pretty much dead in the water there as well – though most any HIE/CNM vendor worth their salt would welcome a public API.

So let’s be perfectly clear in the JTF report – if we want EHRs to adopt a public API, then let’s just say so rather than beating around the bush. To do otherwise sends the wrong message to the market – that EHRs are somehow not central to the interoperability problem.

The JASON Report created quite a fuss in the HIT marketplace: some cried foul, while others were encouraged that maybe, just maybe, the report would force movement toward more open systems. To clear the air, the JASON Task Force (JTF) was formed to solicit industry feedback for policymakers. The JTF released its findings earlier this month.

The JASON Report was an AHRQ- and HHS-sponsored study of healthcare interoperability issues. Its basic conclusion was that the existing EHR-based HIT infrastructure should be superseded by something more open and amenable to use by other applications and across organizations. The JASON Report advocated radical solutions to the interoperability crisis: using MU3 to replace existing EHRs and requiring a uniform set of APIs for EHRs across the industry.

Vendor response was rapid and unified. HITPC appointed a task force representing stakeholders from across the industry (virtually all of whom have served on other ONC workgroups, so a somewhat cloistered group) that worked with alacrity through the summer. The tone of vendor testimony before the JTF reflected a level of alarm that contrasts sharply with HCOs’ non-participation.

The JTF and its vendor members have some legitimate beefs: the JASON Report is not exactly disinterested. It substantially reflects the view of the clinical research community, which sees itself as the long-suffering victim of EHR intransigence. The JASON Report glosses over genuine, if crepuscular, progress in healthcare interoperability. Another point that we believe has not been made forcefully enough by EHR vendors is that they are constrained by their HCO customers’ ability to change. The organizational obstacles to healthcare data liquidity are significant, and EHR vendors move only as fast as HCOs, despite claims to the contrary. However, we think the JTF is wrong to deflect attention away from EHR-oriented APIs.

The JTF’s proposed alternative to EHR supersession involves something it calls Data Sharing Networks (DSNs). These are a rebranding of the HIE, supplemented with a uniform set of APIs to support access to something never specified in much detail. The JTF suggests that these APIs be based on FHIR, HL7’s emerging successor to its older messaging standards.
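For readers who have not seen one, it may help to show what a FHIR-based “uniform API” means in practice. The sketch below is ours, not the JTF’s: it parses a minimal, hand-written FHIR Patient resource (the field values are invented for illustration) to show that FHIR defines one JSON shape that every conforming system shares, regardless of vendor.

```python
import json

# A minimal FHIR "Patient" resource, roughly as a server exposing a
# public FHIR API would return it. Values here are invented examples.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Doe", "given": ["Jane"]}],
  "birthDate": "1970-01-01"
}
"""

patient = json.loads(patient_json)

# Because FHIR standardizes the resource structure, any conforming
# client extracts the same fields the same way, whatever EHR produced them.
assert patient["resourceType"] == "Patient"
name = patient["name"][0]
print(f'{name["given"][0]} {name["family"]}, born {patient["birthDate"]}')
```

In a real deployment the JSON would arrive over HTTP (FHIR defines a read interaction of the form `GET [base]/Patient/[id]`); the point is only that the standard, not the vendor, determines how the data is read.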

Without doubt, FHIR represents a significant improvement over HL7 along multiple dimensions. But the idea that FHIR alone can cure the interoperability ills of healthcare is all smoke. Behind this smokescreen, EHR vendors are hoping that people eventually lose interest or stop talking about interoperability. With this bit of redirection, JTF has basically let the EHR vendors off the hook.

This raises the question: Where is the best place for a uniform set of APIs to reside, the DSN (HIE) or the EHR?

Our answer: Both!

The HIE is really a stopgap measure in the sense that discrete access to EHRs and other data sources across organizations, via a uniform set of APIs and SOA, will greatly reduce the need for an HIE. If applications could access all of a patient’s data directly from native data sources in different HCOs, there would be little point in maintaining separate, comprehensive CDRs at different sites across the healthcare system.

But rather than move in this direction, the JTF favors the politically powerful EHR vendors at the expense of the HIE vendor community.

No doubt creating a set of uniform APIs to EHRs would be costly. Upward and backward compatibility, a hallmark of every successful IT platform, requires deeper pockets than most EHR vendors can muster. But some EHR vendors are better positioned to support such APIs than others. Many hospital EHR vendors could make the investment. Smaller, community-focused, or client-server-based EHR vendors and their customers, though, would struggle.

Our HIE research has shown – year after year – that data flows downhill from hospital to community. Hospital-based EHR data is valuable to community-based clinicians. It is also extremely valuable to those hospitals to ensure that physicians in a community get discharge summaries to minimize readmissions and associated penalties. Hospital-based EHRs are a good place to start with uniform APIs. The reality is that community-based EHR data could also be better used in hospital settings to facilitate care. This is especially true as we move away from fee-for-service to more risk-based payment models.

Unfortunately, facilitating data flows between community and hospitals is something we’ll be patching together with string, baling wire and duct tape for the duration. The JASON Report and the subsequent JTF report have not moved the ball forward on this issue. It is our opinion that there is little the policy folks in Washington, D.C., can do with additional prescriptive meaningful use requirements. HHS would better serve the market by using financial incentives that encourage healthcare organizations to demand better interoperability capabilities from their vendors, as it is the customer that vendors really listen to, not D.C. policy wonks.

By Anna McCollister-Slipp
Anna McCollister-Slipp is co-founder of Galileo Analytics, a visual data exploration and advanced data analytics company focused on democratizing access to and understanding of complex health data. She is a member of the judging panel for the $10M Qualcomm Tricorder XPRIZE.

The digital health revolution has failed… so far. The industry that has grown up around it — to cheer it on and promote its potential — is thriving. But while those who organize conferences, found coalitions and work as consultants gain acclaim, write books and give TED talks, patients and physicians wait for the promise of the digital health revolution to become a reality.

We’re tired of waiting.

For those of us with chronic disease, a digital health revolution is the best chance we have. We need it to succeed. We’re desperate for innovation that works. We have experienced tremendous developments and intuitively grasp the potential, but when we peruse the app store and download a few, their usefulness rates as “meh” at best.

We stare longingly at Apple’s new Health app on our iPhones, only to discover it can’t access our data. So back we go to tracking our information on multi-page printouts, or Post-It notes, or in our heads. We receive our lab results via fax, phone, in the mail, or if our doctors are willing to take the risk — via email. We see news stories detailing the government’s investment in the digitization of health and are awed that so much money and discussion can produce such limited results.

In the past five years, we have committed $33 billion in taxpayer dollars to digitize our nation’s health care data. The need was unquestionable, and the potential gains are tremendous. However, the system that has emerged has essentially replicated — in digital form — the acute care-focused health system that has been failing us for decades as we grapple with the growth in chronic disease.

Few of the hospitals receiving government incentive payments to install digital health tools are willing or able to incorporate the data generated by a patient’s personal medical device into that patient’s electronic health records, even for data-intensive diseases like diabetes.

At the same time, according to StartUp Health, since 2010, we have invested nearly $13 billion in mobile and digital health ventures aimed at building apps to promote health. But those who could most benefit from these new tools — those with chronic disease — aren’t using them. In fact, most of the “health” apps available to date are for those who are healthy.

A 2013 IMS Institute study showed that of the nearly 44,000 “health” apps in the app store, fewer than half were legitimately related to health. Of those that were, most focused on prevention or wellness, with fewer than 2,000 aimed at individuals with a diagnosis. And of those that were downloaded, few were used regularly. A separate study by Research2Guidance, which looked at diabetes apps, was even more damning. Despite numerous surveys citing diabetes as the ultimate example of a disease that would benefit from mobile health, only 1.2 percent of diabetes patients with smartphones use digital health apps, largely because of the need to manually input data.

It’s not that we aren’t tracking our information — we are. A recent Pew study shows that while most Americans living with chronic illness track certain health metrics related to their disease, 41 percent use pencil and paper, while 43 percent say they track things “in their heads.” (Both of which tend to work better than most health apps available today.)

Life with Digital Dysfunction

So how does this all play out? Let’s use me as an example: Like many patients with Type 1 Diabetes, I have a number of co-morbid diseases, complications and diagnoses. Each day, I take 15 medications. I use eight medical devices (four that are prescription, four that are not). Two devices are literally attached to my body 24/7, and the rest are never far from reach. In 2013, I saw 13 different physicians and had a total of 63 doctor’s appointments. I had multiple blood draws tracking more than 100 lab values — all the while being sure to eat right, get plenty of sleep, and do several forms of exercise.

How much of this did I manage digitally? Not much. I’m swimming in data that could be helpful, but that data is mostly inaccessible. All of my devices generate data in one form or another: my continuous glucose monitor generates glucose levels every five minutes, 24/7. My insulin pump records the dosage of my insulin and stores the data for months. My blood glucose meter stores the glucose measures I take five to 10 times a day, and my fitness tracker, digital scale, heart rate monitor and blood pressure cuff all generate electronic, structured data that could easily be combined into a single timeline to illuminate important patterns that could help me manage my health. It could be helpful, but it isn’t. Accessing the data these devices store is clumsy at best. I can’t even download my CGM data to my Apple computer — the software only works on Windows. Even when I can access the data, the process takes hours, and combining it manually is, for most people, impossible.

And it isn’t just about the devices. I receive most of my medical care at a large, academic health institution located less than five miles from where our nation’s health IT policy is generated, but I still can’t access my electronic health record online or communicate with my physicians electronically. And the hospital’s IT department refuses to give my endocrinologist access to the free software required to download my CGM data on his computer.

Despite the fact that the major diagnostic labs in the country have been sharing data electronically with physicians for years, the only way I can get my lab results is through emails from my physicians who choose to risk a HIPAA violation to give me the information I need to manage my health. None of my physicians use electronic scheduling, despite the fact that secure online scheduling tools have been available for years. And only one permits me to request prescription refills electronically.

Now here’s the good news: All of this is fixable. The technology part is easy. We know how to make this work, but we lack the societal will to make it happen. The government can do much to push the system along, but device manufacturers, technology companies and hospitals need to do the rest. We spend billions to find breakthrough cures for the future, yet fail to follow through on the “easy” wins that can take us so far today.

Curing disease is difficult. Making data streams accessible and interoperable is not.

OAKLAND — The Alameda Health System thought a $77 million investment in new health records technology would transform an old chart-pushing public hospital bureaucracy into a state-of-the-art electronic medical network.

What administrators got, instead, was a mounting financial crisis with the new electronic system one of several culprits.

“To put it simply, we have run out of cash, we have maxed out our credit lines with the county of Alameda,” said James Lugannani, one of the newest members of the hospital board of trustees, speaking to fellow trustees this month. “Now, time is of the essence.”

Lugannani, whose day job is financial adviser, is expected to join outgoing CEO Wright Lassiter III on Monday morning in asking Alameda County government for help in stanching the financial bleeding by restructuring and delaying payment of a long-term debt owed to the county.

Health care executives, including Lassiter, have said hiccups in the implementation of a new Siemens Soarian electronic records system by Pennsylvania-based Siemens Healthcare are not the sole cause of the hospital network’s current woes, but the IT troubles play a big part in the cash flow crisis affecting Highland Hospital in Oakland and other hospitals and clinics run by the consortium. Other reasons for the liquidity problems are delayed reimbursements from the federal government and the system’s recent takeover of two hospitals in San Leandro and Alameda, administrators have said.

The consortium formerly known as the Alameda County Medical Center signed a 2011 contract with Siemens Healthcare for its suite of Soarian software that shares patient records electronically and processes medical bills.

The investment was propelled by the 2009 federal stimulus law that gives money to hospitals that improve their technology and penalizes those that did not begin switching to better technology by 2012.

Siemens says the software it delivered to the East Bay hospitals is not the problem.

“The systems are operating at AHS within the parameters of the initial project scope and there is no malfunction within the technology,” said a written statement sent by the company Thursday.

But “the activation did not go as well as planned,” Alameda Health System’s Chief Information Officer Dave Gravender reported to the hospital board of trustees earlier this year.

He was explaining billing problems involving the Soarian Financials system that went live in July 2013.

Gravender did not return calls for comment this week. Neither did Mark Zielazinski, who was chief information officer when the Siemens contract was signed but left for Marin General Hospital in 2012.

A former interim information chief, Howard Landa, who is also a urologist and the chief medical information officer, did agree to speak but was later instructed to decline a scheduled interview.

“AHS doesn’t really feel that it has anything to add to the conversation,” said Jennifer DuBois of AMF Media Group, the network’s public relations consultant.

In an interview last month, the health system’s newly hired chief financial officer, David Cox, said the IT problems are complicated and significant, affecting about $50 million in operating expenses that otherwise could have been used to pay down debt.

In a nutshell, he said, “the system makes it difficult to collect the right information that you need to bill a claim and makes it hard to identify what kinds of errors are occurring. … It’s very disjointed right now. A lot of mistakes are being made.”

The health system board authorized another $1.5 million contract with Siemens this month for its consultants to help work through the problems with hospital staff.

This newspaper filed a public records request with the Alameda Health System on July 9 for all of its contracts and contract change orders with Siemens. None of the information has been provided.

How much the health system has spent on the new electronic health record system is unclear.

In his report earlier this year, Gravender said the total capital and operating budgets for the project amount to $77.1 million, which includes both the Siemens Soarian launch and separate technology from NextGen Healthcare, another Pennsylvania company.

Several current and former Highland physicians said the problems with the new electronic records system are not limited to reimbursements; they are also affecting patient referrals and safety.

One physician said lab results get posted into the new Siemens clinical system, but doctors are not electronically notified of the results.

“There’s not a single part of the hospital — inpatient, outpatient, ER — that has fully functional (electronic health records),” said the doctor, who asked that her name not be used because of job security concerns.

Other employees said the problems are improving as the hospital works through the bugs. Electronic record system mishaps have plagued hospitals around the country in recent years, especially those such as Alameda Health System that waited for the federal incentive deadlines to invest in modern records-sharing technology.

Those IT problems are now feeding into deeper fiscal and cultural conflicts between hospital administrators and Alameda County government.

The health system split off from the county in 1998 but still depends heavily on funding from the county, including a long-standing line of credit that allows the hospital system flexibility in balancing its cash payments.

The debt was supposed to be reduced to $110 million by June and to $30 million by June 2018. As of the end of September, however, the debt was $173 million, said Alameda County Deputy Auditor Steve Manning.

“It’s owed to the county. I want to make sure it gets paid,” Manning said.

Lassiter, who is stepping down as chief executive officer in December after nearly a decade at the helm, has sought to transform the hospital system so that it can compete with corporate giants Kaiser Permanente and Sutter Health in attracting East Bay patients who now have more insurance choices under President Barack Obama’s health care reform law. That vision was one reason the consortium recently acquired San Leandro Hospital and Alameda Hospital.

Alameda County officials, however, want the public hospital system to more directly and openly embrace its historic role as a safety net for the poor and uninsured.

Now that the hospital consortium says it is losing money and looking to delay its debt payments, the county is likely to demand a more powerful role in overseeing the hospital system’s future.

The Bay Area News Group and New America Media collaborated on this story. Matt O’Brien is a staff writer for the Bay Area News Group. Contact him at 510-208-6429. Viji Sundaram is the health reporter for New America Media. Bay Area News Group staff writer Thomas Peele also contributed to this report.

A pedestrian wears a surgical mask as he crosses the street in front of Texas Health Presbyterian Hospital.

Credit Nathan Hunsinger/The Dallas Morning News

Hard Cases

Dr. Abigail Zuger on the everyday ethical issues doctors face.

Will history someday show that the electronic medical record almost did the great state of Texas in?

We do not really know whether dysfunctional software contributed to last month’s debacle in a Dallas emergency room, when some medical mind failed to connect the dots between an African man and a viral syndrome and sent a patient with deadly Ebola back into the community. Even scarier than that mistake, though, is the certainty that similar ones lie in wait for all of us who cope with medical information stored in digital piles grown so gigantic, unwieldy and unreadable that sometimes we wind up working with no information at all.

We are in the middle of a simmering crisis in medical data management. Like computer servers everywhere, hospital servers store great masses of trivia mixed with valuable information and gross misinformation, all cut and pasted and endlessly reiterated. Even the best software is no match for the accumulation. When we need facts, we swoop over the surface like sea gulls over landfill, peck out what we can, and flap on. There is no time to dig and, even worse, no time to do what we were trained to do — slow down, go to the source, and start from the beginning.

On the hospital wards, mixed messages abound. A couple of months ago, I was on the receiving end of a furious, expletive-laden outburst from one sick patient, the printable fraction of which ran, “Can’t you people read?”

This man had by then recounted the long story of his bad leg to three separate teams of doctors and nurses. I was the 14th interrogator by my count, and despite my standard opening gambit (“I know you’ve been over this before”) I was the one to flip his switch: The patient ordered me and my team out of his room and pulled the covers over his head.

Who can blame him for assuming that in this day and age, once told, his story needed only to be retweeted. But medical care requires dialogue. Although we plucked some information from the glut of words in his chart and cobbled together a plan, we didn’t do him justice, not by a long shot.

The fact is that even if all the redundant clinical information sitting on hospital servers everywhere were error-free, and even if excellent software made it all reasonably accessible, doctors and nurses still shouldn’t be spending their time reading.

The first thing medical students learn is the value of a full history taken directly from the patient. The process takes them hours. Experience whittles that time down by a bit, but it always remains a substantial chunk that some feel is best devoted to more lucrative activities.

Enter various efficiency-promoting endeavors. One of the most durable has been the multipage health questionnaire for patients to complete on a clipboard before most outpatient visits. Why should the doctor expensively scribble down information when the patient can do a little free secretarial work instead?

Alas, beware the doctor who does not review that questionnaire with you very carefully, taking an active interest in every little check mark. It turns out that the pathway into the medical brain, like most brains, is far more reliable when it runs from the hand than from the eye. Force the doctor to take notes, and the doctor will usually remember. Ask the doctor to read, and the doctor will scan, skip, elide, omit and often forget.

The same problem dogs other efforts to reduce the doctor’s mundane history-taking responsibilities. For instance: Why not leave it to the nursing staff to ask all those dull questions about smoking, drinking, social activities and recent travel? They will write it all down. The doctors will review.

And then the next thing you know, that unimportant background information explodes all over the nightly news, because the doctors failed to review, or failed to remember what they reviewed, and key travel details simmered unnoticed in the bowels of some user-unfriendly electronic medical record.

Over and over again we are forced to admire the old traditions. As we tell the students, it’s not that complicated. You say hello, you sit down, and you have a conversation.

A few months after our expletive-spewing patient got better and went home, our team went to see a more cooperative young man admitted to the hospital with a fever. This one had gotten sick after a camping trip in California, and the words “camping” and “California” were repeated over and over again in his chart, escalating into the general conviction that he had come down with a serious fungal infection that can be acquired from the soil in some parts of Southern California.

If this patient had refused to talk to us, we might have been tempted to treat him for that infection, which would have been a big mistake. Fortunately, he politely led us through his entire hike, which proved to have skirted the habitat of this fungus by hundreds of miles. We could tell his other doctors to stop focusing on his travels and pay attention to his heart murmur instead, the real clue to his problems.

Like good police work, good medicine depends on deliberate, inefficient, plodding, expensive repetition. No system of data management will ever replace it.

PEMBERTON, NEW JERSEY – The problems continue at the Philadelphia Veterans Affairs benefits office despite scathing congressional testimony more than two months ago about mail and records manipulation, the woman who blew the whistle on the problems told members of the House VA committee Friday.

I “regret to tell you that things have not changed, and that accountability is greatly lacking for the management officials involved,” said Kristin Ruell, a quality services representative in the Pension Management Center at the Philadelphia Regional Office, which manages the Wilmington benefits office near Elsmere. “The practices of data manipulation have continued at the Philadelphia RO.”

“We do understand the … seriousness of the concerns about the operation in Philadelphia that have been raised,” responded Diana Rubens, regional office director since Aug. 26. “And I want to assure you, we share those concerns, and we’re quickly taking action to address those issues.”

The Wilmington office, dwarfed in size by the Philadelphia office it falls under, wasn’t mentioned during the hearing. Wilmington, however, has sent disability compensation claims to Philadelphia since at least fiscal year 2011 – the same year in which “boxes” of claims sent to Philadelphia were found unprocessed and piled up, said Ruell, who first described the problem on Capitol Hill on July 14.

That means some of the Wilmington cases – at least 10 disability compensation claims from fiscal years 2011 and 2012 and about 300 pending appeals in fiscal 2013 – could be in that same sort of limbo, Ruell said.

“Any case that comes in our building, I notice the same issues, regardless of where it’s from,” Ruell said in an interview following the field hearing, held at the campus of Burlington County College in Pemberton, New Jersey, an area rich with vets served by the Philadelphia office.

“I see problems across the board,” said Ruell, who began working for VA in August 2007. “These issues happen because employees are rushed, and they’re forced to meet a production standard at the end of the day. So sometimes, it’s not about going the extra mile for the veteran, because they won’t have a job if they fail their standards.”

The claims traveled in the opposite direction as well; 512 cases were “brokered” from Philadelphia to Wilmington in fiscal year 2013. The transfers to and fro, which became a major issue in 2013 when VA started getting roundly criticized for its large claims backlog, did not go unnoticed by disability compensation claims workers.

“It seems like it was kind of like a shell game, where they’re just shifting these cases from Philly to Delaware – and then saying, look, we’re making progress,” said Christian DeJohn, a claims handler in Philadelphia’s Veterans Service Center. “We think that a lot of people in the Philly office are aware that was going on. Of course, we were very disappointed.”

“It’s shuffling,” said Ryan Cease, an Army vet like DeJohn and a veterans service representative in the service center’s appeals department who, along with DeJohn, has cooperated with congressional investigators. “It’s basically shuffling.”

The hearing, before Reps. Jon Runyan, R-New Jersey, and Dina Titus, D-Nevada, was to hear “additional concerns” beyond those raised last summer, when Ruell told the full committee she had been made aware of improper shredding of military mail, data manipulation and beneficiaries receiving improper benefits payments – and that she had been subjected to four years of retaliatory harassment as a result.

The data manipulation issue stemmed from a directive, since rescinded, that, when misapplied, allowed staffers to give unadjudicated claims a more current date – a “discovered date.”

“A memo was used to minimize the average dates pending of the claim to make the regional office’s number look better,” Ruell told the committee in July.

VA’s inspector general substantiated those concerns during an unannounced visit in June. Its investigation continues, said Linda Halliday, the IG’s assistant inspector general for audits and evaluations.

Runyan expressed particular concern over the IG’s identification of several instances of duplicative pension payments – the result of duplicate records in the center’s electronic system. “If neither workload management nor fiscal stewardship are priorities, what do you see as the priority there?” he asked Halliday.

“I believe what is driving this is to meet production metrics at the expense of making the right decisions and processing the veteran’s claim according to how it should be processed,” Halliday replied.

In other words, as DeJohn and Cease said and as Ruell indicated, the culprit is the production goals that processors are expected to meet.

“The point system is a real problem,” DeJohn said. “The VA point system.”

“The point system basically evaluates your productivity,” Cease said. “It also covers your accuracy. So for a person to say you have a productive day based on how many points you did per day, a lot of people would cherry-pick and say, well, I’m going to pick the easy work, put aside the hard work, and just gain points.”

An “easy” claim, he said, would be one with fewer individual medical conditions.

DeJohn said he was fired in 2012 for “alleged low numbers,” winning his job back after 1½ years.

The system, the two claims workers said, remains in place – as do the repercussions felt by those speaking out. This, in spite of new VA Secretary Robert McDonald’s promise to protect them.

DeJohn said he’s received death threats. Ruell has felt more subtle retaliation.

“They’re very creative in the things that they do to employees,” she said after the hearing. “They make it look like it’s a legitimate, legal thing, but … I never feel like I’m wanted in that building. I’ve never felt appreciated for anything I’ve brought forward. I basically show up because people rely on me to do the right thing and help report things.

“That’s why I come back,” she said. “I would never choose this job again, if it wasn’t for helping veterans.”

IG spokeswoman Cathy Gromek said to look for the IG’s final report on the Philadelphia regional office in late November or early December.

Contact William H. McMichael at (302) 324-2812 or bmcmichael@delawareonline.com. On Twitter: @billmcmichael