PROCEEDINGS (8:05 a.m.)

Agenda Item: Call to Order

MR. REYNOLDS: Good morning. I’d like to call this meeting to order. This is
the meeting of the Subcommittee on Standards and Security of the National
Committee on Vital and Health Statistics. The committee as you know is the main
public advisory committee to the U.S. Department of Health and Human Services
on national health information policy.

I am Harry Reynolds, co-chairman of the committee, along with Jeff Blair. I
work at Blue Cross Blue Shield of North Carolina. I will ask each member of the
committee to note whether or not you have any conflicts; I do not have any
today.

I want to welcome Jeff, fellow subcommittee members, HHS staff and others
here in person. I do want to inform everyone that this is a public meeting and
that we are broadcasting on the Internet, so please speak clearly into the
microphone as you make your comments. Also, the meeting is being recorded and
transcribed.

Jeffrey, before we get into the agenda, I would like everybody to go ahead
and introduce themselves. We will start with you.

MR. BLAIR: Jeff Blair, co-chair of the subcommittee, member of the full
committee. I am Director of Health Informatics for Lovelace Clinic Foundation,
and I don’t have any conflicts of interest.

DR. STEINDEL: Steve Steindel, Centers for Disease Control and Prevention,
staff to the subcommittee and liaison to the full committee.

DR. CARR: Justine Carr, Beth Israel Deaconess Medical Center, member of the
full committee, no conflicts.

DR. WARREN: Judy Warren, University of Kansas School of Nursing, member of
the subcommittee, and I have no conflicts.

MS. GREENBERG: Marjorie Greenberg, National Center for Health Statistics,
CDC, and Executive Secretary to the committee.

MS. SQUIRE: Marietta Squire, CDC, NCHS and staff to the subcommittee.

MS. MITER: Elizabeth Holly Miter, CHI representative.

DR. KILBORN: John Kilborn, National Library of Medicine and visitor.

MR. OFAMA: Bill Ofama, Blue Cross Blue Shield Association.

DR. BOYD: Lynn Boyd, College of American Pathologists.

MS. BYRNE: Terry Byrne, RxHub.

MS. ECKERT: Karen Eckert with Medi-Span.

MS. HARPER: Amy Harper, National Library of Medicine.

MS. BICKFORD: Carol Bickford, American Nurses Association.

MR. OMUNDSON: Todd Omundson, American Hospital Association.

DR. HUFF: Stan Huff with Intermountain Health Care and University of Utah
in Salt Lake City and member of the committee and subcommittee. No conflicts.

MR. REYNOLDS: Since we started promptly on time, Vivian appears to be ready
to speak, but she doesn’t get to for seven minutes.

What I wanted to make sure we did, we do have a hearing coming up in April.
We talked a lot yesterday about the upcoming things that we wanted to take care
of. I wanted to spend this period of time, because if anybody has to leave for
a plane or anything before we get to the end of the discussion, I really want
to make sure we have that solidified.

The things that we feel are there for April is, we have the updated letter
that we talked about yesterday on HIPAA, that we said that we would change a
little bit of the flavor to, but go ahead and make sure we solidified that, and
worked on it in sync with the annual report. We are not planning to have the
annual report in April, but work on it together, try to get the findings for
this letter clearly stated and clearly structured.

The second is to hopefully look at an initial draft of at least a format or
a structure for matching patients to records, so Judy, as you guys move toward
that. What we would like to do is set up some time to discuss these, especially
this matching patients. We are going to actually move into what we are going to
say, rather than just putting it out there and say everybody wordsmith it, we
need to discuss it. So we will put time on there to have a discussion amongst
ourselves, rather than just look at a draft and go through line by line and
assume that is what we are doing. We need to come to some synergy on what we
are going to try to do.

The next thing that we want to do, and we had had this listed on the chart,
you remember we have all been using this chart we have to keep an eye on what
is going on, we felt that it was time to start hearing about NPI. There is a
lot of discussion out there about where it is.

MR. BLAIR: National provider identifier.

MR. REYNOLDS: Yes, national provider identifier, yes, what is the status.
We will get a cross-section of presenters to come in and talk about that. It is
my understanding that we may be receiving a letter as a committee from some
entities that have some concerns, so we will make sure that they get a chance
to — even if they send us the letter ahead of time, they have a chance to be
part of the presentation, because this is the next big rollout that is going on
and is underway, if you heard Karen touch base on it the other day.

The other thing we wanted to ask the committee, Jeff and I, if you remember
— and it plays off of the DSMO update that is going to happen, the whole idea
of how do we streamline — is there a way to streamline the process, does the
process need to be streamlined as far as the length of time that it takes for
the rulemaking. We heard some brief discussion on that at the end of one of our
other sessions; do we want to keep that discussion going. There is
harmonization of standards going on, but exactly how things do or don’t change.

MR. BLAIR: Could I just add, to clarify?

MR. REYNOLDS: Yes.

MR. BLAIR: When we are talking about streamlining the process, it is in
particular streamlining the regulatory process, going through what we now have
to go through NPRMs if a new version of the standard is to be adopted. That is
the process that we are referring to.

MR. REYNOLDS: Yes, so whether or not we want to hear any more discussion on
that or not is the thing.

Then we have had Stan heading up secondary uses of clinical data. I guess,
Stan, what we would like to hear from you is, we have heard some testimony, how
do you see this coming to fruition. Yesterday we came to some way to see an end
to matching patients to records; what do you see the next steps being and how
do you see us coming to some — I don’t want to say closure, but what do you
see the journey, let me say it that way.

DR. HUFF: There may be one or two more testifiers that we want to hear
from, but I think in general we have probably heard from several folks, and we
have had suggestions or recommendations from the people who testified. So I
think it is at a point where we could just have a discussion and start coming
up with what we think the summary points are from the people we have heard.

I have some slide sets that show the things that we have heard, and we
could start formulating observations and any recommendations that we wanted to
do from that. So I think we are pretty close to being able to make some
recommendations, or at least some summaries of what we think the issues are.

MR. REYNOLDS: Would it be worthwhile if we had you go ahead and put that
together for April? We hear it, and then we decide whether or not we need to
hear from anyone else based on that? If not, then we go ahead and start
formulating what we want to do with it?

DR. HUFF: Yes.

MR. REYNOLDS: Again, we have got some of these big subjects to keep in some
kind of package so we can decide what we are going to do.

DR. HUFF: Yes, I’d be glad to do that.

MR. REYNOLDS: Marjorie.

MS. GREENBERG: The only thing I am thinking is that since this does cross
over several other groups, especially the quality work group as well, maybe
Justine could work with Stan on that. Is that okay, Justine?

DR. CARR: Yes.

MR. REYNOLDS: What I would hope we would do is, whatever we come up with,
if we came up with a set of — I wouldn’t even say recommendations yet, a way
to position this whole discussion in a much more succinct way than just the
words, it might be something that we maybe even pass out to the other
committees for their consideration. Then at some point it may be something that
comes to the full committee if it crosses enough of them.

I think this is one where I would totally agree with you, maybe a letter
from Standards and Security is not what we end up with in the end. We may end
up with some joint letter that could come out of a couple of the committees, to
say here is what we have heard and here is how we think it covers a number of
areas, because it all has to go through the main committee anyhow. So whether
it is drafted by one or multiples, may be a way to approach it.

MS. GREENBERG: I think it is a topic probably also for a retreat.

DR. CARR: In Quality it was helpful for us to put our thoughts together,
but have it worked through with the whole committee to get everybody’s
feedback. The other group with relation to this would be privacy, I assume. I
think there are so many different contingencies with it that we would want to
have a starting point.

MR. REYNOLDS: That is all we are talking about. We have conducted the
testimony, so putting it together in some succinct package that it could be
used by the full committee in pieces or in one to discuss.

DR. COHN: Apologies for my lateness. I am listening to everyone, and I do
agree, Harry. We would of course observe that this area is actually something
that is in our charter, about re-use of data.

I wonder if everybody realizes that we are not talking about the definitive
report on secondary uses of data and then we are done and then we move on. What
we are talking about is how we move the ball forward, which is really the big
piece here in a very practical way. We are not going to get to the end game in
September, in January of next year, in January of 2009. It is going to be an
interactive process of continuing to break down the barriers as we move
forward.

So yes to all the things we are saying, but let’s not try to make it the
report to end all reports, thinking that somehow we can solve this problem, as
opposed to moving the puck forward.

MR. REYNOLDS: That is where I was headed. We have heard testimony, so let’s
put a document together. This could end up being a subject, based on some of
our discussion yesterday, about how are we organized as a full committee with
our subcommittees, that may end up being pieces and parts of groups that move
it forward. It may not fit nicely in one committee, and it may not be one
committee that even keeps the subject going.

So I think if we put together what we have basically found and heard and
what the industry says and so on, and then take it apart as a full committee or
the Executive Committee, to figure out this is what the subject looks like,
this is how it looks like if it plays in our world, who owns the subject going
forward and how do we play with the subject going forward, as you say. But it
is an ongoing thing. This is a journey, on this one. This isn’t going to be a
letter and walk away.

So we will plan, Stan, to include your presentation and working with
Justine on that for the April meeting, okay? Any other comments on what we do
for April?

Our first presenter is Vivian Auld from the NLM. We look forward to hearing
from you.

Agenda Item: NLM Update on Standards-Related Activities

MS. AULD: Good morning. This is going to be a quick update on what I
covered for you back in July of last year. My thought is that I can go through
the slides that I put together for you relatively quickly, and then give you
lots of time to ask me questions.

One of the people in the audience, you might have noticed, is John Kilborn,
who works with us on RxNorm. He is our resident expert on that, other than
Stuart, I should say.

What I am going to do is give you a quick overview of some of the recent
events that are going on, and how NLM has been interacting with those or been
affected by those, an update on the UMLS and RxNorm, a brief introduction to
DailyMed which Randy is going to be talking about even more, and then the
status on the NLM HL7 contracts and the various mapping projects.

MR. REYNOLDS: Vivian, since I am not as familiar with the subject as a lot
of people, if you could help weave this picture as you go through, some of the
nomenclature, maybe keep it in perspective.

MS. AULD: I don’t use any acronyms, I promise. As you undoubtedly know,
back in September of last year the American Health Information Community was
designated, with advisory committees of industry leaders and agency heads. We
have been following what they have been doing very closely because we want to
make sure what the use cases are that come out of this, so that we can make
sure that the various projects that we are working on are fully supporting
those use cases.

Also, in October of last year the Office of the National Coordinator for
Health Information Technology, Dr. Brailer’s office, awarded contracts for the
standards harmonization, which resulted in the HITSP committee. NLM is one of
the four designated federal representatives to that committee. Betsy Humphreys
specifically is the representative. Also, Dr. Brailer's office put in place the
contracts for compliance certification and privacy and security solutions.

Then the other thing that happened back in October was that the Commission
on Systemic Interoperability, which is housed in the National Library of
Medicine, released their final report, and that is available on the website. We
are taking steps to make sure that all of the background information is
archived on our site so that that will be available in the future as well.

Back in November of last year, the Medicare E-Prescribing Foundation
Standards were named. ONC awarded four contracts for National Health
Information Network architecture prototypes. In January of this year, the
Interagency Health Information Technology Policy Council was established, and
as a subset to that, HHS created a Health Information Technology Policy
Council, and NLM is represented on that, again with Betsy Humphreys as the
representative.

Again, as I was saying, the point isn’t to tell you about these things that
you already know about, but just to point out where NLM fits in with this.

MR. BLAIR: Are we to save our questions until the end or as you go through?

MR. REYNOLDS: Let’s wait until the end.

MS. AULD: The next thing I wanted to talk about was the UMLS. As I said
back in July, we are in the process of moving specifically the Metathesaurus
from a research project to our production system. What this entails essentially
is moving this from our research branch, which is the Lister Hill Center, to
our production branch, which is the Office of Computer and Communications
Systems and Library Operations. We are also migrating to a new set of computer
hardware and software and adding new staff to support the improvements to the
documentation, training, quality assurance and customer support.

As I said, this primarily affects what is going on with the Metathesaurus
release files. The research branch is still going to be very heavily involved
in the UMLS, but mainly in the research end of it rather than the production,
so that we can make it bigger and better.

It has been going very well. We are pretty much on schedule, and we are
very happy with how it is going so far.

We are in the process of releasing the 2006 AA version of the UMLS
Metathesaurus. There are over a million concepts in it at this point, 140 source
vocabularies, and they represent 17 different languages.

One of the things that we are introducing with this version is a new set of
tools for the Metathesaurus. One of them is called the MRCXT Builder, which
provides contextual or hierarchical relationships to facilitate the
construction of user displays. It allows users to produce a custom MRCXT file
from the user subset of the Metathesaurus, and it replaces a very large and
unwieldy MRCXT file.

MR. BLAIR: MRCSG?

MS. AULD: CXT.

MR. BLAIR: CXT. That is a new term for me.

MS. AULD: It is an internal file. The point to pay attention to here is the
fact that we are making it much easier for people to customize their view of
the Metathesaurus. Rather than having to work with the entire 140 sources, they
can subset it and work with just the piece that they want to. That is the main
purpose behind making this builder, so that you don’t have to take this huge
file and work with it, but rather you can just take the part that you need.
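The subsetting Ms. Auld describes can be illustrated with a small sketch. In the UMLS rich release format, concept names live in the pipe-delimited MRCONSO.RRF file, whose SAB field identifies the source vocabulary; restricting to the sources you want is essentially a filter over that field. The field positions follow the published RRF layout, but the sample rows below are invented for illustration only.

```python
# Sketch: build a Metathesaurus subset by keeping only chosen source
# vocabularies (SABs). Field positions follow the published MRCONSO.RRF
# layout (pipe-delimited; SAB is column 11, STR is column 14, 0-based).
# The rows in SAMPLE_MRCONSO are invented, not real UMLS data.

SAMPLE_MRCONSO = """\
C0000001|ENG|P|L0000001|PF|S0000001|Y|A0000001||||SNOMEDCT|PT|12345|Example finding|0|N||
C0000002|ENG|P|L0000002|PF|S0000002|Y|A0000002||||LNC|LN|6789-0|Example lab test|0|N||
C0000003|ENG|P|L0000003|PF|S0000003|Y|A0000003||||MSH|MH|D000001|Example heading|0|N||
"""

def subset_by_source(lines, wanted_sabs):
    """Yield only the rows whose source abbreviation (SAB) is wanted."""
    for line in lines:
        fields = line.rstrip("\n").split("|")
        if fields[11] in wanted_sabs:
            yield fields

rows = list(subset_by_source(SAMPLE_MRCONSO.splitlines(), {"SNOMEDCT", "LNC"}))
for r in rows:
    print(r[0], r[11], r[14])
```

The same filter-then-work-locally idea is what the MRCXT Builder and the RRF subset browser package up for users who do not want to handle all 140 sources.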

MR. BLAIR: Thank you.

MS. AULD: The other new tool that we are introducing with 2006 AA is the
RRF subset browser. RRF stands for rich release format. Again, this allows the
user to create a subset of the Metathesaurus and then to take that subset and
look at it right then and there. They don’t have to load it into a new system.
They can just use this tool that we have created, this browser, to take a look
at it. We are making it so that you can take your subset of the Metathesaurus
and look at it rather than having to look at the entire Metathesaurus.

DR. CARR: Is that available now? I missed that.

MS. AULD: It is coming out with the 2006 AA release, which should be there
any day. They actually tried to release it earlier this week and had to pull it
back, but it is going to be there later today or Monday, I'm not sure which.

Along the same lines of making the Metathesaurus much more user friendly,
we are making changes to the UMLS knowledge source server, which is our system
that runs on — it is an Internet web application that allows you to search the
Metathesaurus and the other knowledge sources within the UMLS.

As I said back in July, we are making changes on the back end so that you
can implement web services to make it easier for people to access the system.
On the front end, we are setting up portals which will allow users to customize
their view of the Metathesaurus so that we can provide a view that we feel
makes the most sense for people, but acknowledging the fact that most everybody
else has a different point of view, and they will probably have systems or
needs that are different for their individual settings, they can customize it
to those needs.

We are also hoping that this will be — we are in the process of setting
this up so that not only are you able to search the Metathesaurus that lives on
the NLM server, but you can create a local subset and then use this tool to
search that subset. Right now, we are in very early, early alpha stage of
developing this. We are hoping to have a beta version available by the end of
summer this year.

This is your first look at what the screens are going to look like. We will
have tabs across the top that will allow you to look at the Metathesaurus, the
specialist lexicon and the semantic network. As I was saying, these will be
from our viewpoint what we think it should look like, and then also you can
create your own tabs for your own portals for what you want those subsets to
look like.

This is an overview of what the results will look like. We reserve the
right to change all of this, so don’t worry that you can’t see it.

MR. BLAIR: Vivian, as part of your explanation of what these offerings are,
could you also maybe help those of us here in the audience to understand who
you feel the users will be and what will they be using it for?

MS. AULD: That actually leads directly into my very next slide. Beginning
roughly a year ago, we set up a web-based site for users to supply us with
information for their annual report. Licensees of the UMLS are required to
submit an annual report telling us how they use the UMLS, and answer a series
of questions. So rather than just giving them an extremely vague understanding
of what it is we wanted, we created a website that asks very specific
questions.

What I have here on this slide and the next are an overview of the
summaries from the 2004 annual report. The total number of valid responses we
received were 1,427, which represents 54 percent of the total licensees as of
December 2004. The top three types of affiliation, 40 percent were academic, 26
percent were commercial, and 11 percent were not for profit. The top three
types of activities that the licensees were participating in, 42 percent are
research and education, 20 percent are software developers, 17 percent are
health care providers.

We asked them how much experience they had with the UMLS. Fifty-three
percent have already started using the UMLS. The median duration of usage was
12 months, and 47 percent have obtained a license to the UMLS but they have not
started using it yet.

The top three types of information that the UMLS is used to process, and
multiple answers were allowed here, were findings and diagnosis, 55 percent,
procedures were 34 percent, laboratory tests and reports were 29 percent. The
top three uses of the UMLS, again multiple answers were allowed here, 53
percent were terminology research, 35 percent were mapping between
terminologies, and 33 percent were creating their own terminology.

Then we also asked which terminologies within the Metathesaurus do the
licensees use. Forty-seven percent said they used everything, 24 percent said
that they used SNOMED CT only, 12 percent said they used category zero only —
category zero are those sources that do not have any usage restrictions — and
eight percent said they used SNOMED CT and the category zero sources.

So that gives you an overview of who is using the UMLS and how. We will
probably in another two or three months have an update from 2005 on this. So
that is all I was going to say about the UMLS.

Next is RxNorm. We are currently providing monthly releases. We have
increased the number of RxNorm standard names by 28 percent, which comes out to
around 1100. That is not the right number; over 100,000. We are working on
maintaining harmony with the UMLS Metathesaurus, and we are continuing to
improve the process and the product code.

What we are planning to do in 2006 is to work towards weekly releases by
the end of the year, establishing a link between DailyMed and RxNorm that will
make the FDA the primary source for updated drug information going into RxNorm,
and I will talk more about DailyMed in just a minute. We want to establish
tighter integration for more rapid inclusion of RxNorm data into the UMLS
Metathesaurus, and we want to continue improving quality assurance procedures
and data integrity within RxNorm.

We are also continuing to improve the coverage of drug delivery and
dispensing devices and multivitamins within RxNorm. We are expanding RxNorm to
cover over-the-counter medications based on information supplied by selected
sources, and we are also assisting with the Medicare Modernization Act pilot
tests where needed.

MR. BLAIR: Could you elaborate a little on that?

MS. AULD: I can’t, but maybe John can.

MS. GREENBERG: Do you want to come to the table?

MS. AULD: Do you want to come up?

MR. KILBORN: The pilots were just announced in January, and they are just
getting off the ground. Their use of RxNorm for each of them doesn’t come until
some months down the road. So all I can say is, there isn’t much to say right
now about the pilots in RxNorm.

MR. BLAIR: That’s okay, but they are going to be part of the pilot tests?

MS. AULD: Yes.

MR. BLAIR: You will have scenarios where you will be testing them.
Hopefully some of those will be decision support?

MR. KILBORN: Yes. There is one specifically looking at vendors, different
software vendors and how the use of RxNorm in their systems can be used, how
RxNorm can be used in their systems.

There is a nice range of pilot projects that have been assigned. In other
words, it is not just one thing done four times over. There are four really
different environments in which RxNorm and other drug standards are being
tested.

DR. CARR: We are hearing more about them later this morning.

MS. AULD: Thank you. The next thing I want to talk about was DailyMed. For
those of you who can’t see the screen, this is the website that we have for
DailyMed. The URL is dailymed.nih.gov.

This is a website that is designed to provide high quality information
about the marketed drugs. It uses the structured product label from the FDA. It
is important to note that the information that is contained in DailyMed is
really intended for the prescribers and dispensers of drugs. It is not intended
for the patients. So if you look at the tab that talks about information for
patients, it is really telling more about what the prescribers and dispensers
should say to patients, rather than providing information directly to patients.

The site went live in November of last year. It currently contains
information for ten different labels, but we expect to have a complete set by
the end of the year, which will be roughly around 4500.

The way that this works, which Randy will be talking more about, is that
every year all of the manufacturers are required to resubmit their product
information or their labels. So as they come in, we will be populating DailyMed
so that it is complete.

We are going to be establishing links to the published literature, to the
Clinicaltrials.gov and to other relevant sources for each drug that is listed
so that you can get a complete picture rather than just what comes from the
manufacturers. Randy will give you more about that shortly.

A little bit about the contract that we have established between NLM and
HL7. As I told you back in July, this contract has two parts. The first is
designed to align HL7 message standards with the CHI standard vocabularies.
This piece of it was initiated by NLM.

Specifically, we want to specify which subsets of the standard vocabularies
are valid for particular message segments, and also to replace HL7-maintained
lists of coded values with subsets of standard vocabularies where feasible. For
this part of the project, which Stan Huff is one of our technical leads on
this, the short overview of this is that we essentially have completed the
preliminary pieces that we need to have in place so that we can get this work
done.

We have incorporated the HL7 version 2.5 and 3.0 code sets into the UMLS so
that we can take a look at them. HL7 has defined exactly how you will bind the
different code sets to the messaging segments. We are just about ready to do
the bulk of the work, where we will actually be looking to see which
vocabularies go with which message standards. So we are in a very good place
for that.
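The binding work described here amounts to a table from message fields to permitted vocabulary subsets: a coded value is valid in a field only if it comes from the subset bound to that field. A minimal sketch, in which the particular field names, code systems, and codes are hypothetical:

```python
# Sketch: binding HL7 message fields to subsets of standard vocabularies.
# The specific bindings and codes below are hypothetical; the point is
# only the shape of the check, not any real HL7 table.

# Hypothetical bindings: message field -> (code system, allowed code subset)
BINDINGS = {
    "OBX-3 Observation Identifier": ("LOINC", {"718-7", "2345-7"}),
    "RXE-2 Give Code": ("RxNorm", {"197361", "313782"}),
}

def validate_coded_field(field, system, code):
    """Check a (system, code) pair against the subset bound to the field."""
    bound_system, allowed = BINDINGS[field]
    return system == bound_system and code in allowed

print(validate_coded_field("OBX-3 Observation Identifier", "LOINC", "718-7"))
print(validate_coded_field("RXE-2 Give Code", "RxNorm", "999999"))
```

Replacing HL7-maintained code lists with such subsets of the standard vocabularies is what lets one authoritative source serve both the terminology and the messaging standard.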

Two things that came out of this is that HL7 pretty much for the first time
took a really hard look at the code sets that they have in place, and
determined that they really need to put some robust tools in place so that they
are better able to maintain and distribute their code sets. So they are working
on putting those tools in place.

We are also working on setting up a collaboration on the standard
vocabulary exchange format. The first meeting for that is going to take place
on Monday and Tuesday, so we are in very good standing for that part of the
contract.

The second part of this overall contract is to create implementation guides for
transmitting an entire electronic health record between disparate systems. This
is being done on behalf of HHS. Susie Burke-Beebe is the main HHS technical
lead on this.

Back in April of 2005, we did a proof of concept demonstration, and that
went very well and it was received very well by the community. We are currently
in the process of lining up the partners to develop the tools and conduct a
live demonstration that will take place later this year, probably in August if
we stay on schedule. The negotiations are still underway to line up those
partners, but we are very close. Probably within the next week or so we will
have those lined up. So I’m not going to say who it is we are talking with.

The last thing that I wanted to talk about is the various mapping projects
that we have underway, or planned.

The first set deals with mapping CHI standards to HIPAA code sets. The
SNOMED CT to ICD-9-CM is the first one of those. The draft map has been
completed. College of American Pathologists did the mapping for this, and they
sent that map to AHIMA, who is the designated expert for NCHS.

MR. BLAIR: Vivian, just a clarification. You said mapping SNOMED codes to
ICD-9.

MS. AULD: ICD-9-CM, yes.

MR. BLAIR: Right, as opposed to the other way around?

MS. AULD: Correct.

MR. BLAIR: Is that the full scope of what is planned, or is the mapping of
ICD-9 to SNOMED also in the future?

MS. AULD: I don’t think there is a use case where anyone would need the map
to go in the other direction.

MR. BLAIR: I understand, okay.

MS. AULD: If one developed, then we would work on it, but I don’t think
that one exists at this point.

So we have the draft map. AHIMA, who is the representative for NCHS, has
taken a look at this map and they have written a report with several
recommendations for improving the process. At this point College of American
Pathologists is working on a response to that. Once we have both pieces in
place, we are going to take all of this back to Marjorie Greenberg’s group and
make sure that she is happy with it. I am quite optimistic that within the next
month we will have a draft map up so that people can begin taking a look at it.

The SNOMED CT to CPT map has been — there have been negotiations going on
between the College of American Pathologists and the American Medical
Association to determine who is going to put the map together, which will
probably be College of American Pathologists, and who will own the intellectual
property rights to the map, which will be the AMA. We are very close to being
set on those negotiations as well. Once everything is in place, then we can get
started on that map.

The LOINC to CPT map, Stan Huff’s group has been working on this. They have
provided an initial draft map which has about 2,000 concepts mapped. It is
essentially taking the maps that existed in the community and putting those all
together. So what you have is the most common concepts that are already being
used and putting them all together, and then making logical extensions from
that. The map is currently being reviewed by the American Medical Association
and we are going to be having a conference call with them probably in the next
two weeks. I don’t know if Stan knew about that. Very shortly after that we
should be able to put that draft map up so that people can begin reviewing it
as well. Then we are looking at what is the next piece of this map that Stan’s
group should be working on.

You are all looking very confused.

DR. WARREN: Will AMA own the copyright to the LOINC CPT map?

MS. AULD: Yes. The other maps that we are working on are SNOMED CT to other
vocabularies. One of those is to MedDRA. We have not begun working on this for
the simple reason that people have come forward with use cases, but there is no
data to support those use cases. So we are trying to find a combination where
somebody has a definite use for the map and data that we can test the map on
once the map is created, before we do the work on it. We were working with the
NIH Clinical Center, but their priorities had to shift for very good reasons,
so that one didn’t materialize and we are still looking for others.

MR. BLAIR: Vivian, help me understand something. I have seen in each of the
cases where you have said SNOMED to this, SNOMED to that, SNOMED to that, and
they were all terminologies that are being used today. So I can understand that
there is a use case there.

However, if I was a vendor and I was looking at getting the advantage of
SNOMED for a number of different applications, I would think if the mapping is
to these others, then it doesn’t give me the ability to take advantage of the
strength, the robustness, the specificity of SNOMED.

If I’m wrong please tell me, but it seems as if while these mappings have
some degree of use, they don’t open the door to the potential that any of us
had thought of for SNOMED.

DR. HUFF: No, I think you are thinking of it backwards, Jeff. They are
doing exactly what they should be doing. The assumption is that to get the most
out of SNOMED you use SNOMED as your primary terminology within your database.
Then for instance if you want to go to ICD-9 codes you map from the SNOMED
codes to the ICD-9 codes and submit ICD-9 codes for billing purposes.

So again, it doesn’t make sense for instance to do an ICD mapping to
SNOMED, because you never start that way. ICD-9 is a classification that builds
from all the primary data that you would have recorded in SNOMED. So you always
go the other way, you always go from SNOMED to ICD-9.
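The one-way translation Dr. Huff describes, recording in SNOMED CT and deriving billing codes from it, can be sketched as a simple lookup. The mapping entries below are illustrative placeholders only, not entries from any real map:

```python
# Sketch of the direction Dr. Huff describes: SNOMED CT is the primary
# terminology in the clinical database, and administrative codes are derived
# by mapping outward. The entries below are illustrative, not a real map.
SNOMED_TO_ICD9 = {
    "22298006": "410.90",   # illustrative: a myocardial infarction concept
    "44054006": "250.00",   # illustrative: a type 2 diabetes concept
}

def billing_code(snomed_code: str) -> str:
    """Translate a recorded SNOMED CT code into an ICD-9-CM code for a claim."""
    try:
        return SNOMED_TO_ICD9[snomed_code]
    except KeyError:
        raise ValueError(f"no ICD-9-CM mapping for SNOMED CT {snomed_code}")

print(billing_code("44054006"))  # -> 250.00
```

The reverse lookup is deliberately absent: as the discussion notes, you never start from the classification and map back to the clinical terminology.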

MR. BLAIR: What you said made perfect sense. Maybe I heard it wrong or
listened wrong, but I thought that you were saying it the other way around.
Vivian, did I hear it wrong?

MS. AULD: All of these maps are going from SNOMED to the other vocabulary.
So at the clinical encounter you are recording the information in SNOMED, and
then ideally you will have a system that automatically uses these maps to
translate into whatever of the other vocabularies make sense.

MR. BLAIR: Is it possible for a blind person to be dyslexic? Thank you,
that helps me clarify things.

MS. AULD: The next mapping that I wanted to talk about is SNOMED CT to
ICPC. This is a map for which CAP has been hearing quite a lot of interest
from various groups, especially at an international level, people who would
like to make use of this map if it exists.

What we have is an existing map from SNOMED CT to ICD-10, and we have
another map from ICD-10 to ICPC. So what we have done at NLM is, we have
taken these two maps and put them together. Kim Wah Fung, who is one of our
clinical experts, has done a very high level analysis of the resulting map
where there are gaps where initial work would need to be done, where it does
not work at all. We are going to take that information and in the next few
weeks we are going to have a conference call with CAP and a few other
interested parties and decide how we want to proceed from here, what work needs
to be done, who is going to be funding it, how much it is going to cost, all
that sort of thing.

MS. FRIEDMAN: Vivian, sorry I’m not up yet. Would you repeat that one?
You’ve got the map from SNOMED to 10?

MS. AULD: We have the map from SNOMED to ICD-10. We have another map from
ICD-10 to ICPC, and we have put the two maps together, which results in a map
from SNOMED to ICPC. What we are doing now is taking a look at that map from
SNOMED to ICPC to see how successful the resulting product is.
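The composition step Ms. Auld describes, chaining the SNOMED CT to ICD-10 map through the ICD-10 to ICPC map and flagging where the chain breaks, can be sketched as follows. All codes here are invented placeholders:

```python
# Sketch of NLM's approach: compose two existing maps to derive a draft
# SNOMED CT -> ICPC map, recording the gaps where initial manual work would
# be needed. The codes are hypothetical placeholders, not real content.
snomed_to_icd10 = {"S1": "I10a", "S2": "I10b", "S3": "I10c"}
icd10_to_icpc = {"I10a": "P1", "I10b": "P2"}   # no ICPC target for I10c

def compose(first, second):
    """Chain two maps; return the derived map plus the sources left unmapped."""
    derived, gaps = {}, []
    for src, mid in first.items():
        if mid in second:
            derived[src] = second[mid]
        else:
            gaps.append(src)   # chain breaks here; needs manual mapping
    return derived, gaps

derived, gaps = compose(snomed_to_icd10, icd10_to_icpc)
# derived == {"S1": "P1", "S2": "P2"}, gaps == ["S3"]
```

The gap list is the point of the high-level analysis mentioned in the testimony: it shows where the composed map does not work at all and where new work would have to start.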

MS. FRIEDMAN: The reason I’m asking —

MR. REYNOLDS: Let’s hold our questions.

MS. AULD: I’m almost done.

MR. REYNOLDS: I just want to make sure we get through this, then I think a
lot of people have questions.

MS. AULD: Another map that there has been a lot of interest in is with
SNOMED CT to Medcin. At this point in time, we do not yet have Medcin into the
UMLS Metathesaurus, so we are not able to work on a map for this at this point
in time. Medcin is a very unique vocabulary, and it is kind of like putting a
square peg into a round hole. It is not working very well, so we are still
working on that.

Then thanks to CAP we have maps from SNOMED CT to various of the nursing
vocabularies that we are working to incorporate into the UMLS. Before we can
put those in, we need to make sure we have updated versions of all the nursing
vocabularies.

So that is the ten-cent tour of the mapping projects. I included in here an
overview of the key NLM assumptions about mappings. I am not going to go over
them, but it is important to understand that this is how we are approaching
this, so that you have a better understanding. It helps explain some of what I
was just talking about on the last page.

I want to make sure that I take a moment to thank AHRQ and HHS Office of
the Secretary, because they have helped with funding for a lot of the projects
that I have been talking about. If it weren’t for their funding, we wouldn’t be
where we are right now. So thank you to them.

Now I am ready for questions.

MR. REYNOLDS: Okay, we’ll go to questions.

MS. FRIEDMAN: My question has to do with ICD-10 because there is a lot of
interest in ICD-10 and the status of the map. You said that the map exists from
SNOMED —

MR. BLAIR: Maria, could you please clarify? Do you mean 10 or 10-CM?

MS. FRIEDMAN: Well, my question is, which map exists? Which is it?

MS. AULD: The map that we have is from SNOMED to ICD-10. I believe it is
jointly owned by CAP, the U.K. National Health Service and WHO. They have given
it to us with the understanding that we will take a look at it and analyze it
for purposes of creating the map from SNOMED to ICPC. But I don’t believe that
we have the right to distribute the map at this point.

MS. FRIEDMAN: That was my question.

MS. AULD: At this point there isn’t any work being done on a map from
SNOMED to ICD-10-CM. We are waiting until ICD-10-CM is released to the public,
which is going to happen later this year possibly. There is other work being
done too that relates to that.

MR. REYNOLDS: Are you following up on that?

MS. GREENBERG: Yes.

MR. REYNOLDS: That would be appropriate then.

MS. GREENBERG: I was going to ask, I know you have a version of ICD-10-CM,
but we are now updating it based on the updates we have made to 9-CM and the
pilot tests, et cetera. That should be available by June. By then the map to
ICD-9-CM should have been updated and finalized?

MS. AULD: Yes.

MS. GREENBERG: Then you can start the map to 10-CM?

MS. AULD: Yes. When you have that version of ICD-10-CM, if you send it to
us, we can put it in the UMLS and have that work started at the same time.

MS. FRIEDMAN: How long does it take to get that mapping done?

MS. GREENBERG: I assume that would also be working with CAP and AHIMA
again?

MS. AULD: Say that one more time?

MS. GREENBERG: I assume the process for that mapping and validation, et
cetera, would also be working with CAP and AHIMA?

MS. AULD: Yes, definitely.

MS. GREENBERG: And of course NCHS.

MS. AULD: Definitely.

MS. GREENBERG: So I guess the question was, once you have them all in the
UMLS, what time frame does that take? Hopefully after the experience with the
map to the 9-CM, the learning curve will have been improved, so it might be
quicker?

MS. AULD: I don’t know that I have a good answer for that. It depends on
the map, it depends on who is doing it, it depends on their resources.

MS. FRIEDMAN: It sounds like it is not anytime soon.

MS. AULD: I wouldn’t put it that way.

MS. FRIEDMAN: A year?

MS. AULD: Less than a year.

MS. FRIEDMAN: Thank you.

MS. AULD: It depends on how big the two pieces are, too.

DR. COHN: Some related questions to ask, and so little time. I’ll start out
by publicly disclosing, as you all know I am a member of the CPT editorial
panel, actually just leaving that assignment, but I think I am going to have to
recuse myself for the next two years anyway, as I understand it, by rules.

But I had a general question related to these mappings. I want you all of
course to observe that I work my day job for Kaiser Permanente, which is a
large health care organization that has taken what the Secretary has announced
starting in the year 2003 and 2004 as vocabularies that people ought to be
using in clinical information systems, and has been in the process of trying to
implement them, and overall successfully.

However, clearly one of the pieces that I think we are all concerned about
is the issue of mapping from clinical to administrative terminologies that has
been defined by HIPAA. I am obviously pleased that we are moving to the point
where we are getting draft mappings. From a business or organizational
standpoint, it is hard for me to feel the draft mappings are exactly
sufficient. I guess I am wondering, even if we have a final mapping, knowing
that in any organization in the United States that deals with health care there
is a big C word. It is called compliance. It involves CMS and it involves them
providing oversight to organizations that deliver health care.

I guess I am wondering, at the end of the day in this process, are we going
to have mappings that health care providers can rely upon as being compliant
with CMS in this process? Or even after it becomes a draft final mapping, is it
going to be able — NLM supports this, but CMS reserves the right to question
the validity of these mappings, et cetera. Can you comment on that, Vivian?

MS. AULD: Our goal is to absolutely have CMS approve these maps. So we have
had CMS in the room working with us at all of the kickoff meetings, with the
exception of the LOINC to CPT meeting, which took place in Chicago, so they
weren’t able to attend that meeting. But they are still working with us.

To this point, we haven’t had anything to give to CMS to look at. We
haven’t had any additional information to give to them, so once we have the
drafts, the very next thing I want to do is get CMS engaged again, so that they
can work with us to make sure that they are taking a look at the drafts, that
they are working with us to make sure that the validation process that we are
going through and the testing process that we are going through meets their
criteria, so that at the end of the day when we say that this map is final, CMS
can say yes, we agree that this is a final usable map.

DR. COHN: So talking about the one that is coming up, it sounds like next,
which is the ICD-9-CM mapping, how long will that sign-off take? You haven’t
even put the draft out yet.

MS. AULD: Probably the LOINC to CPT map will be the first one to come out.
I honestly don’t know how long it is going to take, because this will be the
first one. So we need to work to get the process in place more.

MS. GREENBERG: I am following up on this. Is it fair to say though that
this is going to be 100 percent without human intervention?

MS. AULD: What do you mean?

MS. GREENBERG: Can one expect that there would still be some need for human
intervention? There is the assumption that you have your data in SNOMED
automatically 100 percent of the time produced with the correct ICD-9-CM codes
based on the mapping.

MS. AULD: It depends on the type of map that we are creating. Some of the
maps such as the LOINC to CPT, these are concepts that it is very clear that
there is a one to one mapping. So there doesn’t need to be any human
intervention.

A lot of the SNOMED to ICD-9-CM, there is a certain decision that has to be
made on the tail end of that map. So there is still going to need to be some
intervention in that. So it just depends on the nature of the map.
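The distinction Ms. Auld draws, between clean one-to-one entries that can be applied automatically and entries with several candidate targets that need a decision on the tail end, can be sketched like this. The codes and map structure are illustrative only:

```python
# Sketch of the human-intervention distinction: a one-to-one entry can be
# applied automatically; a one-to-many entry must be flagged for review.
# All codes below are hypothetical placeholders.
MAP = {
    "L123": ["85025"],              # illustrative one-to-one (LOINC-to-CPT style)
    "S456": ["250.00", "250.02"],   # illustrative ambiguous (SNOMED-to-ICD style)
}

def translate(code):
    """Return (target(s), needs_review): review is needed when ambiguous."""
    targets = MAP[code]
    if len(targets) == 1:
        return targets[0], False    # safe to apply without intervention
    return targets, True            # a person or rule must pick the target

result, needs_review = translate("S456")
# result == ["250.00", "250.02"], needs_review == True
```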

DR. STEINDEL: I have a followup. I would like to understand a little bit
better the word approve, when you say CMS is going to approve this map.

I think there are two levels of approval. The first level of approval is,
it looks like a good map, yes, the SNOMED CT codes relate to these ICD-9-CM
codes, and everything is hunky dory. The other level of approval is, the first
level of approval plus, we will accept it without question for billing
purposes. Do you have any idea which level of approval CMS is going to use?

MS. AULD: Do you want to answer that, Karen?

MS. TRUDEL: I think I need to turn that back and ask for compliance
purposes what is needed.

MS. AULD: I think it is the second, so we are going to need to do whatever
it takes to make sure we give you the second, which is part of why I can’t give
you a clear answer on how long it is going to take, because CMS knows what we
are doing. They know that they are going to need to become re-engaged in this.

We are making this up as we go along, figuring out what needs to happen. So
we need to see what needs to happen before we make it so.

DR. STEINDEL: Thank you, Vivian, because that was the reason why I asked
the question. I wanted to make it clear that there was as part of the
validation step a lot of complexities, especially in regard to billing.

MS. AULD: Right. That is why we are putting these up as draft maps, so that
people can begin to make use of them, they can begin to test it, they can begin
to tell us what does and doesn’t work for them, so that we can adjust as we go
forward.

MR. REYNOLDS: I’d like to make maybe a layman’s observation, since I did
admit I was dumb up front on this.

MS. AULD: Please.

MR. REYNOLDS: As I have been on the committee, a lot of these subjects have
been brought in individually. We talk about standard product labeling, we talk
about this, so today I think may be the first step.

But I think if I was to summarize what I heard as a layman from your
discussion, it is that today marks the first time I have heard a real synergy
between the approach of all these different things, to where it is actually
being worked in a way — and I would like Randy to play off of that, and as we
hear CHI, to play off of that. There is a synergy. Then when you start putting
it on things like DailyMed, which get out in the light a little more, it looks
like more of a reality of a synergy of all these things coming together, so
that as you think of things like secondary uses of clinical data, and you think
of the actual administrative processes that go on in the country, it is really
starting to get some traction and come together. So they are no longer just
individual subjects. It is more of a capability and an interrelationship that
really makes this thing have some life, not just individual pieces and parts.
Is that a fair statement?

MS. AULD: That is a very fair statement. That is exactly why I wanted to
give these updates to you about what NLM is doing, because we are trying to
take these different things that you all have been talking about and
researching, and do what we can to put them into practice.

MR. REYNOLDS: I want to stop the discussion right now on this, because we
have got Randy.

MS. GREENBERG: Weren’t there a few of us who had questions?

MR. REYNOLDS: My only problem is, I’m not sure how we are going to get
through the presentations we have if we — okay, yes, if anybody feels they
have a significant question, I will continue the discussion.

MS. GREENBERG: I think this relates to the — because we have CHI here as
well, and I didn’t see on your list any plans — you mentioned other
vocabularies that might be in the works or that there was interest in, and I
didn’t see the international classification of functioning disability and
health as even on the list. That is a mapping that has been requested by — and
I realize you are not ready to do it right now, but I was wondering why it
wasn’t on the list, because it has been requested by the CHI group and by the
NCVHS, and I know there is international interest as well.

MS. AULD: It is actually a mapping that is in my head and that I haven’t
put on this slide. I should have put it on the slide and I will for next time.

MS. GREENBERG: Thanks.

MS. AULD: But before we work on that map, we need to get ICF into the UMLS,
which you know we are working on.

MS. GREENBERG: Yes. I just wanted to see it on the list.

MS. AULD: Yes, I apologize. It should have been on the list.

MS. GREENBERG: And Medcin isn’t in the UMLS either, apparently.

MS. AULD: Right, exactly.

MS. GREENBERG: Thank you.

MS. AULD: Fair question.

DR. STEINDEL: Vivian, I have two questions that are unrelated. First, I
would like a little bit of clarification on the ownership of the intellectual
property by AMA regarding CPT maps. If a user has a license to CPT, will they
be allowed to use the maps without additional payment to AMA?

MS. AULD: Yes.

DR. STEINDEL: So that is what you meant by that?

MS. AULD: Yes.

DR. STEINDEL: I just wanted to make sure there was no additional charge for
the use of the map. Thank you.

Then the other question is, one thing that I was interested in hearing
about that was not touched on is, does the Library have any indication of the
penetration of SNOMED since the introduction of the federal license? The only
numbers that you gave that might be vaguely related to that was from the annual
report data.

If I can comment on that, first of all, there was only just over 1400
people who responded to the annual report information. I also noted that over
half of them were first-year users. I will say that I have had a UMLS license
for I don’t know how many years, and I can tell you how many annual reports I
filled out. I think it was one my first year. So that gives an indication of
what I consider to be somewhat the bias of this report.

MS. AULD: Yes.

DR. STEINDEL: The second is, I noted there was only 26 percent that were
commercial, 371 which comes out to be the number from that, and I have
absolutely no idea of how representative that is of the number of people who
are developing terminology systems. That is an extremely important thing to
know with regard to the license.

Then the other comment is, when we look at the breakdown of what — when
you specifically ask what terminologies you use, 24 percent which sounds like a
very good number use SNOMED CT only. But then if you take a look at it from a
numerical point of view and what I have stated, and you shook your head in
agreement with regard to bias, that came out to 342 people.

So I have no idea of what the indication of SNOMED CT penetration is from
this data. Does the Library have any indication on that at all?

MS. AULD: No.

DR. STEINDEL: Thank you.

MS. AULD: I’ve told you everything that we know at this point,
unfortunately.

MR. REYNOLDS: Vivian, nice job, very helpful, even to those of us that
didn’t understand all the details. It really was very helpful.

MS. AULD: Thank you.

MR. REYNOLDS: Our next presenter is going to be Randy Levin, who is going
to update us on the FDA structured product label. Randy, you have the benefit
of all the questions everybody asked Vivian, so you ought to be able to just
smoothly transition and have no questions.

Agenda Item: FDA Update on the Structured Product
Labeling

DR. LEVIN: Right. First, I want to just thank you for inviting me to give
this presentation update on the structured product labeling and medication
terminology work at the FDA. I guess we are holding questions until the end, is
that right?

MR. REYNOLDS: Yes, we will try again this time.

DR. LEVIN: I will try to go over three areas. One is the medication
terminology standards work at the FDA and what FDA is trying to contribute
through this effort. Second is going in a little more detail on the structured
product labeling and what that is and what that holds. Then to finish up with
an update on the FDA systems that we are developing or using for medication
terminology standards.

First I am going to go over our collaborators on this work. As Vivian
pointed out, the Agency for Health Care Research and Quality has provided
funding for a lot of the work that we are doing. Again, without that funding we
wouldn’t have made as much progress as we have made.

We also are of course working very closely with the National Library of
Medicine, as Vivian was pointing out with the DailyMed and the collaboration
with RxNorm. We work with the Department of Veterans Affairs. They have
contributed terminology to the work that we are using in the structured product
labeling, and working with Mike Lincoln and Steve Brown for a number of years
on this project.

Another collaborator is the National Cancer Institute, which provides a lot
of our terminology services and helps us with the medication terminology.
Margaret Haver in the audience here has been our contact at the National Cancer
Institute.

We also work with the United States Adopted Names Council, USAN, for
various standards, as well as the United States Pharmacopeia. We also have been
taking our work to this effort to an international level, to try to bring the
pieces together, Harry, as you were saying, to try to bring them together on an
international basis. There is an international conference on harmonization for
registration of pharmaceuticals for human use. It is called ICH. We have been
working in that group to try to harmonize on all these terminologies that I am
going to be talking about.

Finally, we use the terminology services at the NCI, the Enterprise
Vocabulary Services, for our terminology.

Here are the main terminologies that we have been working on to support the
efforts that you have been talking about in this committee. Again, coming back
to bringing the pieces together, this is what we have been hearing to help
support the efforts of the various electronic and health care and IT
initiatives. We have been working very hard to develop and promote these
standards. The standard for ingredient, the ingredients in products, the
ingredient strength unit — I’ll be going over these in a little more detail —
dosage form, route of administration, package type, package unit, product,
package product and generic drug. So the terminologies for each one of those,
and I’ll just go into a little more detail on each one of these.

For the ingredient name, we standardize the names for ingredients. We work
with the USAN Council and the FDA to standardize the name for ingredients. Our
terminology is in the FDA substance registration system that maintains the
terminology and the codes for each ingredient that we regulate.

I gave some examples here. We would have ampicillin. That would be the
established name, and then we have a unique ingredient identifier for
ampicillin, which we provide. But we also provide things like an ingredient
identifier for ampicillin sodium.

Also, again just to try and put some pieces together, and maybe Beth will
talk about this later when we talk about the allergies group, but we have been
working with the Environmental Protection Agency to try to collaborate with
them to use this terminology. They have a lot of the substances, environmental
substances, trying to collaborate with them on this to unify the terminology.
We also have been working inside the FDA with not only drug products, but food
products as well, food substances, I should say.

Another terminology is ingredient strength unit. We were the authority for
that terminology, and the terminology is in the NCI thesaurus. So there would
be a strength unit like milligram, and there would be an appropriate code for
that in the NCI thesaurus. But strength unit also goes to things like a tablet
as a unit, so if you are writing an electronic prescription you say, take two
tablets, that is a unit and we have terminology for that unit. If that is used
in electronic prescribing, the same terminology we are using in the structured
product labeling, it comes back to your point of trying to bring these things
together. So we purposely put the strength unit in the structured product
labeling, so that it can be used in other sources like electronic prescribing.
We talked about this at a previous meeting, about the possibility of using this
terminology for the electronic prescribing.

Dosage form, another one, a terminology that we maintain inside the NCI
thesaurus. An example would be a chewable tablet as a dosage form. That is also
given a code. That is in the NCI thesaurus.

Going on with the terminology, we have route of administration that the FDA
maintains. These are all used in the structured product labeling, used by the
manufacturers. It is maintained in the NCI thesaurus.

Package type, another one, is in the structured product labeling, used by
the manufacturers. Fits in if you are doing electronic prescribing, how these
products can fit together from the manufacturer to the label to the
prescription to the dispensing unit.

We also have package quantity units, which is going down to the tablet and
capsule and the terminology for each one of those.

The other terminology standards in structured product labeling include the
product itself. So we have a terminology for not only the product name, but
also a code for every product. This is based on our national drug code system
that is regulated by the FDA.

Then down to each package product, so the product may be for example
Sinemet 25-100. It has its own code. If you package that in bottles of 100, it
has a specific code, and that is again part of the national drug code.

Finally, the generic drug. The name is something that we work with the
manufacturer to have a generic drug name. For example, for Sinemet it would be
carbidopa and levodopa.

All these terminologies are used in the structured product labeling. The
structured product labeling is a standard for exchange of medication
information. It is an HL7 standard based on version three, the reference
information model. It is based on the clinical document architecture.

It includes three areas. One is the content of labeling, so all the
narrative text of the labeling that you see in a package insert would be in the
structured product labeling. This describes how to use the drug and when to use
the medication. Then each of what we call drug listing data elements are coded
in the structured product labeling, so that gives you information about the
ingredients in packaging and all the codes that I described are in the
structured product labeling.

Then finally, we are adding now something called highlights data elements.
This is based on a new regulation that we just finalized and that will go into
effect this summer, where we will provide a structured format for information
about indications for the drug, the drug class, drug interactions and adverse
reactions.

The drug listing data elements I have already gone over include the
products, so you have the name and both the brand and the generic name and
codes, ingredient, and that is both the ingredient and the active moiety, so
that is the ingredient with and without the counter ion, and the codes, the
ingredient strength, dosage form. Something I didn’t go over earlier, but also
the appearance of a solid oral dosage form. So if something is imprinted on it,
what the color is, the shape, the size, if it is scored, if there is coating,
if there are any symbols on it. These are pieces of information that are used
by poison control centers and other ones to try to identify what medications
people are on without having the actual labeling.

Route of administration, also the DEA schedules are part of the drug
listing data elements. As far as packaging, the packaging quantity and the
packaging typing code.

I am just going to show you an example of a structured product label just
to help in showing what this is. This is a structured product label. Here is
all the text that you would have about the label. This is what you would see in
a package insert. Then we also include the data elements, the things I was
talking about. This has a product code, it has a dosage form, it has the route
of administration, it goes through the active and inactive ingredients, the
strength of the active ingredients, the appearance and the packaging
information.

Then this is the human readable part. It is also all coded in XML, so this
is the source of that document we were just looking at. It is all XML, so that
computer systems can pull off that information and use it.
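The idea that the coded data elements travel inside the XML alongside the human-readable text can be illustrated with a toy fragment. Real SPL is an HL7 v3, CDA-based schema with namespaces and a much richer structure; the element names below are simplified stand-ins, not the actual schema:

```python
# Toy illustration of a computer system pulling coded data elements out of
# an SPL-style XML document. The element and attribute names are simplified
# stand-ins for the real HL7 v3 SPL schema.
import xml.etree.ElementTree as ET

spl = """
<document>
  <product>
    <code system="NDC" value="0000-0000-00"/>
    <name>Example Drug</name>
    <formCode displayName="chewable tablet"/>
    <routeCode displayName="oral"/>
  </product>
</document>
"""

root = ET.fromstring(spl)
product = root.find("product")
ndc = product.find("code").get("value")         # the product's listed code
form = product.find("formCode").get("displayName")
route = product.find("routeCode").get("displayName")
print(ndc, form, route)  # -> 0000-0000-00 chewable tablet oral
```

As Dr. Levin notes for the website, whether a given code is shown to a human reader is just a matter of the style sheet; the codes are present in the XML either way.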

MR. BLAIR: Randy, did you say the codes were actually in the SPL?

DR. LEVIN: Yes.

MR. BLAIR: On the current website, the library?

DR. LEVIN: Yes.

MR. BLAIR: I don’t see them.

DR. LEVIN: When you are viewing the information on the website right now,
what you are looking at is the human readable part. This part that I showed you
on this example here is part of the style sheet, to view the codes
specifically. So you would see the code here, it gives the product code and it
gives the national drug code on here as well.

MR. BLAIR: Thank you.

DR. LEVIN: In the labeling is also codes for the ingredients, but we did
not display them. But they are coded in this structured product labeling. They
can be displayed. It is just a matter of changing the style sheet for that.

Again, the highlights data elements. This is the result of a change in our
labeling regulations that require highlights. This is key information about the
drug to be presented up front in the labeling. This is a major change in our
regulations. It is going to be implemented over a number of years. What we will
do electronically for this is encode certain information that comes from this
key information, like what is the class of indication, where are some of the
limitations of use of the drug, what is the pharmacological class like
mechanism of action or physiologic effect or chemical class. That will also be
important when we talk about allergies. Key interactions with drugs and foods,
what are the contributing factors, what is the consequence of an interaction,
and key adverse reactions, the most common of your reactions. So these would be
coded similar to what we did with the drug listing that would be provided with
the structured product labeling.

We have three systems that are in operation or being developed that will
help us to accomplish this and keep the terminology up to date, so that we can
keep the structured product labeling flowing to the DailyMed and National
Library of Medicine, as Vivian was pointing out.

One is the FDA substance registration system. Another is the electronic
information processing system, and the last is our electronic drug listing.
Again, a lot of the system developments, a lot of the funding, at least on what
we have done up to date, has come from AHRQ, so just to acknowledge that that
has helped us to get to where we are.

The FDA substance registration system as I mentioned before is a computer
system for generating these unique ingredient identifiers, or UNIIs. We base
the uniqueness on the molecular structure of the ingredient, if there is a
structure known. If not we rely on identifying nomenclature and other processes
and definitions to uniquely identify each substance. This goes not only for
drug substance but food substances as well.
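The registration idea described here, keying uniqueness on molecular structure when one is known and falling back to defining nomenclature otherwise, can be sketched as below. The identifier scheme, the hashing step, and the structure string are all invented for illustration; this is not the actual UNII generation algorithm:

```python
# Rough sketch of the substance-registration idea: the same defining key
# (structure if known, otherwise a defining name) always yields the same
# identifier. The derivation scheme here is invented, not FDA's algorithm.
import hashlib

def register(structure=None, name=None):
    """Assign a short stable identifier from the substance's defining key."""
    key = structure if structure is not None else name
    if key is None:
        raise ValueError("need a molecular structure or a defining name")
    return hashlib.sha1(key.encode()).hexdigest()[:10].upper()

# Same structure -> same identifier, regardless of what the substance is called.
a = register(structure="C16H19N3O4S")
b = register(structure="C16H19N3O4S")
assert a == b
```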

Then right now, the status of that system is in operation. We are importing
the unique ingredient identifiers for the active ingredients for approved
prescription drugs into that system. We have most of them in the system
already. We have many of these ingredient identifiers posted through the NCI
thesaurus.

The next steps for this system are to bring in the active ingredients from
non-prescription drugs and from inactive ingredients. As I said, we are working
also with the EPA and our Center for Foods to bring in food substances and
environmental substances.

The electronic information processing system is our system that we use to
manage the structured product labeling. It is also in operation. We are in the
midst of importing this structured product labeling from all currently marketed
approved prescription drugs. We have been working with the National Library of
Medicine with DailyMed. Once the label goes through this system, then we send
it to the National Library of Medicine when they post it on the DailyMed. They
post the actual SPL, the structured product labeling XML file, as well as the
view of the file, as Steve was just pointing out.

We expect to have the legacy data for the approved prescription drugs into
our system in December of this year, and we will be sending them to the
National Library of Medicine. Right now we have about a thousand structured
product labeling in our system.

The next step, we also use the system to manage any changes for the
approved prescription drugs, so this way, when any labeling gets changed, we
can very quickly send it over to the DailyMed and have it updated. We want to
move the information over daily to the National Library of Medicine; that is
why it is named DailyMed. Our next step is to import and manage the
structured product labeling for all marketed drugs.

Last, we have the electronic drug listing system. This is a computer system
that we will be using to manage the inventory of all drugs marketed in the
United States. This includes drugs, biologics and animal drugs as well. So we
are right now developing the system. We are re-engineering our process from a
manual paper based system to an automated electronic based process, so that
takes some doing. Our next steps are to develop and implement the system. We
need to update our regulations to coordinate the process to bring the whole
process into an automated state. So those are our next steps there.

We divided the system development into two phases. The first phase was the
data collection for the approved prescription drugs, which was funded by AHRQ,
and we completed that in — it says October 2006, but it was October 2005 when
we completed it. So that is in effect and completed. It was on time and on
budget. My IT people wanted me to say that.

Then our phase two is to do the automated drug listing system. For funding, I
have AHRQ there, but with a little question mark by them; we have not heard
yet from AHRQ. With the funding, though, we would see completion of this
within the first quarter, the first part of 2007.

The regulation changes are to bring the drug listing process to an automated
state — to bring it into the current century. We need a change in our
regulations, which are quite old. Our planned proposed rule change release is
in the third quarter of this year, and if everything works just right, we are
looking at going with the final rule in 2007.

Our estimated time lines: assuming that we have funding and we get the
regulation process in gear, at the end of Q3 of this year we would have the
proposed change in drug listing regulations. By the end of this year we will
have the structured product labeling available for all approved prescription
drugs with all the drug listing coding. By Q1 of 2007 we would have
implemented the automated drug listing service, and by the end of 2007 we
would have the final regulations supporting that service. By the end of the
following year we would have the structured product labeling for all the
marketed drug products with the drug listing data elements.

Those time lines, especially for the regulations: my regulatory people say
that everything would have to go just perfectly, but those are the time lines
that are possible.

DR. LEVIN: Environmental substances, these are things like the pesticides
and other things that people can be exposed to, not the foods and drugs.

MR. REYNOLDS: You are going to talk a little bit more about that?

DR. LEVIN: Yes.

MR. REYNOLDS: The second question I have is, as you start — when you
talked about the highlights of your data elements on your slide ten, listing
things like key adverse reactions and so on, as you think of e-prescribing as
it starts to become more of an issue, and as you look at your timings up here,
the coordination between the drug knowledge databases, PBMs, structured product
labeling and so on I would think would be critical, because more and more this
stuff is going to be done in real time directly with doctors.

So as you are coming out with new interactions, how is all this stuff —
back to my earlier question to Vivian and others, these are exciting things,
but then how they get into the flow of what is happening on a day to day basis,
and does it or doesn’t it affect life situations?

DR. LEVIN: First of all, just to mention about the highlights part, it is
based on new regulations and implementations over a long period of time. So you
won’t see that for all the products for awhile. The newer products will have
the highlights before the older ones.

The way the process currently works for new adverse events or interactions
that you should know about is that the FDA works with the pharmaceutical
companies, and the information is put into the labeling. But in the paper
world, the labeling trails; it takes a while to get out and reach everyone.

With this, once the manufacturer and the FDA get the information, it goes into
the labeling in real time. Then once it is updated, we give it to the National
Library of Medicine. Once a day we would make that update, and it goes to the
National Library of Medicine.
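A rough sketch of that once-a-day hand-off, with invented queue and function names (nothing here reflects the FDA's actual system):

```python
# Hypothetical sketch of batching label changes for a daily transfer.
# The staging list and record layout are illustrative only.
updated_labels = []  # labels whose content changed since the last push

def record_update(label_id, new_text):
    """Stage a changed label for the next daily transfer."""
    updated_labels.append({"id": label_id, "text": new_text})

def daily_push():
    """Send everything staged today and clear the queue."""
    sent = [label["id"] for label in updated_labels]
    updated_labels.clear()
    return sent

record_update("spl-001", "new interaction warning")
record_update("spl-002", "revised dosage section")
print(daily_push())  # ['spl-001', 'spl-002']
print(daily_push())  # [] -- nothing left after the push
```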

One of the things the FDA was thinking about with this whole process is the
collaborations: the pharmaceutical companies and the FDA collaborating on the
labeling, the National Library of Medicine providing the place to download the
information, and the information suppliers picking up this information and
passing it on to the consumer. So really it is a collaboration with the
information suppliers. We don’t expect DailyMed to be the only place to get
this. Once you get it in SPL, it is all electronic. We hope that the
information suppliers will take it from there.

MR. BLAIR: I don’t have a question, I just have compliments for you. It has
been three or four years that we have been receiving testimony about the
potential of structured product labeling. During a lot of that time it was
difficult to get our questions answered, but you sure have answered them
beautifully today. Thanks to AHRQ for providing the funding, and to NLM for
being able to produce these structured product labels along with RxNorm. So
many of the things that we have been looking for for years, you have just now
listed on a timetable. I just wanted to say thank you.

DR. LEVIN: Thank you. This committee certainly has helped a great deal too,
with your recommendations to move these things forward.

DR. STEINDEL: Harry, you asked my hardball question. So we will move it
down to the two simpler questions. One is a question and one is a comment.

The first question is, when you say all drug products marketed, does that
include stuff like herbals that are not regulated by the FDA but come under
your purview?

DR. LEVIN: When we say drug products, it is as defined by the FDA, so that
means drugs. That is a special qualification: you are able to market your
product as providing some response to a condition, as defined in the Food,
Drug, and Cosmetic Act.

Now, botanicals can be drugs if they show that they are safe and effective
for a specific claim. Also, you are bringing up the issue of maybe dietary
supplements, which the FDA regulates as well, but it is not in the same way as
they regulate drugs.

DR. STEINDEL: Would they be included?

DR. LEVIN: They would not be included in this. They are not included in our
regulations. We have the unique ingredient identifiers for botanicals, and we
want to include food supplements and dietary supplements in our identifiers,
but they are not under drug listing regulations.

DR. STEINDEL: So we still have the potential gap that this committee has
heard about before, about a lot of the alternative medicine substances.

DR. LEVIN: The FDA does regulate that, but different than drugs. The
systems and the structured product labeling in the systems can work for those
products. It is just that the drug listing regulations do not cover those
products.

DR. STEINDEL: Thank you. The second one, as I said, was a comment. I think
you alluded to it a little bit. One aspect, phase two of the electronic drug
listing, was dependent on funding from AHRQ, and you weren’t sure about that
funding from AHRQ. What is the FDA doing about absorbing this funding
internally as part of its line of business?

As we all know, federal agency needs change, and we don’t know whether
tomorrow AHRQ will have the capability of providing funding like this, so it
would be much more appropriate if it were internalized.

DR. LEVIN: Yes, and we are working on that.

DR. STEINDEL: Thank you.

DR. WARREN: You had up your website for the structured product label. Is
that website available? Is that how people can come in and find out what is on
the label?

DR. LEVIN: What we currently do is our collaborations with the NLM. So we
take these structured product labeling and hand it to them, and they put it on
the DailyMed website, which is their website.

DR. WARREN: So what you showed us during your presentation was on the
DailyMed?

DR. LEVIN: That came right off the DailyMed.

DR. WARREN: Because I am getting questions from people I work with back
home about how can we keep up with all the stuff that is going on. So I just
wanted to be sure I knew where you were pulling that from.

Are you going to have any information on the FDA website that will point
them towards the DailyMed for this information?

DR. LEVIN: At the FDA we are an information supplier as well. We are going
to have our own website with the information that we feel is important for the
people that come to our websites to look for information.

DR. WARREN: I have just been cruising, and all I could find were
PowerPoints.

DR. LEVIN: Right now you go to Drugs at FDA, you go to the CDER website. But
we are looking to supply this information ourselves as an information
supplier. The process is: manufacturer, FDA, SPL, hand it to the National
Library of Medicine. That is the one place everyone can go and get the
up-to-date information. Then we are going to supply that information in a
format that we think benefits the people looking at our website, and we are
hoping other information suppliers will use it for their websites.

DR. WARREN: I just think the work that has happened over the last couple of
years has been phenomenal, so I am with everybody else. It is really great
listening to this presentation and hearing so much of what we have envisioned
coming true. So I just want to be sure that when we talk about this to people,
they can come to either NLM’s website or FDA’s and find it and go, wow. Thank
you.

DR. LEVIN: Again, the thanks come back to the committee, because with the
support that you have given these things, we get funding or we get people to
pay attention to the regulations. We base our progress on that. There is still
progress to be made, though.

DR. STEINDEL: This is kind of a PR followup. Randy, you mentioned during
your talk, you had about one thousand structured product labels that are ready
to transfer to the National Library. If you actually go to the Library’s site
right now and you look at those ten that are there, your comment to yourself
is, gee, this is interesting, but ten doesn’t do me any good. I would suggest
getting the next thousand over and doing it in batches.

DR. LEVIN: The process we are following over this first year is to pull in
our legacy data. So we are checking every one to make sure it is correct. It
takes us a little while to get that in. Our goal is to have those labels in by
the end of the year.

DR. STEINDEL: My suggestion is, feed it in pieces so people see that it is
growing. That is just PR.

MS. FRIEDMAN: This is a followup on a comment you made, Randy, about the
support that NCVHS has given to help move this along. Is there anything else we
can do for you?

DR. LEVIN: We still have work to do. There is still work to do. I don’t know
what else I can tell you. We have a regulation, as I said here, that will
bring our drug listing process into this century, but we need to have that
regulation in place to get to the dates that we are talking about here. That
means a lot of cooperation from a lot of people to move that process along.
Then there is the funding that we talked about.

MS. TRUDEL: My question builds on Steve’s comment. If one thousand is the
numerator, what is the denominator?

DR. LEVIN: It depends on how you — because drug companies can put a number
of products in one label. I think Vivian brought up 4500, but it could be even
more than that, depending on how they package their labels together.

The thing is that at the end of this year, the complete set for approved
prescription drugs is what we are looking for. An individual SPL might be
packed full of many products, or one SPL might have only one product, so the
number of actual products varies. But there will be over 10,000 products that
we will have out there.
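The numerator/denominator distinction can be illustrated with a toy count; the label contents and numbers below are invented:

```python
# Hypothetical sketch: one SPL file may cover several products, so
# the label count and the product count differ. Data invented.
labels = [
    {"spl_id": "a", "products": ["Drug X 10mg", "Drug X 20mg"]},
    {"spl_id": "b", "products": ["Drug Y"]},
]

label_count = len(labels)
product_count = sum(len(label["products"]) for label in labels)
print(label_count, product_count)  # 2 3
```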

MR. REYNOLDS: Randy, thank you very much, nice job. We are back on
schedule, that’s good, and nobody else had their hand up. So Beth, if you will
please join us. Thanks for joining us, and we look forward to your update.

Agenda Item: CHI Update

MS. HALLEY: Chairman Reynolds and Blair, thank you so much for this
opportunity to present you an update on the CHI efforts that have been ongoing.

I would like at this time to introduce Dr. William Heetderks from the
National Institute of Biomedical Imaging and Bioengineering at NIH. Dr.
Heetderks has acted as the co-chair for the multi-media work group that has
been in existence for almost three years, so he has a lot of patience and a lot
of perseverance, and we appreciate him joining us today.

What we would like to do today is review a presentation of the final draft
report from the multi-media work group, looking for your endorsement of their
recommendation. A copy of the report is with you as well. We also would like
to talk about several revised CHI reports; we have had a request from AHRQ,
and we are going to go through those revisions. They have requested revisions
to already adopted reports. We would like to present an update on our allergy
work group and our disability work group, talk a little bit about some
implementation guideline templates that have been recommended to CHI, talk a
little bit about some collaboration activities with some registries and some
efforts we have been working on with the Office of the National Coordinator
for Health Information Technology, and we will talk about some of the next
steps we see for CHI.

In terms of the CHI work groups, I do want to mention that in addition to
Dr. Heetderks, Dr. Dick Swazha, who is also with NIBIB, has recently returned
to a position at Oak Ridge National Labs. He has also continued as the co-lead
of this work group.

In terms of the AHRQ reports, I just want to mention that Mike Fitzmaurice,
who is not here today, is the requester of the revised reports we will
discuss; I know that he is a member of your organization. I would like to
mention that Lenora Barnes and Marsha Ensley from the VA have been our active
co-chairs for the allergy work group, but I would be remiss without mentioning
the efforts of Randy Levin and Captain Bill Hess, who have been very
instrumental in our efforts, as well as Dr. John Kilborn from RxNorm, who is
also in the audience. In terms of disability, Dr. Lawrence Dessey from the SSA
and Jenny Harvell from ASPE were not able to join us today, but I would like
to recognize the effort of Marjorie Greenberg, who has also been a very active
member of our group. So if there are questions I can’t answer, I’ll be sure to
ask Marjorie.

The first thing and probably the most important thing we would like to
review with you today is the final draft of the multi-media report. There is a
copy in front of you. As I mentioned, Dr. Heetderks and Dr. Swazha are the
co-chairs, but we have also had significant input from NIH, NLM, VA, DoD, the
FDA, CDC and CMS. I also would like to recognize Peter Kuzmac, who has just
joined the meeting, from the VA, who has been very instrumental from the
technology standpoint, very involved in the VA/DoD efforts to bring this
recommendation forward.

Let me just step back one second and refresh your memories. The multi-media
work group did present a preliminary report to you in December of 2003. There
were several recommendations for the work group to consider. Since that time,
there has been a tremendous amount of collaboration among all the groups I
have just mentioned to bring forward what they feel is a very substantial
recommendation.

The scope of that recommendation — and the next few slides I am going to
go through are excerpts from your report. I just want to highlight a few
points. This is a format that CHI has presented to you before, so I just want
to go over a few of the highlights of the report that is in front of you.

The primary application of the standard is for combining data from multiple
media, including images, photography, audio, video, faxes, et cetera, into
the patient record, with the objective of assuring interoperability and
information exchange among the federal agencies. I think that has been a goal
of the CHI work groups across the board.

As I mentioned, the process started with the preliminary report presented to
you in 2003. These are the steps that have been taken in the last several
years to get us to this point today. There was an initial comprehensive review
of all the potential standards and a list of the general data types and
subtypes used in the patient record. They performed a comparison of the
remaining standards against the checklist that was created and reinforced by
this committee, and addressed several issues about completeness and options to
accommodate gaps, which I will talk about in just a minute; these were what
this group had suggested the work group go back and review, and come up with
some suggestions and methods of addressing those gaps.

The other effort that was attempted, and was successful, over the last year
or so was to have the CHI messaging work groups take a look at this
recommendation and ensure compatibility from the messaging standpoint. So you
will see in the full report in front of you all of the members of this work
group, and it does include members from the messaging work group as well.

There were technical gaps that they looked at, including a DICOM proposal to
address three-dimensional data elements, which has been issued for
registration and fusion of 3D-type data. This is a gap that the work group has
identified.

In terms of a security gap, there are security-related issues being addressed
by DICOM. By the way, I am not even sure I started this by saying our
recommendation is for DICOM, and I apologize for that. That is probably a very
significant point: DICOM is the recommended standard, and it was the
recommended standard that was presented to you as the preliminary
recommendation.

In terms of the security, DICOM is looking at user identification and
passwords, information access restriction and some VPN issues. These are being
addressed by DICOM. Then there is some continued implementation work with the
audio, video and waveform data. If there are questions from a technology
standpoint, I am happy, when we finish, to turn this over to Peter Kuzmac, who
may be able to answer any questions you may have.

The obstacle related to these gaps, and also from a broader perspective, is
that acceptance and implementation of this standard by all biomedical
manufacturers will take time. In addition to that, substantial time may be
required for an ongoing — and what we will see in a minute is, for systems
that are not DICOM compliant, how we are going to handle that during this
transition period. Maybe the vendors have not addressed DICOM in their
systems, and we will take a look at that. Addressing that has been
incorporated into the report.

The recommendation, as I mentioned, is to determine both the storage and
exchange standards for multi-media information, using the DICOM standard.

In terms of addressing the gaps, what the work group did, after many months
of collaboration and consensus, was develop six scenarios for transition
situations where systems may not currently be compatible with DICOM. So you
will see in your report six scenarios broken out, and ways to address those
six scenarios in a transition time frame. One scenario, which is the most
hopeful, is that both systems are DICOM compatible, so there would not in the
short term be any conflict there. The others cover a situation where a system
may be both HL7 compliant and DICOM compliant, a situation where both systems
are compliant but one might be HL7 compliant and one non-DICOM compliant, a
DICOM system exchanging with a non-HL7, non-DICOM system, and a DICOM,
non-HL7 system exchanging with a DICOM and HL7 system. That may sound a
little confusing, but each of those six scenarios is broken out in your
report, and the ways the committee is addressing them are also written up in
your report.
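As a loose illustration of how such pairwise compatibility scenarios reduce to a lookup on each side's supported standards (the routing choices below are invented, not the work group's actual prescriptions):

```python
# Hypothetical sketch: choose an exchange route for multi-media data
# from the standards each system supports. Routes are illustrative.
def exchange_route(sender, receiver):
    """Pick a transition-period route from each side's standards."""
    if "DICOM" in sender and "DICOM" in receiver:
        return "native DICOM exchange"
    if "HL7" in sender and "HL7" in receiver:
        return "wrap media in HL7 messages"
    return "conversion or manual transfer needed"

print(exchange_route({"DICOM", "HL7"}, {"DICOM"}))  # native DICOM exchange
print(exchange_route({"HL7"}, {"HL7"}))
print(exchange_route({"DICOM"}, set()))
```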

The bottom line: the work group is coming forward to you with a
non-conditional recommendation for DICOM.

I think that’s it.

MR. REYNOLDS: Could you help me with what non-conditional means?

MS. HALLEY: As a matter of fact, this is the first report coming to you
since you adopted the 23 vocabularies and four messaging standards, so the
process may not be clear. In the original process, there was the ability to
have a full adoption, a conditional adoption and, I believe, a provisional
one. In a conditional situation, the work group would have to list the
specific conditions that had to be met for you to approve the recommendation.
In this case, by addressing the gaps up front in the report, the work group
feels that DICOM is a valid recommendation.

Do you have any questions specific to this? Would you like me to continue
the presentation for the other areas and we can come back to this?

MR. REYNOLDS: I don’t know. You are asking us to approve it?

MS. HALLEY: That is correct. In the steps for CHI adoption, we have taken
this report to the CHI work group and it has been adopted and approved by them.
We have presented it to the FHA, which CHI is now part of the federal health
architecture. We have presented it to the lead partners. The next step in the
process is to bring it to your committee for endorsement. I believe as an
outcome of that, at least in phase one, this committee presented an endorsement
letter to the Secretary of HHS.

MR. REYNOLDS: That was going to be my next question. So once we endorse it
today, —

MR. BLAIR: I don’t know how we can. I know I have not had a chance to read
it and think about it. So I think it is something where — I certainly don’t
have any objection to it, and I am really pleased and excited, but my thinking
is that we need time to read it and discuss it, and maybe in our April session
we can do a vote to indicate our support. I would be hesitant to —

MR. REYNOLDS: What does that do to your time frame, Beth?

MS. HALLEY: We are following your time frame.

DR. WARREN: I would just like to support Jeff. There are several of us on
the committee that were not present when all of this started. I would like to
review the minutes of 2003 when all this work started, as well as read the
report, so that we can do you justice on the recommendations. So either we
schedule some time in the April meeting, or we do a conference call or
something.

MR. REYNOLDS: Is April okay with you? Then I would definitely like to set
discussion time.

MR. BLAIR: This does not in any way reflect a hesitancy to support it.

MR. REYNOLDS: No, it does not.

MR. BLAIR: It is just that we really need to make sure that it is an
intelligent vote.

MS. TRUDEL: I just wanted to point out from an historical process
perspective that that is a perfectly sensible approach. What we did in the
last round was provide the reports prior to the presentation, so that the
committee members had a chance to actually read them.

MS. GREENBERG: I just wanted to say, the letter will actually come from the
full committee; that is in late June at their meeting. But on the other hand,
if that creates a problem for you, once the committee has read and discussed
it in April, your recommendation can probably be approved in a conference
call.

MR. REYNOLDS: One other question I had. It says non-conditional from the
people that were involved. Would there be any other entities that would
dissent from this recommendation? In other words, if you step back from your
own work, are there entities that would object when they hear this? You are
here today, so we are hearing from you, and we know you went through this
process, but some of us weren’t involved here in 2003. So I want to make sure
we always ask the question.

So step back away from your own work. Is there any reason that somebody
else in the entire environment would say, I don’t know if we can support that,
I don’t know if that is going to hurt us, do this, do that. That would be
helpful for me today even before I read the report, to have you step back — or
anybody on the committee that knows enough about it, or in the audience or
anything else, I think that would be very helpful.

Steve, do you have a comment on that?

DR. STEINDEL: Harry, if I can address that, because that is in one sense a
general question, and not totally specific to the multi-media report.

The recommendation being brought forward in this fashion from CHI as an
approved document by CHI and upstream within the CHI approval structure
indicates that there is complete federal consensus for this approval. So the
federal process says, we are happy with coming forward to you saying this is a
non-conditional recommendation.

The reason we decided very early in the CHI process to involve NCVHS is
because the people that made this decision were federal. So we are going
outside to the NCVHS as our outreach to say, we are now presenting this to the
public, —

MR. REYNOLDS: That is why I’m asking the question.

DR. STEINDEL: — and are there any — and so it is actually coming back to
this subcommittee and asking the question of you.

MR. BLAIR: Let me add a little piece of history also. It was approximately
two years ago when there was quite an array of standards that CHI was
considering. There was a nucleus of them which were things that NCVHS had put
forward, the PMRI standards and the clinically specific terminologies, the
core standards. Then they went way beyond that with a lot of other important
terminologies and standards that we hadn’t been able to deal with yet.

When they reviewed them with us, there was a large body that we were able to
go through and approve. The one where we didn’t feel comfortable was
multi-media. We raised a number of concerns and questions, and apparently they
have now come back and basically said they have done their homework and built
consensus.

So it is my expectation that when we go through this, we will probably feel
comfortable. But we do need to go through it.

MR. REYNOLDS: But playing off of Steve’s comments, is it part of our
process then, Marjorie, to make it clear that this will be on our April agenda,
make it clear that we have time, open mike or whatever, if people do have
dissenting views?

MS. GREENBERG: Yes.

MR. REYNOLDS: That we could make sure that that is available?

MS. GREENBERG: Yes, that is very appropriate. A question I had was, I assume
obviously you have worked with DICOM in developing this, since in the CHI
process they are the proprietor of the standard. But in the CHI process it is
difficult for the federal group to do outreach. So as Steve said, that is the
role here.

But is this report available in a form that if there were others outside of
the federal enterprise, recognizing that this is for exchange of information
within the federal enterprise, but of course it has implications more broadly,
as all the CHI standards do, would this report be available to others outside
of the federal enterprise?

MS. HALLEY: That is a good question, and I hate to answer it
inappropriately. I do know that if this report were approved, it would be the
actual report that becomes the public document on the CHI site; it was on the
OMB site, and now HHS has taken over that site. So this would be the exact
content that would be put on the website.

In the interim, I think historically — and Karen, you can help me, too —
the reports were considered private until they were endorsed, and then they
became public.

MS. GREENBERG: There is a synopsis though that was publicly available, as I
recall.

MS. TRUDEL: Yes. But let me step back with my historical hat on again. One
of the things that we did in the first round that we have not done this time
is, we did two briefings for NCVHS. One, when the actual working group felt
comfortable that they had a proposal to step forward with. That part we did not
do this time. Then we did a second briefing after the partners council had
actually approved the recommendation.

The other thing I want to point out, especially for the people who weren’t
here last time, unlike e-prescribing and HIPAA, the purpose of the NCVHS focus
was not to get complete buy-in from the industry, but simply to have a touch
point so that people could tell us if the federal government was veering off in
a direction that was totally contrary to everybody else. So there is not as
much of a requirement of complete consensus as there was in e-prescribing, I
would say.

Perhaps the thing to do would be to take the — yes, we did start with
scrubbed versions of the report last time. The full reports were never made
public, because there were proprietary discussions about various candidate
standards that we didn’t think were appropriate to air.

But if you could prepare a synopsis and place that on the NCVHS website
prior to the April meeting, I think then you have got the ability to get that
kind of buy-in before there is a vote.

MR. REYNOLDS: Yes, because otherwise we would almost have to set the whole
hearing aside, have you go through it step by step, and have the whole world
show up to decide what they think. Karen, see if you agree with this, and
Marjorie. If we post whatever version Beth prepares, we then have the time set
aside for discussion that allows us to bring any questions we have, plus
anybody else that would want to make their comments as input, and then at that
point I would guess that we would either vote in the affirmative or we would
hear enough that either Beth or Karen or somebody would say, now that we have
heard this, we need to step back a minute, if anything was that dramatic.

Is that an acceptable process? We get something out in advance of April; we
set time aside for the committee to raise any questions or comments that they
have; and then we proceed with our goal of approving this in April, unless we
recognize something that makes us call a timeout and come back in our hearings
in July and so on. Is that a fair assessment of a process to use?

MS. HALLEY: We can work on that version of the report, we would be happy to
do that.

I just want to make one clarification. The preliminary report was presented
to this group in phase one. So the information that is on this slide about the
different scenarios was really to address several of the issues that this
committee brought forward to the work group. I did want to just clarify that.

MR. REYNOLDS: Good. Is the committee satisfied, and are you as the owners
of this satisfied?

DR. HEETDERKS: I think that sounds very appropriate. Going back to what
Steve mentioned, within the federal government the different agencies, there is
consensus. But certainly if we go to the broader community, this would have a
big enough impact that there will be a spectrum of opinions, and it is
appropriate to hear those.

MR. REYNOLDS: One other quick question I have before we close this. Does
any of this work relate to the whole attachments process that is going on with
HIPAA? Karen is moving towards her mike to answer that.

MS. GREENBERG: It could be in the future.

MS. TRUDEL: It could be in the future, but recall again that the HIPAA
claims attachment is an industry-wide standard, and the CHI standards are
federal only.

So again, as you talk about the tipping point of CHI, a CHI recognition of
DICOM could tip the construct of a future claims attachment, for instance, for
radiology or whatever.

MR. REYNOLDS: Right, because in the end, if you think of electronic health
records, which may be based philosophically on a lot of the CHI work, and then
you think of the administrative transactions, we want to pull that data
together, so you have to keep some synergy at least in the discussion. Is that
fair? Very good. Thank you. Now you can proceed with the rest of your update,
now that we understand our mission and have a process to actually carry it
out. Will you note this for our April hearings, since Marie is not here? Thank
you.

PARTICIPANT: So noted.

MS. HALLEY: The next point we wanted to bring forward to this committee was
requested by Mike Fitzmaurice of AHRQ. It was about revising some information
that has been in previously adopted reports.

In going through this process, we realized there needed to be a way of
versioning our reports. Of the reports that are listed below, there are eight
in the first group. By the way, the revisions are very minor content-wise, but
they are very significant to AHRQ. I just want to make the point that the
first eight reports listed here, anatomy, physiology, diagnosis, the non-lab
interventions and procedures, the lab results content, the nursing,
medications for clinical drugs and medications active ingredients, medical
supplies and devices, all just add the word AHRQ to the reports, recognizing
their support, both financial and otherwise, of those efforts.

The last two reports mentioned here, drug classification and structured
product labeling, had a request to update some dates which were in the
reports, which we have done.

We have not copied all of the reports and given you all ten of these
reports. It would be hundreds and hundreds of pages. But I did want to go
through where we put the changes in on these slides here.

Each of these reports, if they are approved by this committee, will become
version 1.2 of the CHI reports. The reasoning in the CHI work group was that
if the content of a report changed but not the actual standard adopted, then
we would just increase the versioning: 1.1, 1.2, 1.3. If the actual standard
for some reason changed, we would completely revise the report to maybe a
version two or a version three. That might be indicative of the — I believe
there were seven reports that were brought to your attention, but were not
adopted, the standards were not adopted. In their continuing work, if a
different standard were recommended, we would change that versioning.

So I just want to mention that to you when you look at the actual report,
it would be version 1.2.

These are the reports that have had changes requested. For the anatomy and
physiology, diagnosis, non-lab, lab result and nursing reports, AHRQ has
requested the following sentence be added. It refers, in the executive
summary, to the acquisition of SNOMED. The current sentence reads, SNOMED CT
will be
available from the National Library of Medicine, NLM’s, unified medical
language system, the UMLS Metathesaurus, at no charge to anyone in the United
States who agrees to the license. AHRQ has asked to add the following sentence:
This no-charge feature has been supported by HHS and, in parenthesis, NLM, NIH,
OD, CDC, ASPE, AHRQ, CMS, FDA, IHS, SAMHSA, HRSA, DoD and VA. Again, the intent
there is just to get recognition for the support given to this process.

In the following reports, the word AHRQ has been added to these sections:
In medications clinical drugs, under ownership, the sentence reads, as
steward, NLM works in close collaboration with other agencies. In the
parenthetical list of examples, AHRQ was not listed; only the VA and FDA were
listed, and AHRQ has asked to be added to that list.

In the same report, clinical drugs, on page five the sentence reads, RxNorm
is a public domain system developed by NLM in conjunction with the VA and FDA,
and in consultation with HL7. AHRQ has asked us to put in “and
AHRQ”, which was their wording. Again, we are just adding the words
“and AHRQ”, the Agency for Healthcare Research and Quality.

In the active ingredients report, under ownership in the executive summary,
the sentence reads, the recommended standards are in the public domain and are
administered by the FDA, and the phrase, and supported by AHRQ, has been
requested to be added. In the medical devices and supplies report, under
ownership in the executive summary, the sentence reads, the GMDN is managed
and its content maintained by the international maintenance agency with
significant FDA representation, and they have asked that, and AHRQ support, be
added to that sentence.

Yes?

DR. STEINDEL: A couple of questions there. The first one is, what is
version 1.1? These were going to be versioned as 1.2. I would think they would
be 1.1.

PARTICIPANT: Could you speak into the microphone? We are not picking you
up.

DR. STEINDEL: The question I asked was, what was version 1.1. My thought
was that these were going to be version 1.1, not 1.2 as you stated.

MS. HALLEY: Yes, that is a good point, Steve. We probably want to go back
and rename that to 1.1. That would indicate first revision.

DR. STEINDEL: The second was, under medications, active ingredients, I
thought we were going to add to that, and supported by AHRQ and maintained by
the National Cancer Institute.

MS. HALLEY: Which report was that?

DR. STEINDEL: Medication and active ingredient. I think we should add the
comment to this committee that all of these changes were approved by the
individual work group chairs and by CHI.

MS. HALLEY: Thank you, Steve. I will get that wording if that is
appropriate, and thank you for that clarification. We did bring all of these
changes to the CHI work group, and prior to that to all of the co-leads that
ran each of these individual work groups.

The last two changes were updates to some dates that were in the report.
AHRQ felt that these things had already been addressed, and we should remove
the following two phrases. Under drug classification, remove the phrase, will
be a near future release of the UMLS planned for July ’04, since that is now a
past date. And a similar comment on the structured product labeling,
recommended we remove the outdated wording of, currently being balloted through
HL7, January 04, update April 04, ballot passed.

MS. AULD: Could you go back to the previous slide? For the report for
medications, clinical drugs recommendation on page five, I believe it should
say NLM in conjunction with the VA and FDA and funding from AHRQ, rather than
—

MS. HALLEY: In development?

MS. AULD: Yes, because AHRQ helped with the funding of that.

MS. HALLEY: Any other comments on the wording changes that have been
requested? We can make some of these additional changes. I will get them back
to Mike Fitzmaurice, and if it is appropriate at your April meeting, perhaps we
could —

MR. REYNOLDS: It appears, unless somebody else on the committee feels
differently, most of these changes are technical type changes. They are not
changing the substance, they are referencing more appropriate comments.

MS. GREENBERG: I guess I would question whether these would even need to
come before the National Committee, other than informational at most.

MR. REYNOLDS: Karen, you are agreeing? I want to make sure Jeff understands
who is doing what in this conversation?

MR. BLAIR: Nodding heads have to be vocalized, please.

MR. REYNOLDS: And Steve is also affirming that.

MS. HALLEY: I will take that as a motion to take back to the CHI work group
that if some content changes in the reports but none of the substance, we
will not need to bring them forward to you all.

MR. BLAIR: Beth, you can consider that this was done in the spirit of
administrative simplification.

MS. HALLEY: I will take that back, thank you.

MR. REYNOLDS: And the appropriate heads nodding on the committee. We have
got about ten more minutes.

MS. HALLEY: Then we will quickly go through our updates. The allergy work
group has been working very, very diligently. This is a complex area. I know I
did an update for you in September, so I’ll just mention some of the activities
since September, since I think we have been through most of this already.

There are two very complex areas with allergies that we have been dealing
with since September, but I do want to mention that I think we are very close
to bringing you a recommendation, which I think is exciting for all the folks
that have been working so hard on this.

The two complex areas are drug classification, which Randy made a comment
on, and I think some of the enhancements to the SPL, and the ability to
identify chemical classifications for the drugs within the NDF-RT and point
them through the FDA SRS, are really going to help with this, and the
committee I think recognizes that. That has been one of our stumbling blocks:
how to classify a drug when a patient says, I am allergic to sulfa drugs, as
opposed to Bactrim, and how we codify those classifications.

The other part that has been complex for us to deal with is the patient
reported as opposed to the clinically verified allergy. So we hear some
interesting comments, like patients will come in and say I am allergic to my
spouse and things like that, and that is going to be very hard to capture
through whatever coding system we come up with.

But right now what we are doing is, we are coming up with a use case to
break the patient reported allergies out, and how best to address that in this
system.

Aside from that, I do want to show you — and I know it is probably hard
for you to see this, and we certainly will present this larger to you in April
when I hope we will be coming forward with a preliminary report. Essentially
this is a graphic that Captain Hess from the FDA put together with some
ingenuity and some support from our members, from CMS and the VA and some of
our other folks. It is a graphic to summarize the 30-some standards we
have looked at for allergies.

I think it really is a nice depiction, and helps to show what the group is
trying to get — the point they are trying to get to. Around the edge you see
HL7 2.0 and above is the framework we are working in. Then within that, this
work group as we talked about before is taking a little bit deeper approach to
coming up with a standard, and they have looked at some of the data fields
within HL7 that are going to require information. This is a depiction of the
different types of allergies, the first being food, coming initially from the
LanguaL system, through the FDA SRS, being assigned a unique code, and that
would be
the allergen code for the foods. It goes through the four other allergen types.

The drug and the biologics, the device and the environmental toxins will
all eventually have a unique code assigned to them. I think the one thing
outside of the FDA SRS, which Randy made a comment on, is that the EPA has
also been actively involved, and the FDA SRS and the EPA SRS are looking at a
process to combine the similar substances related to allergens. The EPA felt
they should maintain control of those, but the FDA SRS will also hold those
allergens as well.

The last point is about the allergen class, which has been a really complex
area to work through. I think we are getting there.

I know we only have about five minutes, so I am going to move on, if that
is appropriate.

MR. REYNOLDS: That includes questions.

MS. HALLEY: The disability work group is doing some really exciting things.
I want to thank this group, because it was some of your recommendations of
things to look at that this group is considering.

As Marjorie mentioned earlier, the ICF and the SNOMED are two camps within
disability. They address different terminology needs. That ability to map those
into a single source, hopefully to SNOMED CT, is really the goal. Until that
happens, the group is also looking at a couple of other ways to address
disability issues.

One is the ability to actually codify the types of questions that are being
asked on the different types of forms that are being used, whether they are
clinical forms like the MDS, the OASIS and the IRF-PAI, or whether they are
the workers' comp forms from DOL and the Social Security Administration forms.
So we are taking a little bit different look at it. Instead of looking just at
the terminology needs as output, we are potentially codifying the input.

We are going to have a presentation next week from the LOINC folks, and two
weeks after that we are going to have a presentation on the HL7 CDA and how
to potentially combine that end of the standard.

I just want to mention, we have been working on an implementation guideline
template with the work group. It is in draft form. One of the pieces to that is
contract language that all the federal agencies can use. We have been getting
requests from different federal agencies, so we are putting together some
suggested contract language for putting CHI into acquisition wording and RFPs.

In terms of collaboration, we talked to you before about USHIK being the
repository for the SNOMED standards. We are looking at possibly using the
caDSR as a front-end portal to USHIK, and also at having the NIST health care
standards landscape list the CHI standards. We are working with USHIK and NIST
to ensure that they both link to each other.

I just want to mention, in terms of the Office of the National Coordinator,
I just want to refresh your memory. When CHI moved over to FHA, it came under
the Office of Interoperability and Standards within Dr. Brailer's office. That
is headed up by Dr. Loonsk.

We have been working with the HITSP group, which is one of the contractors
for standards from a national perspective. We did a CHI presentation to that
group on Tuesday, and we are also looking at ensuring that the standards that
come out of HITSP and the NHIN will hopefully leverage the work that CHI has
been doing for the last several years.

That is really it. In terms of next steps, we are looking forward to your
endorsement of the recommendation. We will be continuing the work group efforts
for allergy and disability. The supplies and medical devices area is coming
up; they would like us to re-establish that work group, because they are
coming up with a unique identifier that combines several unique identifiers.
They feel that once that is decided, it will be an easy decision to come to
you with a
recommendation. And we will be continuing our efforts with the Office of the
National Coordinator.

MR. REYNOLDS: I have one quick comment, and then Jeff. This is our only
break of the morning, so we will take it, because we also need to get our next
DSMO presentation done. A number of people are calling in at 11:45, and as we
found with Judy yesterday, that gets a little squirrelly if we are not ready
on time for them.

I want to go back to the changes we talked about earlier, the technical
changes, we termed them. When those are made, do we need some confirmation from
either Karen or Marjorie, or just directly from Beth, that there weren’t any
substantive changes? And then we say fine? Is that how we close that in April?
I think we need some affirmative —

MS. GREENBERG: I was just thinking it could be an informational thing, we
made these changes.

PARTICIPANT: Could you talk into the microphone, please? You’re not picking
up.

MS. GREENBERG: These are really minor, so you could leave it to their
discretion, I guess. The whole process involves all the federal agencies, so if
nobody seemed to feel these were very substantive, then that probably is good
enough. But I was suggesting they could probably provide you something in
writing.

MR. REYNOLDS: That is all I am looking for. I would just like some stamp
that says, we heard you, we went back and looked at it, we didn’t make any
substantial changes, so that if somebody came back later and tried to find
some documentation, there would be closure, rather than just saying, go on and
take care of it.

MS. GREENBERG: Right, there is a paper trail that they have informed you.
If for some reason they inform you and you say, this looks more substantive to
me than they think, then you can always open it up. But I don’t think you need
letters and all of that.

MR. REYNOLDS: Fine. Just so we can have something prior to April, we’ll be
good. Jeff, you had a comment?

MR. BLAIR: When NCVHS was going through its process of identifying
standards, where we went beyond the HIPAA standards to clinical message format
standards and clinical terminologies which are now part of CHI, we really had
tremendous hopes that the CHI standards would — that the early adoption, and
this was the phrase that we had used way back when, that if the federal
government was an early adopter of these standards, that that would wind up
being a catalyst for the private sector to move forward with the adoption of
these standards as well.

The federal government has done its job. It has, through the CHI process,
adopted the CHI standards and terminologies.

I am concerned, because the things that I have been hearing are that the
private sector has been holding back from adopting these because they feel that
the government hasn’t asked them to do so. So when you were winding up
testifying and doing the presentation to the harmonization group, the ANSI
HISPP group, you wound up using the word hopeful. I remember the directive to
that group was that they begin with CHI standards as a foundation.

I would like to get feedback. Maybe in our April session when we revisit
the issues of approving the multi-media standards, if at that time there are
any insights you could give us, whether there are constraints, impediments or
barriers to private sector adoption of these, it would be helpful for us to
understand that, so that we could move forward on the full vision of what we
had intended some time back.

I think that is my only comment. If you could identify the impediments to
private sector adoption of CHI standards as you know them, I think that would
be helpful for us to know.

MR. REYNOLDS: If you will do that, then I would like to close this portion
of the session. Thank you very much to both of you. I hope we met your
expectations. Some of us are new at this. We appreciate it. We look forward to
seeing and hearing from you in April. So thank you very much.

MS. HALLEY: Thank you, appreciate it. Thank you.

MR. REYNOLDS: We will get back together at 10:45.

(Brief recess.)

MR. REYNOLDS: I would like to welcome Todd Omundson. He is going to be
giving us our annual report from the DSMO. Welcome.

Agenda Item: DSMO Annual Report

MR. OMUNDSON: Good morning. My name is Todd Omundson. I am the Associate
Director of the Health Data Management Group at the American Hospital
Association in Chicago. I am the current chair of the Designated Standards and
Maintenance Organization Steering Committee, and I represent the NUBC on this
committee. I am here today on behalf of the DSMO to deliver our 2005 annual
report.

I’d like to start off with a little background on the DSMO. On August 17,
2000 the Secretary of Health and Human Services named six entities as
designated standards and maintenance organizations, sometimes referred to as
DSMOs. They work together on the maintenance and development of the HIPAA
administrative simplification transaction standards.

These six organizations are comprised of three standards development
organizations or SDOs and three data content committees. The SDOs are the
Accredited Standards Committee X12, Health Level 7 and the National Council
for Prescription Drug Programs. The three data content committees are the
Dental
Content Committee, the National Uniform Billing Committee and the National
Uniform Claim Committee. A steering committee was formed comprised of one
voting member from each of the six organizations, plus a non-voting liaison
from HHS, specifically the Office of E-Health Standards and Services, and
currently our liaison is Gladys Wheeler, who is in the audience today.

The steering committee convenes at least monthly in order to arrive at a
consensus on all requested changes to the HIPAA standard transaction submitted
through the DSMO change request system. We only look at changes that actually
go through our website, that are submitted directly to us.

The designated standards and maintenance organizations have continued to
follow a routine working schedule since the previous report dated November
2004. Due to the timing of the annual report to NCVHS, the reporting periods
have varied in length. Accordingly, the monthly averages in the table on page
three of the report are indicative of changes in volume over time. You can see
that the volume has decreased substantially. You can see that in May 2003 to
June 2003 we had a very heavy volume; that was all the work leading up to the
10/23/03 HIPAA commencement date. So this report covers 11 months.

During this period, the DSMO received only nine change requests, which is
substantially down even from last year, when there were 35 change requests. The
monthly volume of submitted DSMO change requests dropped from 4.2 per month to
1.5. The number of change requests completing the DSMO process dropped from 2.2
to less than one per month. This represents a 64 percent decrease in both
categories, which is comparable to last year’s dropoff. It should also be noted
that the DSMO denied one appeal this year.

One of the main reasons for the decline is that change requests are being
submitted directly to the SDOs rather than through the DSMO change request
system. The SDOs have indicated that they are tracking modifications to show
DSMO change requests versus SDO-initiated changes.
The SDOs also produced a change log appendix containing all the changes
incorporated into a revised implementation specification. With each guide,
there is an appendix in the back that lists all the changes from the previous
guide.

Presently all of the approved DSMO change requests are slated for
implementation in the next version of the standard. Other changes not yet
reviewed by the DSMO will need to be evaluated when the updated HIPAA
implementation guides are brought forward as replacements for an existing
standard implementation guide. Obviously, substantive changes will require an
analysis of costs and benefits associated with the adoption of these changes.

Let me give you an example of one such change. The most recent
institutional claim guide contains a new data element to indicate whether a
particular diagnosis was present on admission or not. The indicator was added
in response to a comment received during the DSMO comment period. That comment
period was about a year ago, and the comment, submitted by me I believe, asked
to add this data element to the guide.

Now, the change has not gone through the DSMO change request system, but we
will certainly evaluate this item when making a recommendation to adopt the
next HIPAA guide.

In fact, right now I know that X12 anyway is putting together their list of
what they call major changes to the guide. Right now it is 14 pages long. They
are doing this along with the cost-benefit analysis they are working on. So
they have got the change, and right now they have the benefits derived from
the change. This will be on that list.

The appendix to our report contains the details on all change requests
completed by the DSMO review process. The table on page four is a comparative
summary of the change request by category of disposition. The DSMO originally
established eight broad categories lettered A through H. We don’t use A anymore
because that was during the fast track period where a change got immediately
put into the standard. Category I was added in the last year to account for
recommendations for adoption of an updated HIPAA standard implementation
guide, and category J, for out of scope, was added this year.

An example of an out of scope change request would be a request for changes
to transactions not named in HIPAA. This category was added on a prospective
basis so as not to overstate the disapprovals going forward, that is, to keep
category D, which denotes no change, pure.

The appendix to our report also contains a complete list of these
categories and their definitions, as well as a guide to reading a DSMO change
request, and the actual requests sorted by category. You can see that this
year we had two that were out of scope; of the nine we received, two were out
of scope, so we deliberated on seven requests. They were very routine requests
this year, nothing really substantial.

The DSMO engaged in several other activities in the past year. One
undertaking related to the new version of the 835 electronic remittance. Last
year the DSMO received a change request seeking our approval of the adoption
of a newer version of the existing HIPAA electronic remittance advice standard
implementation guide, known as the ASC X12N 835. We supported the request
based on a qualitative review by the industry. During the summer, WEDI
conducted a formal cost-benefit survey related to adopting the 4050 version of
the 835 as a successor to the 4010 version.

In our letter of December 8, 2005 the DSMO steering committee encouraged
NCVHS to proceed with a recommendation to HHS to begin an NPRM naming version
4050 as the next HIPAA standard. We felt that the notice should also include a
request for comment on whether an even newer version, the 5010, should be
adopted instead of the 4050 version. We noted that because the 5010 does not
represent a substantive change from the 4050, and that public comments indicate
that the 5010 would be preferable, the final rule should go on to adopt version
5010 of the 835.

In 2004 the DSMO played an important role in providing recommendations for
changes to the adopted standard implementation guides as part of an emergency
maintenance change process. The SDO members of the DSMO continued to work with
HHS to evaluate redundant review and approval processes. This year the focus
has been on ways to dovetail the SDO and federal NPRM comment periods in order
to shorten the cycle time for newer versions of an existing standard. This area
is still under discussion among the SDOs.

Looking ahead, the DSMO will continue its efforts to develop a more
predictable, manageable and efficient change process. We are anticipating the
issuance of an NPRM that is intended to solicit comments on an expedited
process for updating the HIPAA standard implementation guides. Recall that
since HIPAA one, there have not been any modifications or any maintenance to
the standards. The 4010 version is about eight years old, dating from 1998, I
believe. It
is our understanding that this NPRM will put forward a definition of
maintenance under HIPAA, and will also include details on a proposed process to
handle emergency change requests.

My understanding is that we are expecting that some items, maybe those
considered maintenance, will not necessarily have to go through an NPRM
process, therefore expediting any kind of change. However, the DSMO is ready to
make any necessary changes to our change review process to meet the
requirements of the final regulation.

New versions of several HIPAA transactions are currently entering the
cost-benefit analysis phase in preparation for submission to the DSMO for
consideration as updated versions for existing standard implementation guides.
The SDOs and WEDI are collaborating to refine the methodology piloted in the
835 survey mentioned earlier.

We expect the DSMO activity to intensify in the coming months as newer
versions of the existing HIPAA standards are brought forward.

In closing, this report reflects both the completed and ongoing efforts
which will be the subject of reports at future NCVHS hearings. I know the
April meeting and streamlining were mentioned earlier this morning. The DSMO,
as a collaborative organization, continues to demonstrate
its ability to manage both the business and technical perspectives of the
transaction standards, as well as emergency change and modification maintenance
processes. The DSMO is well positioned to assist the NCVHS and HHS in
recommending changes to the HIPAA-adopted standards or new HIPAA standards not
yet adopted.

Thank you for the opportunity to present our report on behalf of the DSMO.

MR. REYNOLDS: Thank you, Todd. Questions from the committee? Jeff, did you
have a comment?

MR. BLAIR: Are you requesting at this time that NCVHS approve the 5010?

MR. OMUNDSON: All 5010s, no. They have to come through the DSMO. The only
thing we have asked is for NCVHS to write a letter to HHS on the 4050 version
of the 835. I don't know whether that letter has been written yet; it has not.
What was our choice? We wanted to keep the process going. We can't say no,
stop, so we said go ahead and keep going, even though there might be a newer
version that might be more suitable. Keep this in mind and keep the wheels
turning, put together an NPRM, and in the NPRM say, there is an even newer
version, 5010, and seek comments as to whether that would be acceptable.

But for right now, that is the only thing that the DSMO has actually
recommended to go forward. There are a lot of things right now as far as all
the claim guides. There are basically three HIPAA claim guides that are at 5010
completed right now. They are working on the cost-benefit analysis for that,
which I think will take several months, and may be submitted to the DSMO
perhaps by the end of the year. So there has not been a request made to us yet.

MR. REYNOLDS: I would assume, as a matter of process, that we would need to
hold a hearing on the recommendation, so that when we wrote the letter we
would have more than just input from the DSMO.

MR. OMUNDSON: The 835, we wrote a letter last December, and it was read at
the meeting last December. We formally recommended that the —

MS. GREENBERG: At the subcommittee meeting?

MR. OMUNDSON: The subcommittee meeting.

MS. GREENBERG: In December?

MR. OMUNDSON: I’m not sure which one it was. December 8. The letter was
written to Dr. Cohn.

MS. GREENBERG: It hasn’t been discussed by the subcommittee, I don’t think.

MR. REYNOLDS: No, we have not discussed it.

MR. OMUNDSON: It was directly to Simon, and it probably was a full meeting.

MS. GREENBERG: The full committee met in November, and it hasn’t met until
now. So it must have been at the subcommittee meeting.

MR. OMUNDSON: It was at the subcommittee meeting, right, but there was also
X12 testimony on an expedited process that was given that day. They had some
pretty aggressive ideas for moving things forward.

It has been frustrating to everybody involved that it has been such a slow
process, feeling our way through it. I don’t think the DSMO — it doesn’t hold
it up at all. As part of the process it takes anywhere from 90 to 120 days to
get a change request through, depending on the time of the month it is
submitted, but I think we are very flexible and very adaptable, and I think we
showed that in 2002-2003 when we got 150 of them through, working hundreds of
hours to get the first modification to the original HIPAA guide, the so-called
addendum.

MS. GREENBERG: December 8 was when the subcommittee met. Can you clarify
why the DSMO came forward with this one request?

MR. OMUNDSON: Mainly because at the last annual report, it was up in the
air. We had gone through it and recommended that it go forward, but then it
came to our attention that before an NPRM could be written, there has to be an
impact analysis done. We spent several months trying to figure out how to do
something like that.

So we put it forward — not really contingent on the cost-benefit, but we
said that we would come back and testify on the cost-benefit. We didn't
complete that until the end of the summer, and then wrote the letter to say go
ahead formally with the 4050, because it was getting late. It kind of dragged
on for a while, and by the time we were done with all this, or WEDI was done
with it, the 5010 was in existence.

So we felt our way through this all last summer, and started testing the
system: what happens when somebody recommends a new version, how does that go?
We submitted it in December, we looked at the qualitative aspect. It looked
good to us, go ahead with it, and we said we need some cost-benefit on this.
Then we did that, and that is why it came in so late in the year.

MR. REYNOLDS: I’m not sure I am concerned with when it came in. I want to
make sure that we adjudicate appropriately what the next steps are to do that.
We have a recommendation, and those four letters, NPRM, are a big deal as far
as the process goes.

The other thing is, I believe that one of the charges of the committee is
to make sure that we have full and complete evaluation by a number of entities,
not just a single one that brings it in. So I am asking members of the
committee along with Karen and Maria and Marjorie to — I would assume that we
would have to have some kind of open discussion, and then if we wrote a letter
to the Secretary we would be able to say that we received a letter from the
DSMOs, we have heard from others that are affected, here is what it appears.

I will harken back to e-prescribing, where we heard a number of
recommendations on e-prescribing. We had pretty much the entire industry in
here, nodding their heads and walking up to the mike and saying we are in. That
is a significantly different letter to the Secretary than just a letter from a
single governing body that says we should move to this. I think it is our
responsibility to get other input, isn’t it?

Marjorie, you have a troubled look on your face.

MS. GREENBERG: The DSMOs have not come forward with a recommendation on the
whole suite of 5010s.

MR. OMUNDSON: That’s correct, just one transaction.

MS. GREENBERG: I am just wondering if it is appropriate to do it
transaction by — one at a time. I don’t think so. I don’t think there is any
intention to do an NPRM for each transaction.

Karen, can you shed any light here? As I understand it, the 5010 guides
have been approved now by X12?

MR. OMUNDSON: Yes, right.

MS. GREENBERG: They then need to take those to the DSMOs?

MR. OMUNDSON: When they are ready. They are not ready yet, because the
cost-benefit analysis impact study has to come with it to the DSMO. It didn’t
happen last time. We did it afterwards. It has got to come with it and be all
tied in a bow and say, here, DSMO, take a look at it.

MS. GREENBERG: Then the next step once the DSMO approves it would be to
bring it to the National Committee?

MR. OMUNDSON: That’s right.

MS. GREENBERG: So the question is whether —

MR. OMUNDSON: They don’t necessarily have to be brought forward as a suite,
not necessarily. With 4050, I think we would have liked to see the 5010 version
of the 835 go forward, but I don’t think it was really ready to go. In fact, you
probably can do a cost-benefit study on that in the interim, between when the
NPRM comes out for the 4050 and when the 5010 is ready.

MR. REYNOLDS: Karen, you have a comment, and then Steve.

MS. TRUDEL: It’s not a comment, it is a question. At what point in time do
you anticipate that the package with the bow on it will arrive at the National
Committee?

MR. OMUNDSON: Well, 90 days after we get it. Our process, the DSMO process,
is a 90-day process.

MS. TRUDEL: Right, I understand that, but do you have any sense from X12 as
to how far along —

MR. OMUNDSON: We are working feverishly on it. I am very much involved with
the institutional guide, and I know they are working on it, every conference
call, which is twice a month. They already have 14 pages’ worth of so-called
major changes and the benefits associated with the major changes. So I would
imagine hopefully by the summer, at their June meeting, and then we might have
it as early as July, and it would still be reported on at the end of the year.

MS. TRUDEL: I’m not trying to press you, but I know Marjorie is always
thinking ahead about schedules, because the committee doesn’t meet constantly.

MR. OMUNDSON: Right. I think it also depends on whether the NPRM would come
out this summer or what have you. There could be an NPRM coming out in advance.
You’ve got to recommend it first.

MR. REYNOLDS: Let me let Steve make a comment, then I’ve got a comment I
want to make.

DR. STEINDEL: Todd, I would like to see an expeditious presentation of 5010
to this committee by the industry, and you can take that as either the DSMOs or
otherwise, primarily because I think we are all aware of the Congressional
actions that are going on that specifically state that 5010 is the version we
want to shoot for.

MR. OMUNDSON: That’s right.

DR. STEINDEL: It may just come into play without any real full review. It
would be nice if this committee had a chance to comment to the Secretary on the
appropriateness of 5010. We might be able to do that without a full impact
analysis, just maybe the bare bones of one or something. But I think it
behooves us to attack this as expeditiously as possible in that area.

MR. OMUNDSON: The key feature in the 5010 claim guides is the ICD-10
qualifier. The admission indicator is also in that 5010 guide, but there
is some talk that CMS wants to use that indicator as early as next year.

DR. STEINDEL: Todd, if I can comment, I think the key difference is
bypassing 4050.

MR. OMUNDSON: We have.

DR. STEINDEL: That is what I think needs to be put in front.

MR. OMUNDSON: 4050 has been bypassed, not necessarily on the 835, but
nobody expects any 4050 to go forward as the next standard. They have been
shooting for 5010 as their hopeful best earliest version.

So you would like to see some sort of expeditious review of this, this
year, obviously.

MR. REYNOLDS: We are not deciding anything. Let me make a comment. We have
a subject that we have been trying to take a look at. We talked about whether
or not we should have it in April. We heard some testimony in the last
subcommittee about streamlining the process. I am sensing a bit of frustration
from you on what the process is and how fast the process goes, and can you pull
a piece out, and can you jump to 5010. That is coming across clearly. It is
also coming across in lots of arenas for discussion, whether it is right or
wrong. It is coming out as a discussion point.

But I think our charge, and you all can help me around the table, our
charge is to hear these concerns and make sure that we have the appropriate
open hearings, so that yes, there is frustration, but as these things come in
from individual entities, the subject is discussed in the appropriate light,
both from the government process, the federal process, as well as whether
pulling a piece out is good or bad is something that can be acted upon, and
exactly what happens.

So on the one hand, I’m sure people that are talking about really hurrying
would think that is stonewalling, but I think we do have a responsibility to
put things in open forum. So Simon has the letter. We heard discussion
in September about streamlining the process. We have talked about whether we
need to add that. Maybe that should be the key focus of our July hearing, that
could include 5010, that could include an entire discussion of 5010 so that it
is not just the 835 piece.

MR. OMUNDSON: I also read some proposed legislation that basically says
that the SDO can recommend directly to the Secretary to adopt a version. It has
been written.

MR. REYNOLDS: We understand that. We know there are people on the Hill
trying to make something — but I guess my point is, regardless of whether
those things pass, our responsibility as a committee is to adjudicate these in
open forum with all players having the right input. Everybody knows what is
going on, so if it comes out in legislation that we are out of the loop or
somebody else is out of the loop, or somebody can do something, that still
doesn’t change our charter unless the Secretary then hands it to us and says,
this has been recommended now, you put it into open forum to see if it is going
to fly.

That is all I’m trying to do. We have a responsibility as a committee. Some
people think that adds to the process in some things and some people think it
detracts from the process. That is immaterial at the moment. We have a charter,
we have a responsibility, and we will adjudicate based on that until I think
somebody tells us that we have a different charter and focus.

So one, thank you for your report. Second, I think we hear you loud and
clear on the fact that 5010 is where you think it ought to go. If there is a
process after discussion, because we would have to have the same discussion on
the 835 as we would on everything for 5010 in an open environment. I think we
need to go ahead and make sure that we get that on our schedule for this year,
based on what we have to do. We are finalizing some significant things in the
April time frame, so this may be something that we do. I’m just throwing this
open to the committee for discussion, but I don’t see any other way to
adjudicate the matter now, or to either add to your frustration or alleviate
your frustration, because I think we have got to go through this process as our
charter.

Marjorie?

MS. GREENBERG: Obviously there is a letter that has been sent to the
subcommittee, so the subcommittee has to decide either to defer it until you
get another letter for all of the 5010 transactions, or to obtain information
that convinces you that you should take testimony on this one letter. We need
some process for making a decision on what you are going to do with this
letter.

MR. REYNOLDS: But I think playing off of either Jeff or Steve’s comment,
there is no question that the 5010 is a subject that we need to learn soon. We
need to understand the differences from lots of viewpoints between 4010 and
5010. So I think letter or not, that is a good thing for us to do.

Then as part of that, once we hear that, then I would be in a much better
position to make a statement as to whether a single transaction because of what
we heard is a good thing to do, or you do them all together. So I am trying to
set up a process that allows us to stay within our jurisdiction and allows us
to hear clearly from segments of the industry that want to do things certain
ways. I think that may blend both of them, and I want to hear from the whole
committee whether that is something we could go ahead and schedule in our July
time frame, to at least have the 5010 discussion in
open forum, make sure that we have representatives from the DSMOs,
representatives from X12, and representatives from anybody else that wants to
come in, industry wise or anything else, that wants to discuss what that does
or doesn’t mean. That gives us a significant education, and then allow us to
move forward in our adjudication of what we need to do.

So I would like to hear either any affirmative or negative responses to
that from the committee so that we could go ahead and basically let Maria and
Donna start working on a portion of our July hearing to do that. Comments?
Excuse me, I didn’t mean to exclude staff; staff or committee members.

MR. BLAIR: Affirmative.

MR. REYNOLDS: Everybody appears to be shaking their head yes. So that would
be our approach again, based on our structure, based on our charter, based on
what we have to do. If something happens on Capitol Hill that removes us from
the process or puts us in the process in a different way, then we will adjust
then.

MR. OMUNDSON: Do you want to have some hearings in April and again in July?

MR. REYNOLDS: No, we have hearings in April. What I am saying is, we had
already set subjects that were coming in. This is not maybe a 45-minute
session, what you are asking us to look at in 5010, with the number of people
that need to do it. So we could give you an hour and then just nod our heads
and say we will see you again in July, and I don’t think we would give you
anything, versus making sure that the subject has a block of time in July to
have something out.

MR. OMUNDSON: No, it is not a 45-minute — we have spent hours and hours
and hours rehashing and going over what could —

MS. GREENBERG: So you are thinking by the end of July though, which I think
is the next time the subcommittee is scheduled to meet, the last week in July,
that there would be an analysis of what all the changes are?

MR. OMUNDSON: For then. I don’t know how much we will be able to look at
it. Once it is wrapped in a bow, we could probably turn that around pretty
quickly. We have in the past. We don’t necessarily need the full 90 days.
Ninety days is the cycle that we are on. As far as routine stuff, the volume
is very low; we are looking at less than one a month.

Basically all our meetings have been this maintenance process anyway. We
could probably fast — I know they don’t like that term anymore, we could
probably fast-track that part of it, get something ready as soon as possible to
go forward. That has to be done. So you would encourage X12 to finish their
task, and for us to get it as soon as we can and then start working on it, and
by July we would have a good feel for things that we pretty much disagreed
with.

We have been following this as it goes along. There is not that much, I
don’t think. But it is new. That is the main thing. It is a change.

MS. GREENBERG: If X12, DSMOs, whatever, could see this late July as a
target date, —

MR. REYNOLDS: In other words, we are going ahead and giving you a date;
that is when we would like to hold the hearings on this, focused on the
5010. At the same time, Karen, if you or others could help us as a committee,
either in comments in April or in July, as to whether or not based on what is
going on, single situations going forward, can work as a process, that would be
helpful. That would just be helpful input to us, because we have got to — in
all the processes that we deal with, we have got to make sure that we keep it
all intact so that we do what we have been asked to do.

MR. BLAIR: Harry, one other thing that might be helpful is, if there is any
clarification we could have at that time as to whether an NPRM is necessary,
that might help, too, because if it is not, then we could move faster.

MR. REYNOLDS: Karen, do you have any comment on that now? No? Okay.

MR. OMUNDSON: Would you like to see a recommendation coming from us in
July?

MR. REYNOLDS: Yes. In other words, the purpose of the July discussion would
be, you have already got a recommendation on the table, whatever other
recommendations you would have. At the same time, we would first want a primer
on 5010 rather than just going right to it. For example, right now on the table
is a letter that says go to 5010; help us understand for this suite of
transactions what the difference is between 4010 and 5010. That puts us in a
perspective to then listen, because then you start looking at the depth of the
change that the industry is going to go through, not just that a standards
organization is comfortable that this maps to this. Then have everybody else
that would be impacted — you’ve got providers, payors, vendors,
clearinghouses, you’ve got the whole thing now that are going to be affected,
as to what they think about 5010. Then at least we have the entire input for us
to go forward with a recommendation to the Secretary as to this being something
that should or shouldn’t be moved forward at whatever speed. Is that a fair
summary?

So that is what I think we would be planning to do in July. So this is our
notice to the industry: 5010 is going to be the subject of a good portion of
our July hearing, to make sure that we get the education and the discussion
that we need to have across the industry, because this is one of the next big
bellwethers in moving forward in this whole HIPAA environment: do we fly over
4050 to 5010, and what does that mean, and what is the time frame, and what
are the issues and the concerns around the world.

Are there any other questions for Todd based on his report? We understand
that you are stepping down?

MR. OMUNDSON: Steve is going to delay this so I would still be a lame duck
until April.

DR. STEINDEL: I want to know, if we put this hearing off until July, does
that mean that you still —

MR. OMUNDSON: No. My last official duty is giving the annual report. Even
though we have got our next chair, that person hasn’t been doing a thing. I
have been doing it up until today.

MS. GREENBERG: The new chair?

MR. OMUNDSON: Jean Narcisi is the new chair.

MR. REYNOLDS: In the spirit of continuity, we hope you prepare her well for
July.

MR. OMUNDSON: Actually, we have a special call set up for next Tuesday for
an NCVHS debriefing. They all want to know what’s up.

MR. REYNOLDS: Thank you very much, we appreciate it. On our panel, we have
some people that are going to call in. Maria, can you tell me who was going to
call in, and let’s validate whether they have called in or not. Would the
others that are in the room, please go ahead and move to the front?

So what we can do, we are starting 15 minutes early. Everybody wants to
stay in order, so we have the first two presenters here. One is Doug Bell,
M.D., from the RAND Corporation, and the second is Mike Bordelon. What we would
like to do is, since we have a restricted period of time, we would love for you
to give us the highlights of what you are doing in about ten minutes each, the
highlights. Remember, we are interested in what is being found.

The other thing, you don’t have to educate this committee on e-prescribing.
We tend to have a good sense of what that is. So just help us with —

MS. FRIEDMAN: You may want to hear a little more from Bob Elson, simply
because the award for this project was just made. For the record, there are now
five e-prescribing pilots being funded by CMS, and all of them went through the
RFA process with AHRQ. These were the top five scores by the review panel. For
technical reasons we weren’t able to award them all at once.

MR. REYNOLDS: So first what we would like to do is congratulate you. We
spend a lot of time as a committee on e-prescribing, so we are glad to see it
moving. So we congratulate you for winning and stepping forward and being the
players.

We need to get a good overview of what is going on, especially if you focus
on any impediments you run into, any surprises you have had, or anything you
would specifically like to make sure the committee is aware of. We will
continue to be a part of this process on an ongoing basis.

So Doug, we’ll start with you.

Agenda Item: Briefing by MMA E-Prescribing Pilots

DR. BELL: I’ll lead off. I’d like to thank the committee. It is an honor to
be here. I testified here two years ago, and it has been a very rapid process
since then. On the federal side you guys have worked very hard and moved this
along, and we are honored to be part of this.

We at RAND are leading a coalition in New Jersey. These slides, by the
way, are different from what you have; I have updated them. The slides you have
are what I presented at our kickoff meeting, but it has been a month since then
almost, and that is one-twelfth of our time. We are actually almost
two-twelfths into our year, so this is a very rapid process that we are trying
to follow here.

MR. REYNOLDS: And we assume that what you tell us won’t be good tomorrow.

DR. BELL: Well, hopefully it will just be extended tomorrow, but yes, that
is the half life.

Here are our partners in the coalition. Horizon Blue Cross Blue Shield of
New Jersey is our anchor in providing our prescriber population. They have an
e-prescribing program that is getting good adoption, so we have the privilege
of going into a situation where e-prescribing is rolling out and is being very
successful.

The two primary vendors in that program are iScribe and the Allscripts
TouchWorks product. We also in our coalition have the two intermediaries, RxHub
and SureScripts. UMDNJ is leading the site visit component on the ground there
in New Jersey. We have Point-of-Care Partners; Tony Schueth is a consultant
with us, and he has really been instrumental. Then we are coordinating things
out of RAND.

I wanted to show you the initial standards. I don’t have to tell you what
these are. These are basically the 11 or so standards that were in the RFA
minus the foundation standards that wound up getting approved afterwards.

Two of these are in production among the systems implemented in New Jersey.
Those are the NCPDP formulary and benefit standard and the medication history
function. The fill status function of course is another standard that is
completed, but it is not in use within the coalition, so that demands a
different kind of evaluation that I will get into more. Then three standards
are still under development, so the plans for evaluating those also have to be
different.

I wanted to review the overall goals, because as we go forward it is
important to make sure that we keep the end goals firmly in mind. Taking right
out of the RFA, this I thought was the most important sentence about what
e-prescribing is trying to do. It is to deliver information to the point of
care that enables more informed decisions about appropriate and cost effective
medication. So this we try to always keep in mind as our guiding goal for
e-prescribing. We are trying to evaluate how well these standards help meet
that goal.

Also, in the pilot we want to always keep in mind that we are trying to
provide evidence that enables this committee as much as anything to make
well-justified policy decisions about each standard. We break that down into
how would the standard improve the prescribing conditions, and then how could
the standard be improved if it is suboptimal to deliver better information. So
those are our high-level research questions.

To really get at that, we wanted to explicitly lay out a conceptual model
that gets us all the way from a standard, which is a very technical, low-level
thing — that’s not the right word, low-level, because it is the foundation for
a lot of things to build on it. We want to show that the structure of the
standard basically enables some specific information display or capture at the
provider, and that in turn enables changes in work process, which then
eventually produces changes in drug use, which is the main focus of what is in
the MMA: is drug use more appropriate, more cost effective? We just added in
here that we would like patients to adhere more to their drug regimens as
well, assuming that it is a beneficial medication.

But e-prescribing also has other effects that could influence its adoption:
effects on labor and other costs for the physician, the pharmacy and the other
parties involved in these transactions, and on the total use of health
services. So there may be tradeoffs between medication use and
hospitalizations, especially hospitalizations, and then patient satisfaction
could be affected. All of these things are also important to look at. So we
tried to put everything into one model there.

I put all of our different methods on one slide, so there is a lot here,
but I just wanted to give an overview, and then I’ll go into more depth, time
permitting, on each one of these. But for all the initial standards, we are
going to do work flow modeling to look at how that standard specifically would
impact work flow, and then to try to project to the extent we can, to the
extent we have any information, how it would ultimately influence the outcomes
that I showed you.

We also have a technical expert panel that is going to evaluate each
standard, and I’ll show you more about that.

Then for medication history and formulary benefit, this is what is really
being rolled out in New Jersey. We are doing prescriber site visits before and
after e-prescribing is implemented. We are also going to do some pharmacy site
visits. We are going to look at claims data and some patient satisfaction data
before and after e-prescribing. We will be able to take the dates that
e-prescribing was turned on and look at the practice level. Then we are going
to close with a prescriber survey toward the end of the year.

Then specifically for fill status, we are going to focus on a storyboard
prototype. I’ll talk more about that. For prior authorization however, we are
going to try to build a working prototype and look at how prescribers actually
use it. Then for RxNorm and SIG we are going to focus on lab evaluations.

So here is where we stand right now with our work flow process model. This
is just showing you the diagram, but we have quantitative estimates under this
that allow you to sum up how much labor is involved overall in a physician’s
office, for instance, for different roles.

This shows one page. We have it in five modules now. In our application we
had a working Excel version. This is in Excel, and it is shareable. I won’t go
into this anymore, but we made quite a few cycles of revision on this, and we
are continuing to revise based on input from our expert panel.

For the site visits we are also charging ahead toward getting this in the
field, because the e-prescribing systems are rolling out on their own time line
within the Horizon program, and we have to get out there and get baseline data
before all these sites are — we have to find some sites that haven’t
implemented e-prescribing yet so we can get our pre-measurements.

We are taking a purposive sample of six sites that are signed up for
iScribe implementation and six sites that are signed up for Allscripts. In
fact, today our field researchers are doing their pilot run at UMDNJ, so we are
really charging ahead with this. We have developed preliminary instruments, and
we have preliminary IRB approval from both RAND and UMDNJ, so we are getting
very close to going into the field with this.

The data we are collecting consist of qualitative interviews with physicians and
office staff. We are doing activity logs focused especially on telephone calls,
and then we are also doing some direct observation of physician activities,
although we decided we can’t actually go into the patient room. We would like
to be able to time how much time patients spend especially on things like
talking about adherence, which could be a major impact of these systems, but we
just couldn’t get that IRB approved in time, and essentially we decided to cut
that. But we are going to observe physician activities at a grosser level.

We are also going to do some pharmacy site visits, at least two pharmacies
that are taking orders from one of our higher volume e-prescribers. This will
be a little later in the year. This is not going to be before and after,
because we don’t think the volume of e-prescriptions in any pharmacy is going
to be enough to have a really big impact on their work flow. But we are going
to do qualitative interviews, activity logs, and some direct observation of how
long things are taking.

Now, we have a technical expert panel recruited. Our final list is 15
panelists, and they represent point of care software vendors. We have content
providers on the panel, mostly to help us with RxNorm. We have both of the
larger intermediaries, and we also have NDC Health in there. Then we have five
pharmacies represented.

We decided that the information we are trying to get from this panel,
issues of usability and the individual fields within each of the standards, is
knowledge that is usually spread across these companies, so we are going to
need multiple people to participate from a given company. So really it is the
company participating and not the individual
people on this panel. That makes it rather unique compared with other RAND
panels, but we are going to see how that goes, and hopefully we will get —
everyone is actually very excited about this, and I think we will get excellent
participation.

We are asking them for narrative answers to open-ended questions as well as
ratings, things like usability. So we are getting ready to launch that process
as well. We will go one standard at a time and go through that process.

At the end of the year we are going to do a prescriber survey, which is
going to be measuring the perceptions of how much information is provided by
the standards that they have been exposed to and how much it is enabling the
outcomes of interest here, informed decision making and cost effectiveness, and
the efficiency within their office.

Then we are also looking at secondary data before and after e-prescribing
gets turned on. We will be able to look at claims data from Caremark, looking
at the top 25 drug-drug interactions and whether that rate has changed from
before to after e-prescribing is turned on, and looking at the Beers criteria
for elderly patients.

We are going to look at, to the extent we can, errors of omission, where
maybe the only thing that is going to work is asthma. We can look at errors of
omission just from medication claims, so we are going to do that.

To try to look at other adverse outcomes, we will look at ED visits for
medication sensitive conditions using Horizon’s data and hospital admissions.
We will be able to link that back with Caremark’s data. They have done some
preliminary analysis there. Then we will look at formulary adherence. This may
be more of a long shot of whether it is really going to work, but we will look
at CAHPS survey results, which is a patient satisfaction survey that Horizon
has to run every year, to see if we can see a change at the provider office
level from before to after e-prescribing has been turned on. But we will have
to go
back historically at least a couple of years to have a hope of doing that.

Looking at some of the specific standards that are a little different — do
I have more time?

MR. REYNOLDS: If you would move briskly, that would be good.

DR. BELL: For fill status we are going to do a storyboard prototype;
Allscripts is our primary collaborator here. It looks at what it would look
like to the physician if you had the true fill status transaction versus just
having adherence information from the medication history transaction. It may
be sufficient to do a lot of adherence monitoring.

DR. WHITE: I don’t know who just got on then, but I’ve been on for awhile.
This is John White from AHRQ.

DR. ROTHSCHILD: This is John Rothschild. I just came on.

DR. LAPANE: And this is Kate Lapane.

MR. REYNOLDS: We are going through Doug Bell’s presentation right now, and
then we will be moving through the rest of them in order.

DR. BELL: I think we got an early start.

MR. REYNOLDS: Yes, we actually started early.

DR. BELL: I am just talking about what we are going to do for fill status.
We are going to create these storyboard-level prototypes that aren’t really
built out. Then we will do focus group evaluations with physicians to look at
their perceptions of whether this is acceptable, given all the concerns with
fill status that have been outlined by NCPDP, including whether the excess work
is acceptable to physicians. There might even be liability implications; there
is some concern that just knowing fill status could add to physicians’
liability for patient care. Then there is whether the patient privacy issues
seem acceptable to them. So that is fill status.

For prior authorization, we are going to do some initial work just looking
at whether the different forms available from plans can be represented using
278 alone, versus also needing the 275 piece. Then we are going to build a
working prototype and do a small pilot to see how much physicians are willing
to use it. Then of course we will look at the work flow implications, but that
may be mostly projections, based on what data we get.

For RxNorm, we want to look at the completeness for representing a sample
of actual SCRIPT prescriptions, formulary benefit transactions, medication
history transactions, and prior authorization transactions, which are all
standards that RxNorm could interact with.

Then we want to do a lab evaluation that tries to look at what was
recommended by NCPDP for evaluating RxNorm, which was to take both ends of a
scrip transaction, add an RxNorm to it on the provider side, and then in the
pharmacy add another RxNorm using whatever algorithm the pharmacy develops, and
then compare those two to look for mismatches and the reasons for mismatches.
So we are going to try to do that just based on historical data, not in a live
environment, at least initially.
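
The dual-coding comparison described above can be sketched roughly as
follows. This is only an illustrative outline of the NCPDP-recommended
evaluation, not the pilot's actual code; the field names and the reason
labels are assumptions made for the example.

```python
# Sketch of the RxNorm dual-coding evaluation: each historical prescription
# is assigned an RxNorm concept (RxCUI) independently on the provider side
# and on the pharmacy side, and the two codings are compared to find
# mismatches and tally the reasons for them.
from collections import Counter

def compare_codings(transactions):
    """transactions: dicts with 'provider_rxcui' and 'pharmacy_rxcui' keys."""
    mismatches = []
    reasons = Counter()
    for tx in transactions:
        if tx["provider_rxcui"] != tx["pharmacy_rxcui"]:
            mismatches.append(tx)
            # Reason labels are illustrative placeholders, not pilot categories.
            reasons[tx.get("reason", "unclassified")] += 1
    match_rate = 1 - len(mismatches) / len(transactions) if transactions else 0.0
    return match_rate, reasons

sample = [
    {"provider_rxcui": "197361", "pharmacy_rxcui": "197361"},
    {"provider_rxcui": "197361", "pharmacy_rxcui": "198440",
     "reason": "brand vs generic concept"},
]
rate, why = compare_codings(sample)
```

Run over a batch of historical transactions, the match rate and the reason
tally are exactly the mismatch analysis the testimony describes.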

That is the primary plan for RxNorm; then structured and codified Sig is
next.

MR. REYNOLDS: Doug, two minutes.

DR. BELL: Okay, that was my last slide. So the plan for structured and
codified Sig is similar. We will try to take the vendor side, use the vendor
system to attach a structured and codified Sig onto an existing scrip
transaction, and then on the pharmacy side try to do the same thing, and then
compare the two and see if we are getting the same answers.

MR. REYNOLDS: Good. We are going to hold questions to the end, since we
have so many, and we would like to get a theme. So next, Mike Bordelon from
Achieve Healthcare.

MR. BORDELON: I’m going to apologize in advance. I’m going to switch
presentations, too, same problem Dr. Bell had. We are moving fast and we’ve got
a new version that is more relevant.

My name is Mike Bordelon. I am the Vice President of Research and
Development for Achieve HealthCare Technologies. Achieve is a small software
company in the long term care space. We deliver full end-to-end web-based
software for nursing homes to run all their clinical and financial
operations.

The Achieve team got together after the RFA came out and decided that we
wanted to try to bring together a long term care pilot study. This was a real
challenge, because none of the infrastructure needed to do any of this stuff
exists. The standards are there, but a lot of the standards don’t even apply in
long term care. So we had to take a completely from-scratch perspective and say
can we in 12 months go from nothing — nothing at the pharmacy, nothing at the
nursing home, nothing in the doctor’s hands, and in some cases nothing at the
payor — to a completely operational system.

I am very proud of our team. I think we have a very energetic and very
forward-thinking visionary team that is attacking this problem very
aggressively. I think that we have a plan to deliver in long term care a set of
solutions that are going to demonstrate the capabilities of e-prescribing here.

Long term care is a different animal. I think you have all probably heard
that. It seems like in the last year it has been said a lot that long term care
is different, but a lot of people don't know why. One of the things that we are
going to show this year is what those differences are, in a very concrete way.

The impact of Part D is dramatic in long term care, moving all of these
Medicaid residents to dual-eligibles. Now they have all these different copays
and formularies to worry about. We are having to worry about all of that and
figure out how to operationalize that in long term care, the perfect business
case for leveraging e-prescribing. I think that we are going to show that there
is huge value here for the long term care environment.

We are going to study all the nuances of e-prescribing in long term care.
The basic study, the way it is structured is, we have two treatment facilities
in Minnesota. One is a very high acuity, high Part A, high turnover facility,
one is a more rural, traditional higher dual-eligible population, so I think we
are going to get a really interesting perspective across those two different
facilities. We also have two comparison facilities that are going to be doing
completely traditional paper-based prescribing, and it will give us a nice
baseline to compare against.

What is interesting is, as we are trying to do the baseline studies, things
are changing a lot right now just because of Part D. So this is a moving target
on top of a moving target, so we are really having to be careful about
normalizing our data and trying to make sure that we are getting a real picture
of what actual impacts the e-prescribing is going to make.

Our original plan was to have a three-phase implementation for this study.
That is changing slightly, and I am going to get into that. We are probably
still going to do three phases, but a lot of it depends on PA, prior
authorization, and I will talk about what some of our plans are, and some of
our strategies around PA.

Let me just jump in. I talked about who Achieve is. The nursing home chain
that we are working with is Benedictine Health. They are a nonprofit Catholic
organization. They have about 40 skilled nursing facilities in the Minnesota
area.

One thing I want to make clear is, this is specifically focused on skilled
nursing right now. We are not doing assisted living, home health, independent
living, anything like that at this point.

One of the nice things about this pilot, and one of the things we had to do
to control the scope and risk, was to find a partner that could bring a
pharmacy into the mix where we could control what was happening at the
pharmacy.
Fortunately, Benedictine has their own pharmacy, Preferred Choice, so we
were able to also partner with their pharmacy software vendor, which is RNA
Software. RNA is having to build all of their end of this equation as well.

What is interesting about long term care e-prescribing is, you pretty well
control both ends of the equation. The nursing home has a relationship with the
pharmacy. So in this context we are not dealing with a bunch of different
pharmacies. It is the one pharmacy, the one nursing home, and every single
scrip, whether or not the doctors are using the system, is going to go through
this system. So all scrips during the pilot, we will do e-prescribing, even if
we have a 30 percent doctor adoption. It is kind of an interesting twist on the
whole adoption issue that you usually hear about in e-prescribing circles.

We are using RxHub for the router and for getting formulary benefits
information. We brought in an additional payor to try to extend our payor
coverage through Prime Therapeutics and Blue Cross Blue Shield of Minnesota. We
are also working with the state of Minnesota to include their formularies as
well, because on the dual-eligibles you have to have the Part D plan, but also
the state formularies.

This is a fun slide. This is from some NCPDP work. We are not going to go
into the details of this, because I only have a couple of minutes, but the key
takeaway from this slide is, it really illustrates how complex just writing a
new prescription is in long term care. This is just a new order. This isn’t
discontinuing an order or changing an order or a substitution or any of that
kind of stuff. It is a really tough process. The nursing home is stuck right in
the middle of there, trying to administer this whole process and make sure that
the doctors and the pharmacies and everybody are all synched up and working
together.

What e-prescribing does in a lot of ways in long term care is, it really
takes that load off of the nurse. The big benefit for the industry here is that
the nurse is now on the floor, giving care to the residents, not doing faxing
and handwriting and tracking stuff.

I’m not going to get into all this, but we are doing a ton of work in NCPDP
to try to model all these processes. We have a long term care work group just
trying to figure out how should long term care e-prescribing work.

Some of the things that we all say are really important aspects of
e-prescribing, some of those conventions, are not necessarily always true in
long term care. For instance, these people have been in a nursing home for
awhile, or they are coming from a known source like a hospital, and we know the
med history. So the med history transactions aren't as valuable as they are in,
let's say, a doctor's office or more traditional setting.

We have had to make a lot of changes to SCRIPT to make this work. We just
submitted a new DERF to NCPDP that is going to go through the process in March.

There is no concept of a renewal in long term care, but there is this
concept of a refill. The nurse is at her med cart, she is running out of pills,
and she needs to tell the pharmacy, send more pills. It is not a renewal, it is
not a new order, it is just, keep sending more pills. So there are some unique
processes that don’t even exist elsewhere.
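
The testimony notes there is as yet no standard for this long term care
"refill" (more pills against an existing order, with no new prescriber
authorization). Purely as an illustration of how such a resupply message
differs from a renewal or a new order, a minimal sketch might look like
this; every field name here is hypothetical, invented for the example.

```python
# Hypothetical resupply request for long term care: unlike a renewal, it
# references an existing, still-active order and asks the pharmacy to keep
# dispensing against it. No such NCPDP message existed at the time.
from dataclasses import dataclass

@dataclass
class ResupplyRequest:
    facility_id: str
    resident_id: str
    order_id: str    # the existing active order at the pharmacy
    drug_ndc: str
    quantity: int    # how many more doses the med cart needs

    def to_message(self) -> dict:
        # A renewal would request new prescriber authorization; a resupply
        # carries no prescriber action at all, just the existing order id.
        return {
            "message_type": "RESUPPLY",
            "order_id": self.order_id,
            "drug_ndc": self.drug_ndc,
            "quantity": self.quantity,
            "facility_id": self.facility_id,
            "resident_id": self.resident_id,
        }

req = ResupplyRequest("BHS-01", "R1234", "ORD-789", "00093-0058-01", 30)
msg = req.to_message()
```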

There are hardly any connected pharmacies at this point. Then of course we
have to account for the Medicaid formularies, too. So it is just a lot of
complexity.

I said we were doing this in three phases. What we are really doing is a
baseline study now, and two phases after that. The baseline study is starting
at this point. We are going to do two cycles to go through and try to get our baseline
set of data. The first one is a test to just see if we can even gather the
data. We are working with the research team out of Winona University in
Minnesota, and we have got two statisticians, four research assistants, a
primary researcher, that kind of thing.

Then in May we are going to be delivering phase one. That is going to be
doing new and cancelled scrips, formulary benefits. We are going to be doing
basic patient safety DUR checks using First Databank’s database. And we are
going to be testing out a refill request tool. There is no standard for that,
but we want to test the need for a standard around that, so we can start that
development working inside of NCPDP.

Phase two will start later in the summer. We are going to start doing fill
status and the change transactions back from the pharmacy. Because we have
control of both ends of the equation in the software, we are going to be able
to do real fill status back to the nursing home. They want to know what that
status is, so that is going to be a real value. The change transaction is very
important, because the nursing home needs to know what is actually being
dispensed, so these are very important transactions, and they are hardly in
live use anywhere. So again, just more infrastructure from scratch.

Probably the biggest question mark right now is how do we get prior
authorization in. We want to do a real prior authorization test this year. But
again, this is another one of these. There is no infrastructure. We have a set
of standards that we want to build to, but there is a ton of work flow that
needs to happen at the doctors and the nursing home end, and a bunch of stuff
at the payor end. So we are getting together actually next week and starting to
work through how we nail down some of the work flow and the technical side,
solve that problem, and get something that we can at least demonstrate this
year in a real live environment. So it may not be the end-all be-all, but it is
going to be at least a first step that we can all point to and say, that is
something that kind of works, here are some iterations we need to make past
that, and then we are going to have a PA tool that will really work.

Because of a lack of formulary benefits coverage, we have been really
worried about this, so we are extending our formulary database using
MetaMedia’s Infoscan database so we can get closer to 100 percent formulary
coverage in the test. We talked about the refill, patient safety checks.

We are trying to do all electronic signatures in this pilot, but that is
going to be a pretty big stretch, and that is not mainstream in long term care
right now. We think we can get there.

One of the biggest stretch goals that we want to do is, we want to try to
drop an EMAR in on top of this pilot. We are keeping that low key until today.
An electronic med administration record. What typically happens in long term
care is, you have a stack of pages on top of the med cart. People are manually
signing off when the meds are administered. It is basically making that
electronic. When we do this, we will have a complete end to end electronic
system, and the patient safety and quality potentially can go through the roof
if we can do this, but it is really a stretch goal.

Then we talked about prior authorization and getting the full capability
with attachments.

Just a note. I put this in my status report to CMS, but we are working on
trying to find additional funding to do the PA work. The funding that we have
right now would be just enough to get to a 278 transaction. We really want to
take it way past that, but it is going to take additional funding. We are
working with ASCP, the American Society of Consultant Pharmacists, to help us
find that additional funding.

We hit the research. The big thing is, just like most of the rest of the
groups, we are focusing on functionality: does it work, what are the cost
benefits, have we improved the quality of care, and have we improved patient
safety. We have a bunch of things that we are studying behind each of those to
try to determine that.

Just to wrap up, some of our challenges that we are facing right now.
Aggressive time lines. I think you are going to hear that from everybody. There
is a ton of code getting built and tested right now. The implementation is
going to be hard and fast, lots of training, lots of work to just pull all this
together.

I have been worried about payor participation, but I think that our
approach with Infoscan is going to help us through that, and that will at least
shore us up for the term of the pilot, hopefully longer term than that.

We don't have a huge facility population. We are doing this in two
facilities and one pharmacy. I think we are going to be able to draw a lot of
conclusions from these two facilities. I think they are very representative of
the mainstream, but this is a non-random, not very broad sample, and so that is
just part of the study.

Then we talked about prior authorization and the use of SCRIPT.

MR. REYNOLDS: Nice job. We will go to the next presenters, Jeffrey
Rothschild and Gail Fournier.

DR. ROTHSCHILD: Do you have the slides as we go through this?

MR. REYNOLDS: They are putting it up right now.

DR. ROTHSCHILD: This is Jeff Rothschild. I am at Brigham and Women’s
Hospital, and Gail Fournier came on the line a few minutes ago. Are you there,
Gail?

DR. FOURNIER: Yes, I’m here.

DR. ROTHSCHILD: Gail is from MA-SHARE and CSC. We are going to be a tag
team. I am going to start and then hand off to Gail, and then she will hand
back to me.

Ours is called electronic prescribing using the community utility, which is
also known as the e-prescribing Gateway.

MR. REYNOLDS: I don’t know if you were on when we started this session, but
if you would keep your comments to under 15 minutes.

DR. ROTHSCHILD: It will be under ten.

MR. REYNOLDS: Okay, go.

DR. ROTHSCHILD: The second slide shows the research team, in part. There
are a lot of people involved. This is a complicated project. As everyone has
said, the time line is aggressive, so we are trying to get a lot of person
power here.

At the Brigham and Partners Health Care, which is the umbrella organization
for the Brigham and where our information system is located, there is a variety
of people, as you see. Then there is MA-SHARE and CSC, which is Gail's group,
and she will talk about that in a little while. John Halamka runs the MA-SHARE
project, and is also the chief information officer at Harvard Medical School
and, for us more importantly, at Beth Israel Deaconess Medical Center, which is
across the street from the Brigham. I will talk about that site in a second. We
are also working with partners at Blue Cross Blue Shield of Massachusetts,
RxHub, and Ken Whittemore and his group at SureScripts.

There are three study aims. The first is to test the interoperability of
the standards, which Gail is going to talk about shortly. The second is the
clinical side: to study and compare e-prescribing to CPOE without electronic
transmission with regard to safety, quality and efficiency. The third is the
business processes, which include provider efficiency and work flow. I will
talk about aim two a little more shortly.

The study sites and our collaborators. MA-SHARE is part of NIHAN, which
John Halamka also has a leadership role in, and CSC has been providing
consulting for MA-SHARE for several years now and is going to be our partner
working on this project for the Rx Gateway.

We chose to use Beth Israel Deaconess Medical Center, because they have an
existing ambulatory CPOE system. We felt that with a very aggressive time line,
it was probably going to be impossible to introduce order entry into a study
site and get that properly done within the time line of doing this study. So we
chose a site that has an existing order entry system. It is called Web OMR.

The one challenge that that gives us is that we are going to start with a
lower baseline of medication errors. The ambulatory order entry system already
takes care of handwriting issues. It has some decision support such as DDIs, so
we are going to start it at a different threshold than if we were comparing
e-prescribing to handwritten scrips.

At the Brigham and at Partners Health Care, what we are going to provide
is the clinical assessment. We also have an existing adverse drug event monitor
that we have used successfully here in our ambulatory order entry system, and
we are going to translate it over to the BI system. That is going to make for a
much more efficient way to find adverse drug events, which can be very, very
labor intensive as far as review of medical records in the ambulatory setting.

The next slide is eRx Gateway, and I am going to hand it off to Gail at
this point.

MS. FOURNIER: Gail Fournier, and I am an employee of CSC Consulting, but I
am the program manager for MA-SHARE’s initiative to deploy technology
infrastructure for clinical data exchange and e-prescribing.

We currently operate in Eastern Massachusetts a highly successful exchange
called NIHAN. It has been in place for seven or eight years, exchanging
administrative data among 49 different payer, provider and vendor
organizations. In this project, in this pilot, we are extending that
infrastructure quite significantly to support the e-prescribing infrastructure
that will be needed to implement a similar type of community exchange in
Eastern Massachusetts.

We expect that this infrastructure utility will in the longer term support
providers, payors and vendors. Through the NIHAN experience we have learned the
lesson that interoperability needs to be highly standards based, so we are
going to extend that characteristic as well into the eRx Gateway that we are
developing now to support the pilot and Eastern Massachusetts operations.

We expect that the adopters of the utility will include providers, payors
and vendors operating in Massachusetts. We have lots of background on how we
think the quality of patient care will be improved, and the complexity and cost
of implementation reduced, for the organizations that participate in our
network. We are also in discussion with some of the EHR vendors who are
operating here to basically understand and define whether and how they will
integrate with this gateway.

One of the premises that we have for the MA-SHARE program is that we make
available all of our work products to others free of charge as we go.

So in terms of the next slide, I think I am on slide six now, the Aim I
study design objectives are really dual: to test the interoperability of the
standards as the RFA requires, and to test the vocabularies and code sets the
RFA requires. In general terms, as we test interoperability, we expect to do
most of this comprehensive, intensive testing in a lab environment, in
conjunction with the partners that Jeff described earlier. We are really going
after four dimensions here. Accuracy: is the data accurate, is the right data
getting to the right place, and does it look right when it gets there?
Completeness: are all of the functions of e-prescribing supported in the
context of these standards, and if not, what is missing? Coherence: how do the
standards interoperate with each other, do they lack coherence in any cases,
and if so what are the remedies to get around that? And usability. By this, we
mean a somewhat different interpretation of usability than what people are
probably used to hearing.

What we believe CMS and AHRQ want to accomplish here is to be able to give
others confident instructions on how to implement these standards. So we expect
to start measuring the ease of implementation of all of the standards from the
very beginning, during the design phase and during the development, testing and
implementation. We will be studying the how of implementing these standards and
making recommendations around that.

In terms of vocabulary and code sets, fairly basic here. What we want to be
sure of is that the vocabularies and code sets that are used in the entire
process convey the meaning that they are intended to along the way.

The next slide is slide seven. I think I covered these in the previous
slide: the metrics that we expect to measure and study in the testing, and then
the types of things that we expect to report on in terms of how the standards
get implemented during the key phases of the study, design, development,
testing and operation.

I think with that, we are on to slide eight, and this is where I hand it
back to Jeff.

DR. ROTHSCHILD: Thanks, Gail. For Aims II and III, which are the clinical
and business parts of the study, we are going to be doing a pre and post trial
as well as a randomized controlled design. We are going to have six sites.
Three of them will be control sites. We will look at all six in a baseline
period, then we will divide the six into a control side and an intervention
side, and three of the six sites will convert to e-prescribing.
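
The site allocation just described can be sketched in a few lines. This is
only a minimal illustration of the design, assuming a simple seeded shuffle;
the site names and the randomization procedure are placeholders, not the
study's actual method.

```python
# Sketch of the six-site design: all sites are measured at baseline, then
# randomized so half convert to e-prescribing (intervention) and half stay
# on CPOE without electronic transmission (control).
import random

def allocate_sites(sites, seed=0):
    rng = random.Random(seed)  # fixed seed so the allocation is reproducible
    shuffled = sites[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"intervention": shuffled[:half], "control": shuffled[half:]}

arms = allocate_sites(["site1", "site2", "site3", "site4", "site5", "site6"])
```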

The outcomes that we are going to be looking at, the primary outcome of
interest is dispensing errors. We will at the same time look at other
medication errors and see if there are any injuries from medication errors,
otherwise known as preventable adverse drug events.

At the same time we are going to look at formulary compliance, medication
reconciliation history, and prescribing and dispensing efficiency, that is,
whether there is a reduction in time between when the order is entered in the
system and when the patient gets their medication or it is available at the
pharmacy.

Aim III is the business side. We are going to be looking into work flow
studies and time motion studies in office practices, both baseline and
intervention, and to quantify the impact of e-prescribing on prescription
related phone calls and work load, and translate that into office cost.

The last slide. As far as case finding, as I mentioned briefly, we have a
computerized adverse drug event monitor which we are going to translate to the
BI Deaconess Web OMR order entry system. We will be doing chart reviews. We are
going to compare the prescription to be dispensed to the original medication
prescription that was entered into CPOE.

Then as far as the office efficiency and work load, we are going to be
doing both direct observation and have some activity logs maintained by the
staff, and as I mentioned, some time and motion techniques.

And we’re done.

MR. REYNOLDS: Great, you were people of your word. Thank you very much for
the time. We are going to hold questions until we have heard from the next two
sets of presenters. If you would please stay on the line for questions, that
would be great.

Next, Ken Whittemore and Kate Lapane. We appreciate all of you showing
interoperability by using one laptop. That is a good step.

MR. WHITTEMORE: Thank you, sir, and good morning. Ken Whittemore with
SureScripts, and I have Kate Lapane from Brown University, who has joined by
telephone. I am going to take the first half of the presentation, and Kate is
going to take care of the second half, because as I think is clear from the
start, I am not a researcher.

As we were evaluating the RFA and the opportunity there, we really weren’t
sure initially that this was something that made sense for our organization to
get involved with. We had already been contacted by a number of the folks who
put in applications pursuant to the RFA, and we were honored to be asked to
participate in quite a few of them. As it turns out, we are supporting three of
the other four successful awardees for the grant.

Again, initially we weren’t quite sure that this was a role that it made
sense for us to play in a primary way, but over time a number of our various
partners and through discussions internally, we decided that there was
something that we could probably contribute to the process. So that prompted us
to move forward with our application to the RFA.

Basically, what we saw that as being was the fact that we have extensive
reach around the country. We have a diversity of different players in the
e-prescribing space that we already have connections with and that we are
already doing business with, and we thought that we could probably leverage
that and make our application and our pilot a little bit different.

What we plan to focus on is evaluating how the initial MMA e-prescribing
standards work, or parenthetically don't work, in a variety of practice
settings across a number of geographic areas, using a wide variety of
technologies. In addition to that, we also want to gather information that we
might not detect if we were just linking up with one set of providers on the
physician side and pharmacy side; that is, to assess how the variability in
different implementations on both the prescriber and the pharmacy side
influences e-prescribing adoption, and perhaps to develop some best practices,
both on the physician and pharmacy side, through the research we will be doing.

So to decide how exactly we were going to tackle things, we thought it best
to look at our book of business to determine in which areas we had the most
significant volumes of e-prescriptions being transmitted. We arrived at that
time at five states: Florida, Massachusetts, New Jersey, Nevada and Tennessee.
Following that — actually, before I move from this slide I should
mention Rhode Island. Rhode Island wasn’t initially part of our application. We
knew that there was a group of folks in Rhode Island who were planning on
putting in an application, and we thought it was best to stay away from Rhode
Island for that reason. But for reasons I’ll mention in a bit, after we
received the award, we decided that it would make sense to add Rhode Island. So
we are actually going to be doing this process in six states.

After we determined the locales, we took a look at the various physician
side vendors that were doing the most significant volumes in those locales. We
approached quite a number of them, and quite a few of them reported back to us
that they would be interested in being a part of the RFA pilots. The ones that
followed up on it and through all the discussions decided that they could
support us are the ones that are listed on this slide, those being Allscripts,
DrFirst, Gold Standard, InstantDx, MedPlus and ZixCorp.

On the pharmacy side — and I think everybody on the committee knows that
that is where we come from, the pharmacy side, we had a large number of folks
report that they would be interested in participating in the pilot. Obviously
we only needed the pharmacies who had a presence in the locales that we had
already decided upon, but that pretty much translates to the list that you see
here: Ahold, which is Giant and Stop & Shop, Albertson’s, which now has
been carved up, Brooks, CVS, Duane Reade, Longs, Rite Aid, Walgreens and
Walmart.

Because of the brief amount of time that was available to us to put
together our application, we really weren’t able to reach out to folks on the
independent community pharmacy side and line them up to be part of the process.
But now that we are one of the pilots that was awarded, and because of the fact
that one of our owners is the National Community Pharmacists Association which
represents independent pharmacies, we are going to try to do what we can to
incorporate them into the pilots in some way. We have already had discussions
with our researchers about how to accomplish that.

We also have a number of other participants. We needed to bring some folks
in on the payor side, so we have Aetna, Blue Cross Blue Shield of
Massachusetts, the Walgreens Health Initiative, and then there are a couple of
other groups that are along for the ride, if you will. Partners in Care is a
very large physician group in New Jersey, and NaviMedix is a physician vendor.

With respect to who is actually going to be doing the research while I
stand on the sidelines and watch them, our primary investigator is Kate Lapane
of Brown University, and she is going to be addressing you in just a minute.
Her colleague, Catherine Dube, also from Brown, is assisting. Kate is the
primary investigator, so she is running the whole show, but both she and
Catherine are primarily focused on what is happening on the physician side of
things.

On the pharmacy side of things, we brought in Michael Rupp, who is a
professor at Midwestern University, Glendale, Arizona. He has done a lot of
work on pharmacy work flow and computerization and the effect of various
electronic messaging between payors and pharmacies, and so he has got a good
appreciation for a lot of the standards that we are talking about, and then one
of his colleagues, Terri Jackson, who is a clinical assistant professor down at
the University of Arizona, also brings a strong background in having studied a
lot of the electronic messaging processes that go basically between the PBMs
and the pharmacies. So that is how the research team breaks out.

At this point, I will turn it over to Kate. Kate, can you see the slides?

DR. LAPANE: No. Are we on slide eight? The cluster site visits?

MR. WHITTEMORE: Yes, we are on that slide.

DR. LAPANE: Great. Hi, everybody. I am happy to be here to talk to you
about our project. It is really ambitious. As you can see by all the players
that SureScripts has brought to the table to work on this pilot, we have got a
lot of work ahead of us.

The approach that we pitched in the application was to perform cluster site
visits, where our goal was to sample in a way that maximizes the variability
across geography, the types of vendors, the community pharmacists, as well as
the experience and volume of e-prescribing that folks have.

Our goal is to provide information regarding the impact of these standards
in the pharmacy arena as well as on physician providers and patients. I'll get
back to what we are going to do with each of those stakeholders in a moment,
but the next slide shows the coordination of the testing of the
interoperability of the standards, as well as how it works with the research
evaluation.

How we are focusing the research is on how the standards are implemented,
and understanding how they have affected work flow practices and the
communication between patients and providers and between the pharmacy and
providers.

The next slide shows SureScripts' oversight. They have a certification
process for the specific standards, and as they get the partners up to speed
with that and pilot test, they arrive at a date where they say everything is
good to go, the right data are getting to the right place in the right way.
Then we want to let a period of time pass so that we are not going in and
measuring chaos. We want to understand how these standards have been
implemented into practice.

So we pitched doing a performance analysis, which on the next slide I’ll
explain a little bit more. Then throughout other phases in the project, we will
follow up with web-based surveys and phone surveys.

When we pitched this, we envisioned a few phases throughout the project as
SureScripts was ready to roll out testing of the standards. Initially in April
we were going to combine medication history, formulary benefits and then work
towards other standards at different parts of the year. We realize that for
some of the standards, we are going to end up in a lab-based environment to
test the interoperability.

Our strategy has been to identify through our physician software vendors —
they are assisting us in the recruitment phase, to identify practices where
they are using e-prescribing on the physician side, so that they will be the
targets of our on-site visits, using the performance analysis strategy.

Again, those will be across six different states. Originally we had wanted
to go in and do a cluster and collect information on pharmacies, working with
the specific providers. We are working on the details regarding how that will
flow, but conceptually that is what we are after, is to try to get the
perspectives from both sides of the SureScripts tunnel, if you will.

Our goal with the baseline data collection will be to conduct site visits,
interviews, and documentation of work flow and business practices. Then again, we will follow up with
changes in practices at certain periods throughout the year as other standards
come out.

A little more about the performance analysis strategy. This type of analysis
is really a mixed-method approach, but the goal is to
capture multiple perspectives on performance, trying to identify potential
problems and opportunities, barriers and facilitators for successful
integration of e-prescribing standards.

We realize that there are the interoperability technical aspects of getting
the technology up to speed, but we appreciate how that technology is actually
implemented. There could be great variability. We have completed
semi-structured interviews with each of the system software vendors, and asked
them how they think their products are being used, what the different
capabilities of the product are, and whether they could identify poster
children, places where, yes, these are the poster children for e-prescribing. They know that
there are places that are doing it better than other places. So we want to go
in and look at the variability, look at across the spectrum how are providers
integrating the software into their systems.

We also through that process were able to realize that there is variability
with respect to how much control users have over aspects of the technology. For
example, with the alerting systems and being able to say I only want level one
alerts to filter through, and I don’t want others. So it is great, because then
we can look at the extent to which providers use that ability to switch the
thresholds, and what that means with respect to the expected gains in
patient safety outcomes.

So anyway, back to the performance analyses. What we have proposed is a
combination of direct observation; we have constructed fallback logs to try to
track the communication and the time required between the practice and the
pharmacy. In addition, semi-structured interviews with office staff
members participating in the e-prescribing activities, as well as documentation
of work flow diagrams, business practices, the roles and the organizational
structure, as well as personnel perspectives on problems, and not only the
problems, but what they think the potential solutions are to optimize the
use of e-prescribing.

MR. REYNOLDS: Kate, can I interrupt you for a second? Are you using a
handset?

MS. LAPANE: Yes.

MR. REYNOLDS: We are having a little trouble. Your voice is going in and
out on us a little bit.

MS. LAPANE: Okay, I’ll speak louder. Part of the site visit will include a
focus group with users as well, to try in an interactive way to figure out what
are some of the solutions if e-prescribing is not being optimally used in this
setting.

Part of this process will allow us to focus on the ease of use, any
ancillary materials that are used and the time required, mistakes and re-do’s,
other activities that have been incorporated when people are using
e-prescribing. The questions range from open-ended to specific, closed-ended
questions that try to capture the experiences of people using the e-prescribing
software. This part of the study will use qualitative data
analysis.

We are also focusing on evaluating factors that could potentially influence
uptake and dropout. In preliminary data, we have seen that how people embrace this
is very different. We have got high volume e-prescribing practices and low
volume, so we want to understand what is going on there. Then our physician
software providers have told us, people have adopted them and dropped out or
switched, and we want to understand what is going on with those aspects.

I don’t have a slide for this, but one other aspect that we will be
focusing on is the patient perspective with respect to how does e-prescribing
influence communication between the patient and provider, does it stimulate
discussions on cost when the formulary benefit information is being exchanged,
what are the convenience factors, as well as concordance or adherence issues.
We have a draft of that survey that we will be sending through AHRQ for
clearance, so we think that that is another powerful aspect of this study as
well.

The data analyses. We have been working with SureScripts to identify —
well, actually to de-identify data sets that will be able to link to the
characteristics of the providers. We also have had preliminary discussions with
each of the software vendors, to ask them conceptually, if you’re thinking
about patient safety, how best can we harness the data that you already have,
de-identified of course, but to understand the full impact of e-prescribing on
patient safety outcomes. We won’t be doing adverse drug events, but many of
them do capture switching after being alerted, switching after a formulary
benefit alert, et cetera.

I will switch it back over to Ken for questions. Or are those at the end?

MR. REYNOLDS: Thank you very much. Now our last presenter will be Dr. Elson
from Ohio and KEYPRO.

MR. ELSON: Thank you. It is a pleasure to be here. Thank you, Maria, for
that succinct explanation of how we slipped in under the radar screen. The only
thing I would add to your explanation is the emotional roller coaster ride part
over the past six or seven weeks.

I suppose it is appropriate that we are going last. We just got word of our
funding officially about a week ago, so we certainly feel last, sort of like
the tortoise and everybody else is the hare. So we have got a lot of catching
up to do. We have got the same back-end hard stop dates, so presumably we will
catch up.

I should mention that the other projects in the spirit of collaboration
have been very helpful over this past week, in terms of willingness to share
some of the hard work they have done over the past six weeks, and hopefully
that will help us hit the ground running, particularly with respect to metrics
and methods, especially around work flow measurements.

I am going to focus today, since this is our introduction to all of you on
who we are and what we have got to work with. That is going to be a little
sparse on actual methods, but that will come soon.

The project participants, the two primary ones, are UPCP and Ohio KEYPRO.
UPCP stands for University Primary and Specialty Care Practices. It is a large
group of small practices in Northeastern Ohio that are owned by university
hospitals, a health system in Cleveland. I’ll talk a little bit more about UPCP
in a moment. Ohio KEYPRO is the Ohio Quality Improvement Organization. They are
about 80 people strong in Ohio. They are part of the Pennsylvania QIO
organization as well, with the KE, for Keystone, in front of the PRO.

InstantDx is the e-prescribing vendor involved. That is important, even
though InstantDx is a relatively small company. They have been a long-time
innovator in the e-prescribing transaction space. They were certainly one of
the first if not the first e-prescribing vendors to be connected to both RxHub
and SureScripts a couple of years ago.

NDCHealth, RxHub and SureScripts are all involved in the transaction side,
although primarily RxHub and SureScripts. NDC, because they represent the
practice management system at UPCP, is involved in some of the outbound
connectivity, and will be involved to some extent in certification testing. But
their primary role in the project is as an interesting source of some
value-added data that we will be able to use for our safety and cost analyses.

QualChoice is a regional health plan with 185,000 members. It happens to be
owned by University Hospitals, which is convenient. It is also convenient that
their prescription benefit manager is Medco, and the operative assumption when
we pulled QualChoice into the coalition was that we would be doing prior authorization
testing with them via Medco. Since Medco is one of AHRQ’s subcontractors, and
we knew that RxHub was going to be right in the middle of much of the prior
authorization work, it was going to make it more convenient for all involved.
That assumption didn’t pan out, however, as we were unable to obtain a
commitment from Medco to participate in the prior authorization testing. My
understanding is that at the time the application went in, Express Scripts and
Caremark were representing a similar position at the time. So we made it fairly
clear that the prior authorization testing in our proposal was at risk. We are
going to be revisiting the issue next week in a meeting at RxHub.

Aetna is not involved in any of the transaction testing, but they represent
about seven percent of the patients seen at UPCP. I didn’t mention, QualChoice
was about 12 percent of the patients seen at UPCP practices. So between the two
of them, we have got almost 20 percent of the patients that are being seen at
those practices. We will have access to claims data on it, which will be very
useful for our cost and safety analysis.

Also, Jeff Taylor, the lead pharmacist at Aetna, who has got substantial
experience in doing impact analyses of e-prescribing on formulary compliance,
has agreed to consult on the project.

We also have the Medical Group Management Association, their center for
research and the division of health services research at the University of
Minnesota involved in the project, and a bit more about them at the end.

UPCP represents 73 practices covering 42 communities and involving 284
physicians in Northeastern Ohio. For those of you unfamiliar with the
geography, that is Lake Erie up on the top. Here is the Rock and Roll Hall of
Fame in Cleveland. These represent interstate markers, not UPCP practices.
These little stars represent the communities that UPCP practices are located
in, primarily centered in Cuyahoga County, which is where Cleveland is, but
extending out into surrounding counties as far west as Sandusky and as far
east as Ashtabula County.

There are 45 primary care and 25 specialty practices involved. They see
between them almost 1.3 million patients every year. The important thing to
remember about these practices is that even though they are owned by a large
delivery network entity, University Hospitals, they really largely maintain the
look and feel of small independent community practices. They share some
management and information technology services, but essentially when they join
UPCP they sign over their leases, they get some malpractice coverage, and they
don’t move.

The average size of these practices is four physicians, the largest is 13. There are
lots of onesies-twosies in the mix. What was nice about this as a test setting
is that we felt that we would be able to examine representative work flow
issues related to e-prescribing in small community practices, yet have both the
technology management and testing advantages of having these practices be part
of a large managed care organization. The ownership by the delivery network
doesn’t really change the fundamental work flow issues at the individual
practice level.

Importantly, UPCP began a pilot test with OnCall Data, InstantDx’s
e-prescribing application. Back in probably mid-04, they first addressed a
couple of their practices, and then began rolling it out in earnest in late ’04
and early ’05. So by the time it came to responding to the RFA, there really
was a mature, stable e-prescribing deployment underway. Stable probably isn’t a
good word, in that there really has been rapid growth, even since the
application went in in October. So we knew we didn’t have to worry about
fundamental implementation issues and fundamental adoption issues, that we
would be able to have a testing ready environment for the initial standards.
And of course, the foundation standards, particularly the SCRIPT new
prescription and renewal messaging transactions and the X12 eligibility
transactions, have been operational throughout. Two of the other initial
standards, formulary and medication history, which were considered briefly as
being part of the foundation standards, were also operational. We are
happy that they got moved into the initial standards category, because we think
that there is a great testing opportunity for those components as part of this.

I am not going to put up a slide with the objectives and goals. They are
very similar to the other projects, and you are all familiar with them. They
were largely prescribed by the RFA.

Here are some adoption numbers from UPCP practices. These are the total
numbers of prescriptions per month, from January of ’05 through last month,
January of ’06. You can see a fairly steady growth from about 7,000 physicians
a month last January — sorry, 7,000 prescriptions a month to over 30,000 this
January. Total for 2005 was about 185,000 prescriptions that were created
within OnCall Data. When we submitted the application in October, the numbers
that we included in the application from August and September were just under
20,000 prescriptions a month. That has now increased by 50 percent just in the
last several months.

Now, if you look at the pattern of adoption, it is very interesting and it
presents some opportunities for analysis. The application has been made
available to 52 of the practices of UPCP. Seventy-two practices have been
quote-unquote enrolled in the electronic prescribing program and have the
option of using it. But in fact, much of the adoption is actually focused on
probably about two and a half dozen of the practices, and physicians
individually are creating between zero and 800 prescriptions a month, and less
than half of the physicians, about 74, are responsible for 94 percent of the
volume. These are numbers from last fall. We are looking at current numbers
now.

These are the primary care practices. These individual bars represent
months. So here is a cluster of months in an individual practice, this is
practice 13. So here is January, February, March, April, May. You can see
fairly high utilization that is stable across time.

Here is another practice, number two, which probably started their
implementation in March or April, and you can see steady increase. But there
are a number of other practices that have had the application available among
these 32 primary care practices, where utilization is either non-existent or
very low.

The same story in the specialty practices, although you have even sparser
adoption in terms of the 20 specialty practices, at least as of last fall, that
had jumped on the bandwagon. So there is an opportunity here to examine
adoption issues.

This year, UPCP’s focus is on increasing utilization rather than rolling
out to additional clinics. One of our concerns was that, because adoption was
so good, there wasn’t going to be an opportunity to do pre or post types of
analyses. But we will be able to get in
there and do some premeasurements as some of these newer practices come on
board. Plus, we will be able to look at retrospective data, at least from the
practice management system and claims data to look at error and cost issues pre
adoption.

I mentioned that this was foundation standard production ready. This is
from an eligibility transaction, where the application is aware of patient and
prescription benefit eligibility. Last August there were 8,630 successful
eligibility transactions with RxHub, out of a total opportunity of 17,000,
which represents about a 44 percent hit rate. This is compatible with RxHub’s
known coverage in their MPI in Cleveland.

Also, you probably can’t see it, but there is a button here for downloading
drug history from the PBM. In this particular application, an end user has to
click on this button and wait two to four seconds for the prescription history
to come up.

In January, just last month, even though this was made available late last
year, there were only 52 prescription claims history requests. So even though
there were thousands and thousands of eligibility transactions, there were a
relatively small number of claims history transfers, even though technically it
was operational. This is one of the things we want to try to understand; is
that because of a work flow issue, is it because of a usability issue, is it
because the medication history isn’t useful in general, or maybe it is not
useful the way it is displayed in this particular application. But there are a
number of issues related to medication history that we want to look at.

These are RxHub’s coverage numbers from last summer in the
Cleveland-Northeastern Ohio metropolitan statistical area, about 50 percent,
which is consistent with the eligibility hit rate that we have seen. One of the
things that we want to look at is what is the impact of the Part D benefit on
eligibility hit rate in this region. There were a number of Medicare
beneficiaries that were represented in RxHub’s MPI pre-MMA. A number of their
employers have terminated their employer-sponsored drug benefit. Unless their
Part D plan is participating in RxHub’s MPI, then some of those patients may
drop out. So it is not at all clear to us whether the hit rate is going to go
down or up, at least in the early going, as a result of Part D.

In Cleveland, in the first half of last year, there were 43,000 eligibility
hits in absolute numbers, and those were all coming from OnCall Data at
UPCP.

Just an interesting aside. EPIC, which has a large footprint in Cleveland at
the Cleveland Clinic, and also at Metro Health with another 400 docs on
EpiCare, is reportedly going to be live on RxHub eligibility transactions
sometime this spring or early summer. Once that happens, these numbers are
going to go through the roof.

On the pharmacy side, most of the transactions are going to either CVS,
Walgreen’s or Rite Aid. We are currently working with SureScripts to try to
ascertain which of those three, or possibly more than one, they will turn on
for us to study. So that is one of the things we are working through.

One of the things that I wanted to mention about medication history and why
it is so important that it be studied as part of these pilots is that there is
a clear trend in terms of increasing availability of medication history in
e-prescribing applications. If you look at RxHub eligibility transaction growth
over the last three years, I don’t have the end of year numbers, but probably
over 20 million eligibility transactions, compared to under two million a few
years earlier.

But way back in the beginning there were hardly any medication history
transactions compared to the eligibility transactions. That is because the
early vendors that hooked up RxHub to do eligibility checks have a lot of
additional work in their application to get the medication history pieces
working, just from an application, user interface and data structure
perspective. So there was a 12 to 18 month lag from the time a vendor was
certified to do eligibility checks to the time they were ready to do medication
history transfers. We have started to see that over the last two and a half
years, those numbers have started to catch up. So where the ratio was initially
10,000 to one, last year it was closer to 12 eligibility checks for every
medication history check.

In spite of the fact that there were nearly two million medication history
transfers into ambulatory care e-prescribing applications last year, I’m not
aware of a single published impact evaluation. So it is wonderful that the
pilot is occurring, and to the extent that we and the other projects are going
to look at medication history, I think it is a very important opportunity.

What are our data sources that we have to work with in the project? We have
got transaction logs and basic utilization statistics from the e-prescribing
application itself. We have got patient demographics and ambulatory care visit
codes available from NDCHealth’s concept application, which is a practice
management system. We have also budgeted to license data from NDCHealth’s
integrated health record, or IHR, product. This is data that they cull off of
their adjudication transactions between the pharmacy and the PBMs, and it is
a rich source of data that we are going to try to link up with some of our
other data to expand our ability to look at safety and cost issues. And of
course, we have got data from QualChoice and Aetna.

It is the new data sources that we are struggling with right now in terms
of work flow measurements, surveys, interviews, call monitoring and the like.
We may end up doing some chart audits particularly for a nested medication
history reconciliation study, but we want to try to limit those.

The consultants on the project, MGMA are Terry Hammons and David Gans.
These are two of the lead authors of the MGMA EMR adoption survey that was
published last September in Health Affairs. So these guys are well versed in
practice-based research. Brian Dowd and John Kralewski are two Ph.D.s from the
University of Minnesota that also happened to be involved in the MGMA EMR
survey, but they bring a lot of added value here, in that they have done some
interesting work related to practice culture and technology adoption, and also
to the use of claims data analysis for adverse drug event detection.
And of course, Jeff Taylor is going to help out on the formulary front.

Mike Nochomovitz, a pulmonologist, is the director of UPCP. He is
extremely energetic. He is on the board of MGMA. He is the one who pulled Terry
Hammons into the project. He is able to make a lot of things happen, and he is
really a driver behind this. Don Barich is the actual principal investigator on
the project, and he is the director of quality at UPCP. Without him we wouldn’t be
able to get into these practices and make this happen in the trenches. Dr.
Petrulis is Ohio KEYPRO’s chief medical officer, and Meghan Harris is the
project director at KEYPRO.

This is our technology time line. What little hair I have left, I won’t
pull out over anxiety about getting the technical pieces of this done, not
meaning to trivialize the technical implementation aspect of this. We already
have much of this in production. InstantDx is ready to do fill/no-fill and
change/cancel. It is really more a timing issue of when we are ready to train
and test. So it is only the RxNorm and structured SIG components, and
potentially prior auth at the tail end of the project, that we
are going to have to grapple with from a technical standpoint. It is really the
measurement piece — I am going to wrap up with this slide — it is really the
measurement piece that we are waist deep in now. We have got a lot of work to
do to set up our data center at KEYPRO, in getting test data runs from these
various data sources, and starting to do some preliminary analyses, and trying
to ascertain what we can and can’t do with the data. Similarly, our adoption
and call volume metrics and methods really need to be pinned down. So we have
got a lot of work to do over the next four to six weeks just to get caught up.

And with that, I’m done.

MR. REYNOLDS: Even though the government didn’t give you extra time, we
did. We’d like that noted.

Before I open it to questions, I guess wow would be what I would say. To
see it go from hearings and letters and collaboration we saw from the industry
before, and to — while I commend the government for their selection of the
people that they selected, all of you clearly understand the issue and your
time frames. I am incredibly impressed. So I would like to open it to the
committee for questions, if anybody has any.

MR. BLAIR: I just really have to commend all of you. You really are
winners. There were a lot of good choices, and the diversity and the integrity
of the evaluation process which you all have really bodes well for us getting
good data and value out of this.

The problem maybe that I have, because I think there are going to be good
results out of here, and it sounds as if many of you are pushing the limits as
to what kind of information we could get. If you are going to push the limits,
here are the areas where I would love to see the limits pushed. If there is any
possibility in any way, shape or form for any of you to include some degree of
controlled substances within your testing, that would be wonderful.

PARTICIPANT: We do that.

PARTICIPANT: We do it.

MR. REYNOLDS: You’re pushing the limit.

MR. BLAIR: Well, that is what I said, I am pushing the limits. The other
one I am pushing the limits —

DR. LAPANE: But with some of these software products, we could ask them to
collect information on what could be possible if it were permitted.

MR. BLAIR: I understand. The other area is the area that you are required
to do evaluations on with RxNorm, comparing it to NDC codes. If any of you have
the opportunity to go farther than that in terms of the use or usability of
RxNorm, that would certainly be helpful. Those would be the two areas that
would be nice, if it is possible.

The other thing that I would — and this is partially a question — and
Maria, the compliments partially go to you too, in terms of managing this
transition.

MS. FRIEDMAN: Thank you. Actually it goes to John White as well. The two of
us, we spent all year getting this through. But it has been a collaborative
partnership between CMS and AHRQ.

MR. BLAIR: So here is my last — this is actually a question. That is, when
these pilot tests are done, I know there is going to be some process that AHRQ
and CMS is going to have to gather all of the information. As important and
valuable as it was to have all of them testify here, I don’t know if there has
been thought given yet to how that will be pulled together. I think the
subcommittee would certainly be interested in these results, not only because
of the question of the interoperability of the standards, the basic missions of
the pilot test, but this is the first time that in the process of moving health
care closer to using information technology that I am aware of, where we have
had anything like this, where we have had broad, wide scale well thought out
pilot tests of leading edge standards. If we could learn anything from the
process as well as the results of the pilot tests, that would be helpful.

DR. WHITE: This is John White from AHRQ. I am going to chime in. I’ll
answer your question directly, and then give the rest of my presentation.

There is money that CMS has put up for an evaluation contractor. We are in
the process of getting that laid out. We are going through the federal
procurement process, so I can’t talk too explicitly about it.

But the short answer is, there are three things that we are going to be
looking for the evaluation contractor to do. The first is to help us hold
regular discussions and meetings and do regular information gathering from the
grantees and from Dr. Elson’s project as well. The second is to provide
technical support and expert advice to the grantees, because some of us are
feeling our way through, and we know that need is going to arise. The third is
the actual evaluation; specifically, the CMS legislation that makes us do this
tells us that we need to report this all out to Congress by April 1 in a full
report.

Now, as you can all probably appreciate, we have asked these people to try
to reach for the moon, and they are doing a fantastic job. At the same time,
there are certain things that they need to get through, and they understand
that.

In order to get through the process of writing up the report to Congress
about their work, we need to be writing as we go. The closeout date for the
grantees’ work is December 31, and the closeout reporting date for the
information is January 31, at which point the report to Congress will need to
enter the clearance process. So therefore, you all can appreciate that there is
not going to be a whole lot of time to mull this over.

That said, what you are describing is really important. In addition to
doing this, I do a lot of other IT projects at the agency, and you are exactly
right, it is ground breaking, but very exciting to be involved with, and we
really do want to capture lessons learned about that.

I would welcome any thoughts or structure that you have to that, you or
anybody else. Feel free to send them on to me. I am involved in structuring
evaluation contract awards. So if you ask Maria, she would be happy to give you
my e-mail, and let you send those comments on to me.

MS. GREENBERG: Thank you, John.

DR. BICKFORD: Throughout the presentations there were discussions about the
prescribing providers being physicians. In many instances we have advanced
practice nurses functioning in that capacity as a prescriber. I was wondering
if there was a plan to differentiate those who might be physicians versus
advanced practice registered nurses in the prescribing process.

DR. LAPANE: Can you repeat the question? It was coming in and out.

MR. BORDELON: The way I heard the question was, are we going to be
extending beyond just practicing physicians into advanced prescribing nurses
and other types of prescribers.

DR. LAPANE: Our project is.

MR. BORDELON: In the long term care project, we will be using nurse
practitioners. We will also have traditional physicians prescribing, but
probably a majority of the medications will be prescribed through an agent
model by the nursing staff. So all three of those models will be represented.

DR. BICKFORD: There are also nurse midwives, nurse anesthetists and
clinical nurse specialists who might be involved in some of the practices as
well; they might be working in a collaborative practice model. So I was asking
for clarification, if that was going to be part of the —

DR. ROTHSCHILD: We are going to be looking at clinics that have house
staff, but I don’t believe they have nurse practitioners.

DR. BELL: We have tried to use the word prescriber, and I apologize if we
slipped into physician in my presentation. But we are interested in both. My
understanding is that some of the practices in New Jersey do have nurse
practitioners prescribing, probably only a minority, so it is not clear yet if
we will get those or not.

MR. REYNOLDS: I have one other question. As we have heard in
e-prescribing along the way, NCPDP and other standards organizations have been
closely involved in the process. How are they aligning? Are they just watching from a distance, or are they staying close?

Some of you mentioned you are pushing the envelope on some of the
standards. There are some things that maybe aren’t fleshed out as far as they
would like to be. Are entities like that staying close to the process?

MR. WHITTEMORE: NCPDP clearly is. In fact, there are a number of
the initial standards that, I think you saw from the presentation, everybody
has pushed them to the back of the year because they aren’t standards yet. But
there are work groups within NCPDP that created a guidance specifically for the
pilots, to tell us how we can approach this and get this done in 2006.

MR. REYNOLDS: So basically it sounded like you are outside of the envelope,
but being written on the envelope. That is what I wanted to get a
clearer picture of.

DR. BELL: The RxNorm group within NCPDP issued their guidance in January,
so we looked at that carefully. Then for structuring codified SIG, they are
planning to issue it by the end of March, is my understanding. We have had the
chairman of that task group talk to our group, and also talk to the AHRQ
kickoff meeting. So we have some advance knowledge of what they are planning.

MR. REYNOLDS: All this collaboration has been so successful from all of you
sitting over there up to now, that it is continuing.

MR. BORDELON: Last year NCPDP started work group 14, which is specifically
focused on long term care, and we spun out an e-prescribing task group out of
that. I lead that group. In fact, I just gave this presentation to that group
Wednesday this week.

MR. REYNOLDS: I wasn’t singling out just NCPDP, just making sure. It wasn’t
clear exactly how everybody was grouping up. That’s good. Any other
questions from the committee?

Again, kudos. Incredible work in a fast period of time. As all of us run
big projects in our day jobs, you are doing a great job. Congratulations, and
we appreciate all of you being willing to come and talk to us today. Thank you
very much.

From a committee standpoint, we have had a lot of discussion. Some of us
have been discussing since Tuesday. So I would like to open it if there are any
other discussions. If not, we are set for our April hearing. We have already
got an agenda item for July. We will be looking at the 5010 and starting our
discussion of that based on the annual report from DSMO.

Is there any other comment from the committee?

MR. BLAIR: You did well.

MR. REYNOLDS: If not, other than Jeffrey’s comment, we stand adjourned.
Thank you very much.