EHRs and E/M Coding: Warnings, Pitfalls, and Best Practices

Take the time to thoroughly understand and customize your EHR.

By Michael Stearns, MD, CPC, CFPC

Most electronic health record (EHR) systems include software to help providers determine the appropriate evaluation and management (E/M) CPT® codes for patient encounters. Used correctly, these tools support accurate coding based on medical necessity, and they have been associated with generally higher levels of E/M coding. However, when I recently evaluated a number of EHR products’ E/M coding capabilities, I found significant software design flaws, inadequate implementations, and a general lack of user knowledge about how the E/M coding systems function.

The Office of Inspector General (OIG) is concerned about EHRs “assisting” providers with coding and documentation decisions, but there has been little external testing of how EHRs capture and use information to recommend E/M codes. The findings in this article will help the reader better understand the limitations of EHR-based computer-assisted E/M coding (CAEMC). The evaluations were performed in provider settings. The reviewed EHRs had a reported combined client base of more than 80,000 physicians, which makes these findings partially representative of the overall EHR user population. Each EHR evaluated took a markedly different approach to how it captured, calculated, and displayed information relevant to E/M coding.

EHR Evaluation Shows E/M Level Discrepancies

All of the evaluated EHRs included features that supported (at least partially) accurate E/M coding, but all had discrepancies that could cause inaccurate coding. In most healthcare settings, the physician is responsible for choosing the code, which places the liability for incorrect coding solely on the provider’s shoulders. To avoid denials, rejections, penalties, and even accusations of fraud, providers and coding professionals should understand how EHRs are designed, and their limitations.

One consistent trend was the EHRs’ inability to automatically identify key data elements related to the complexity of medical decision-making (e.g., a provider documented that he or she reviewed images). This suggests that, in general, EHRs are not capturing key encounter information necessary to support accurate CAEMC. Providers often rely on EHRs to guide their coding-related documentation. A system’s inability to capture key E/M-related information in a structured format can lead to documentation errors and to suggested E/M codes below what should have been reported, resulting in lost revenue.

In other instances, EHRs generated higher-level E/M codes than were supported by documentation, primarily through the inclusion of irrelevant information (by default) or sections of the record that were inappropriately “cloned” (i.e., copied from previous records and pasted into the current document). All the EHRs reviewed supported the ability to clone information from other areas of the record, but none gave any warning that a section of the record was copied and might contain inaccurate information. Auditors are now using anti-plagiarism software and other methods to detect EHR record cloning, and EHRs will need to provide more sophisticated tools to ensure cloned and templated information is accurate and modified appropriately.

EHR users were commonly unfamiliar with coding guidelines and with how to use their EHR’s coding tools proficiently. With one exception, the vendor documentation available to users during this evaluation was nearly devoid of the information necessary to use the coding tools with any degree of sophistication.

The Most Common EHR Issues

CAEMC errors were organized into categories. Although the features of each EHR system varied, the following issues were identified in the majority of the EHRs:

1. Programming errors

a. Inaccurate calculation of levels of service (E/M codes) based on information documented in the record.

b. Highly complex software applications made it challenging for users to modify how E/M coding information is recognized and managed by the EHR. For example, it was difficult for users to create content and set system defaults relevant to E/M coding.

2. Education and training issues

a. Inadequate staff training in best coding practices for the practice’s particular EHR.

3. Coding guideline discrepancies

a. Comparing how each system determined E/M codes against what is required in CMS’s 1995 and 1997 Documentation Guidelines for Evaluation and Management Services revealed errors in EHR coding tools related to:

Inability to recognize key elements of the history (e.g., history of present illness (HPI); past medical, family, and social history; and review of systems (ROS)) that are used to determine the overall level of service for the encounter (i.e., the E/M code).

Deficiencies in how the number of diagnoses (and their statuses), the level of risk, and the amount of information reviewed are used to determine the overall level of medical decision-making for the encounter.

Deficiencies in how the three key components of the encounter (i.e., history, examination, and medical decision-making) are combined to determine the final E/M code.

Inability to recognize documentation conflicts within the record (e.g., when information in the HPI is in direct conflict with information in the ROS section of the note).

b. Inability to recognize when cloned information contains obvious documentation errors.
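The combination step that the guidelines describe is mechanical: for new patient office visits, all three key components must meet or exceed a level; for established patients, only two of three must. A minimal sketch of that rule (simplified and illustrative only; component levels are assumed to be already determined, and the code-family mapping in the comments is just an example):

```python
def em_level(history: int, exam: int, mdm: int, new_patient: bool) -> int:
    """Combine the three key component levels (1-5) into a final
    office-visit E/M level per the 1995/1997 CMS guidelines:
    new patients require all three components at or above a level;
    established patients require two of three."""
    levels = sorted([history, exam, mdm], reverse=True)
    # New patient: the weakest component limits the final level.
    # Established patient: the second-highest component sets the level,
    # because only two of the three must meet or exceed it.
    return min(levels) if new_patient else levels[1]

# Example: detailed history (3), expanded exam (2), moderate MDM (4)
print(em_level(3, 2, 4, new_patient=True))   # 2 (e.g., a 99202)
print(em_level(3, 2, 4, new_patient=False))  # 3 (e.g., a 99213)
```

The same documentation therefore supports different codes for new versus established patients, which is exactly the kind of distinction a flawed coding tool can get wrong.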

Work-arounds and Best Practices

Despite the challenges associated with EHR design and use, if implemented and used properly, these systems have the potential to improve coding accuracy. Some practices have seen remarkable improvements in their documentation and coding, but only after they took the time to thoroughly understand the inner workings and customization of their EHR. This more accurate coding has led some practices to see revenue increases in excess of $60,000 per provider, per year.

The majority of EHR users, however, have not invested the time necessary to optimize their EHR’s billing elements. They tend to rely on the EHR’s suggested E/M code and assume the EHR vendor’s tools will protect them from coding at a level not supported by documentation, or by medical necessity. They, too, may see compensation increases, but these increases are associated with coding errors induced by poor EHR usage and design, which puts providers at significant risk for negative consequences.

In general, providers need to have greater knowledge of basic E/M requirements and how their EHR interprets and supports these requirements.

The following steps are recommended for EHR users:

Begin by reviewing relevant E/M coding user documentation provided by your EHR vendor. Include the actual E/M coding modules as well as default settings and content modifications that influence how relevant E/M data elements are entered into the clinical record.

Unfortunately, your EHR may not have documentation at the required level to fully understand how E/M codes are determined. Getting help from a knowledgeable vendor representative or having your system assessed by an independent third party may be necessary.

Review the basic elements of your EHR’s E/M coding, including which data elements in the history, physical examination, studies reviewed, assessment, and plan of the record have E/M relevance in both the 1995 and 1997 guidelines. For many providers, becoming familiar with E/M coding requirements published by individual payers may also be necessary. Learning and memorizing this level of detail is very challenging for most providers, but when it’s learned while receiving feedback from the EHR, providers become more sophisticated with E/M coding concepts.

The most challenging step is to identify areas of deficiency in the E/M coding tool so users can avoid potential pitfalls. A common example is the EHRs’ tendency to add irrelevant information to the clinical record through templates or default entries.

Providers and their coding professional advisors must be certain that superfluous information (e.g., a 12-system ROS and/or irrelevant family history information in an uncomplicated follow-up visit, or non-relevant diagnoses in the assessment section of the note) does not appear in the encounter note. Such detail may be seen as not medically necessary, and may trigger the CAEMC tools within an EHR to suggest a higher-than-justified coding level.

EHRs tend to “dump” noncontributory information into clinical encounter documents, and auditors are learning how EHRs may cause coding errors. Providers must determine which information is medically relevant to document, and either change the default setting in the EHR or make sure that the EHR’s coding tools do not use this extraneous information to make the final E/M coding recommendation. Providers must also be vigilant when reviewing information that has been cloned from another encounter note. This information needs to be updated, and made specific and relevant for the current patient encounter.
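To see how defaulted content inflates a suggested code, consider the history component. A simplified sketch using thresholds paraphrased from the 1995/1997 guidelines (it omits details such as the chronic-condition alternative for the HPI and the stricter PFSH requirement for new patients, so treat the numbers as illustrative):

```python
def history_level(hpi_elements: int, ros_systems: int, pfsh_areas: int) -> int:
    """Return the history level (1=problem focused, 2=expanded problem
    focused, 3=detailed, 4=comprehensive). The level is capped by the
    weakest of the three history elements."""
    # Brief HPI (1-3 elements) caps history at expanded problem focused.
    hpi_cap = 4 if hpi_elements >= 4 else 2
    # ROS: none -> 1; one system -> 2; 2-9 systems -> 3; 10+ -> 4.
    if ros_systems >= 10:
        ros_cap = 4
    elif ros_systems >= 2:
        ros_cap = 3
    elif ros_systems == 1:
        ros_cap = 2
    else:
        ros_cap = 1
    # PFSH: none -> 2; one area -> 3; two or more areas -> 4.
    pfsh_cap = 4 if pfsh_areas >= 2 else (3 if pfsh_areas == 1 else 2)
    return min(hpi_cap, ros_cap, pfsh_cap)

# Template dumps a 12-system ROS and full PFSH: comprehensive history.
print(history_level(4, 12, 2))  # 4
# Same visit documented with only the relevant single-system ROS.
print(history_level(4, 1, 0))   # 2
```

The identical clinical encounter scores two levels higher when the template’s defaulted ROS and PFSH are left in place, which is precisely the extraneous detail an auditor may flag as not medically necessary.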

Empower Through Education

Most providers in the United States are relatively new to using EHRs and, in particular, to how EHRs influence or support their coding behavior. Further study of EHR systems, their vendors, and how the systems are used in “live” settings is ongoing. EHR CAEMC tools have not been subjected to a certification process; this may need to be considered as a future direction. The majority of EHR users need to acquire a much deeper understanding of how EHRs determine E/M codes. Because each EHR system is unique, users must become familiar with the nuances and shortcomings of their particular EHR system. This knowledge will empower providers to customize and use their EHRs to support accurate E/M coding.

Michael Stearns, MD, CPC, CFPC, works as a healthcare compliance and health information technology consultant. He has over 12 years of experience designing and implementing EHRs, and has created requirements for computer-assisted coding software applications for three major health information technology companies. Stearns is a member of the Austin, Texas local chapter. He can be reached at mcjstearns@gmail.com.

Renee Dustman is executive editor at AAPC. She has a Bachelor of Science degree in Journalism and a long history of writing just about anything for just about every kind of publication there is or ever has been. She’s also worked in production management for print media, and continues to dabble in graphic design.