Methods: SARS-SP developed, distributed, and updated an Internet-based triage form to identify patients for
infection control and public health reporting. EDs were then invited to use the REMI Internet application to report frequencies of visits with various SARS
syndrome elements to local public health authorities. After pilot-testing in one metropolitan area, the surveillance system was implemented in three others.

Conclusions: ED and public health partners reported being satisfied with the system, confirming the usefulness of
Internet tools in the rapid establishment of multiregion syndromic surveillance during an emerging global epidemic.

Introduction

On March 15, 2003, CDC urgently requested health-care and public health agencies to conduct surveillance for
severe acute respiratory syndrome (SARS) (1), a pneumonia later attributed to a newly discovered coronavirus (SARS-CoV). SARS had spread rapidly by air travel to three continents and appeared to be highly infectious to health-care workers and patients in health-care settings (1). The cause of SARS was then unknown, and diagnostic tests were lacking. Basic epidemiologic facts (e.g., the range of clinical symptomatology, whether persons with mild or asymptomatic infection could transmit disease,
and the range of possible routes of infection) were unknown. Minimal assurance could be given that SARS was not
already
circulating in the United States. As a result, public health systems had to deploy complex, rapidly changing measures
to protect health-care facilities and to take an agile approach to surveillance.

Frontlines of Medicine (http://www.frontlinesmed.org) is a collaborative of emergency medicine, public health,
and informatics professionals organized to enable better public health surveillance of emergency department (ED) information (2). Frontlines of Medicine created the SARS Surveillance Project (SARS-SP) workgroup to develop, disseminate, and update a practical screening (case-finding) form for potential
SARS patients in EDs. The form was used to measure daily ED
volumes of SARS syndrome elements. These counts were transmitted and assembled regionally by using
EMSystem® Regional Emergency Medicine Internet (REMI).* Because EMSystem was in use in 26 cities (Figure 1), syndromic surveillance developed in one city was presumed to be portable to multiple urban areas quickly and
inexpensively.

The objectives of SARS-SP were to 1) create, disseminate, and update SARS screening forms for ED triage, 2) conduct
SARS surveillance by using REMI, 3) expand surveillance to multiple
regions, and 4) evaluate the usefulness of Internet tools for
agile surveillance during a rapidly emerging global epidemic. SARS triage forms and surveillance were field-tested in
Milwaukee, Wisconsin. The form was then distributed over the Internet, and three other urban regions initiated surveillance.

Methods

Case-Finding Triage Forms

A single-page screening form was created for ED triage personnel (available at
http://www.frontlinesmed.org/SARS-SP). The form was designed to 1) identify patients requiring
immediate infection control and public health notification (case
finding) and 2) facilitate counting and reporting to public health officials the number of daily visits featuring SARS syndrome elements for time-trend surveillance. Three check boxes recorded the presence or absence of the following elements of the SARS
case definition (hereafter referred to as "SARS elements"): fever (history or finding of temperature >38ºC); respiratory findings (i.e., cough, shortness of breath, difficulty breathing, pneumonia, or respiratory distress syndrome); and either recent travel to locations associated with SARS transmission or contact with a suspected SARS patient (hereafter referred to as "SARS
risks"). Pulse oximetry <95% was recorded separately.

Screening was originally recommended only for patients with fever; later, after CDC recommended assessing patients
for possible SARS on the basis of either fever or respiratory symptoms, triage personnel were instructed to screen patients
with either complaint. The screening form encouraged ED staff to telephone the local public health authority immediately for any patient with the triad of fever, respiratory findings, and SARS risks.
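The decision logic described above can be sketched as follows. This is an illustrative reconstruction of the published rule, not the actual form's wording or workflow:

```python
def triage_action(fever: bool, respiratory: bool, sars_risk: bool) -> str:
    """Sketch of the screening rule: the full triad triggers an immediate
    call to public health; fever or respiratory complaints alone trigger
    completion of the screening form for daily tallying."""
    if fever and respiratory and sars_risk:
        return "isolate patient; telephone local public health authority immediately"
    if fever or respiratory:
        return "complete screening form and record SARS elements"
    return "no SARS screening indicated"

print(triage_action(True, True, True))
```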

On March 17, 2003, forms were distributed to Milwaukee EDs via REMI. On March 30, revised forms were posted
online, and the national membership of the American College of Emergency Physicians (ACEP) was notified by e-mail of the screening form website. Persons downloading forms were invited to enter an e-mail address to receive notification of updated forms and to participate in the voluntary syndromic surveillance
effort. Screening forms were revised twice (and registered users notified) to match changing CDC recommendations.

Syndromic Surveillance

The Milwaukee Health Department (MHD) invited local EDs to report daily visit totals and the numbers of
screened patients sorted by mutually exclusive combinations of SARS elements (e.g., fever only, fever with respiratory findings
only, respiratory findings with SARS risks only, etc.). Because little was then known of the clinical spectrum of SARS
infection, surveillance was performed for each clinical element so that health authorities could be alerted to rising rates of febrile or respiratory illness even if patients failed to meet CDC criteria for SARS diagnosis (Figure 2). The reporting system was
similar to that employed in Milwaukee the previous summer during the 2002 Major League Baseball All-Star Game using
EMSystem (4,5).
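The sorting of forms into mutually exclusive element combinations can be sketched as follows. The field names and dictionary representation of a form are illustrative assumptions, not the actual SARS-SP data format:

```python
from collections import Counter

# Each screening form reduces to three booleans:
# fever, respiratory findings, and SARS risk (travel/contact).
ELEMENTS = ("fever", "respiratory", "risk")

def category(form):
    """Return a mutually exclusive category label, e.g. 'fever+respiratory'."""
    present = [e for e in ELEMENTS if form.get(e)]
    return "+".join(present) if present else "none"

def daily_counts(forms):
    """Tally one 24-hour batch of forms into exclusive categories."""
    counts = Counter(category(f) for f in forms)
    counts["total_visits"] = len(forms)
    return counts

forms = [
    {"fever": True},
    {"fever": True, "respiratory": True},
    {"fever": True, "respiratory": True, "risk": True},  # triad: call public health
    {"respiratory": True},
]
print(daily_counts(forms))
```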

Detailed instructions were e-mailed to ED managers and posted on REMI for reference, with a follow-up conference
call. MHD staff provided assistance as needed. Designated ED staff collected all screening forms for 24-hour periods and sorted them into mutually exclusive sets of SARS elements. REMI automatically reminded EDs daily to enter the previous day's totals on a screen designed for that purpose. Only authorized staff could enter or view surveillance data.

Only visit counts were entered into REMI; no personally identifiable health information was transmitted. Triage
personnel stamped each form with patient identification and retained completed forms in case public health investigation of a particular patient was needed. If REMI reports included visits with the triad of fever, respiratory illness, and SARS risks, public health officials could ask the ED to identify the patient.

During nationwide dissemination, those who downloaded screening forms were asked if they would conduct
syndromic surveillance. If ED staff expressed interest, the state or local public health agency offered assistance. If EMSystem was not already in use in the area, local interface screens, log-on
accounts, server accounts, data storage, and 24-hour/day
technical assistance were offered at no charge to EDs and health departments, using existing EMSystem infrastructure.

Participating public health staff used password-protected
accounts to download daily jurisdiction-specific data from REMI as
a tab-delimited spreadsheet. Each health department had
exclusive access to its local data and controlled how it was analyzed and
acted on. Milwaukee data were also downloaded remotely
at CDC for analysis with the Early Aberration Reporting System (EARS) to
test the feasibility of remote analysis (6).
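Aggregating such a tab-delimited download across EDs can be done with a short script. The column names below are assumptions for illustration, not the actual REMI export schema:

```python
import csv
import io

# Hypothetical tab-delimited export: one row per ED per day.
SAMPLE = (
    "date\ted\ttotal_visits\tfever_resp\n"
    "2003-04-01\tED-A\t310\t4\n"
    "2003-04-01\tED-B\t275\t2\n"
    "2003-04-02\tED-A\t298\t7\n"
)

def jurisdiction_daily_totals(tsv_text):
    """Sum counts across EDs to yield one row per surveillance day."""
    totals = {}
    for row in csv.DictReader(io.StringIO(tsv_text), delimiter="\t"):
        day = totals.setdefault(row["date"], {"total_visits": 0, "fever_resp": 0})
        day["total_visits"] += int(row["total_visits"])
        day["fever_resp"] += int(row["fever_resp"])
    return totals

print(jurisdiction_daily_totals(SAMPLE))
```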

Surveys

Participating health department surveillance coordinators provided summary statistics and impressions of the project. In July 2003, surveys were also sent to nurse managers at the 13 participating Milwaukee-area EDs.

Results

During May--September 2003, a total of >500 SARS-SP website hits were logged, and 257 persons requested
e-mail notification of screening-form changes. Much smaller numbers visited the site after receiving e-mail notification of revised forms. The total number of EDs or clinics that used the screening form is not known.

During March 19--June 25, 2003, a total of 13 Milwaukee-area EDs participated in syndromic surveillance of
105,669 visits. Three other metropolitan areas (Denver, Colorado; Akron, Ohio; and Fort Worth, Texas) established ED
syndromic surveillance with reporting to health authorities. During April 23--May 31, 2003, nine EDs in Denver, Colorado, that
already used REMI sent surveillance information on 16,997 encounters to the Colorado Department of Public Health
and Environment (CDPHE). During May 1--June 1, 2003, three EDs in Akron, Ohio, reported information from
12,939 encounters to the Akron Health Department (AHD). Neither the hospitals nor AHD had previously used REMI.
During May 12--October 12, 2003, two hospitals in Fort Worth, Texas, that already used REMI reported on 10,941 encounters
to Tarrant County Public Health (TCPH), with surveillance continuing beyond October. EDs in eight other cities
expressed interest in daily syndromic surveillance, but efforts to recruit a public health agency failed in seven. The eighth city initiated a surveillance pilot in fall 2003.

Only one person in all four cities ultimately met the CDC criteria for possible SARS, and no confirmed cases were
reported. Thus, neither case-finding sensitivity nor specificity can be measured. During March 15--October 1, 2003, three of the
four jurisdictions investigated 42 potential SARS cases, of which 22 (52%) were prompted by the triage form. In Milwaukee,
five investigations originated from telephone calls about positive ED triage forms; four originated from REMI electronic reports; and five originated outside EDs. All 13 investigations
by CDPHE began with REMI reports. All 15 TCPH
investigations began before initiation of SARS-SP surveillance and originated from nonmedical settings (e.g., airlines). No patient investigated for possible SARS had visited a participating ED without being detected by the screening form.

The median percentage of surveillance period days for which participating EDs reported syndrome frequencies electronically by using REMI was 89% (range: 52%--100%). The most common data-quality problems cited by public health surveillance coordinators were nonreporting, reports lacking
total ED visit census, and errors in the date of surveillance; telephone
calls were sufficient to resolve these concerns. In Milwaukee, questions and data-quality concerns required frequent calls (7--9 daily) to and from EDs early in the project but only 1--2 calls by the end.

Resources did not permit on-site chart review to validate the accuracy of SARS element frequency reporting. Also,
the standard ED record would not necessarily collect SARS risk history (travel or contact) and thus is not an ideal standard for comparison.

Each city performed its own analyses of syndromic time-series data. Cross-city analysis was not performed. In
Milwaukee, staff graphed time series of SARS elements as crude counts, proportions of total ED census, and standard scores (i.e.,
the
difference of daily counts from the cumulative mean, divided by the standard deviation) to display significant aberrations
from the mean (Figure 3). The overall incidence rate of ED visits with each SARS element varied widely between cities, which is
not surprising given the different geographic areas and date ranges of surveillance. Local
surveillance-period incidence rates of ED patients reporting fever plus respiratory illness ranged from 0.33% in Akron to 1.4% in Denver. Two cities (Milwaukee and Fort Worth) investigated increasing syndrome trends; in both cases, telephone queries and record reviews by ED staff
proved sufficient to exclude SARS as the cause.
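The standard score described above can be sketched as follows. Because the exact baseline window is not specified, this sketch uses the full series as one simple interpretation of the cumulative mean:

```python
import math

def standard_scores(daily_counts):
    """Z-score of each day's count against the mean and sample standard
    deviation of the whole series (one reading of 'cumulative mean')."""
    n = len(daily_counts)
    mean = sum(daily_counts) / n
    sd = math.sqrt(sum((c - mean) ** 2 for c in daily_counts) / (n - 1))
    return [(c - mean) / sd for c in daily_counts]

counts = [4, 6, 5, 5, 12]  # hypothetical daily counts; spike on the last day
scores = standard_scores(counts)
print([round(z, 2) for z in scores])
```

A sustained run of high scores, rather than a single elevated day, would be the stronger signal of a rising syndrome trend.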

During March 22--April 20, 2003, CDC easily downloaded daily Milwaukee data for EARS analysis, but these files did
not include corrections made by local public health staff after telephone contact with EDs. Permitting online correction of data
files on REMI would enable more accurate remote analysis.

Six of 13 participating Milwaukee ED managers returned nonanonymous surveys. Four of six believed SARS screening
was performed as requested during all shifts. On a five-point scale ("strongly agree," "somewhat agree," "neutral," "somewhat disagree," and "strongly disagree"), five of six managers at least somewhat agreed they felt more secure knowing screening
was being performed and also that screening increased
the index of suspicion for SARS in their ED (one response to each item
was neutral). Four at least somewhat agreed that data tabulation and data entry were easy (with one respondent neutral and
the other somewhat disagreeing to both items). The average estimate for the time to complete
the form at triage was 2.6 minutes (range: 1--5 minutes;
median: 3 minutes), and the average estimate for daily tabulation and reporting was 17 minutes
(range: 5--45 minutes; median: 15 minutes). These times compared favorably with those estimated for syndromic surveillance
during the 2002 All-Star Game project, suggesting that surveillance anchored at the triage desk was more manageable. Two managers had participated in syndromic surveillance during the previous summer; both strongly agreed
that triage-based surveillance was superior, and both at least somewhat agreed that prior experience with REMI
surveillance facilitated the rapid start-up of SARS surveillance.

The four public health surveillance coordinators all reported that they were glad they had participated and were interested
in similar surveillance opportunities. Queried on ways to
improve the system, two coordinators stated that they wished they
had recruited additional EDs to participate, and two stated that they desired better communications between public
health agencies and ED staff.

Discussion

SARS traveled extremely quickly, and new information about the disease evolved at a similar pace. SARS-SP, a
rapidly organized, voluntary response, leveraged three capabilities to help clinicians and health officials keep pace: 1) interdisciplinary collaboration among emergency medicine, public health, and informatics; 2) an always-on, secure REMI network used in >24 metropolitan areas; and 3) rapid Internet information dissemination to clinicians. These were applied to two critical tasks: 1) helping ED staff detect possible SARS cases (case-finding) so they could protect patients, staff, and the community, and
2) establishing syndromic surveillance to warn local health officials if illness consistent with SARS was increasing in their communities. The latter was deployed because CDC's surveillance focused on identifying known or suspected SARS risks but might not alert authorities to illness from unsuspected SARS contact (e.g., from asymptomatic transmission or unreported cases).

Ready-to-use screening forms helped busy ED staff to consistently meet complex, rapidly changing CDC guidance.
ED triage (through which every patient passes early in an ED visit) was selected for case-finding and syndromic surveillance on the basis of ED workflow and previous experience. The 2002 All-Star Game surveillance project determined that relying
on treating staff to record syndrome data produced poor-quality surveillance data and substantial staff-time demands (4,5). In contrast, triage nurses equipped with a well-crafted
case-finding form could consistently "Screen---Isolate---Call
Public Health." Although the sample size was limited, ED managers in Milwaukee reported higher satisfaction, greater confidence in data collection, and more reasonable time demands from triage-based surveillance than from the earlier 2002
All-Star Game surveillance program.

Paper-based forms have important limitations. Manual data check-off, tabulation, and entry each multiply the risk of
data error and consume staff time. However, surveillance methods relying exclusively on mined data from existing registration, discharge, or other routine data sets would miss relevant
information (e.g., recent travel), and they would not provide a
real-time alert to ED personnel to implement infection control, diagnostic testing, and public health reporting. Therefore, data
mining alone does not replace intelligent tools at the point of service for agile surveillance and response. Ideally, future triage information systems could be modified rapidly to collect and analyze newly important information (e.g., travel) alongside other routinely collected data (e.g., chief complaints) as part of routine workflow. The right combinations of data
would automatically alert staff and public health authorities
of a potential case while data for ongoing syndromic surveillance
are collected with no additional human effort. Intelligent, programmable, and interoperable electronic medical record systems, linked through clinical networks such as REMI, could result in automated yet agile surveillance.

Milwaukee had used REMI previously to facilitate drop-in ED surveillance. The resulting experience and relationships helped MHD rapidly implement SARS surveillance. EDs in other cities appeared more likely to participate when they already used REMI in their day-to-day work (as was the case in 24 of the 27 participating EDs). Staff used the same application for surveillance that they used daily for other purposes, eliminating the need for new hardware and simplifying training. By contrast, public health agencies that were unfamiliar with the REMI application appeared more reluctant to participate.

Existing experience, servers, and 24-hour technical assistance capability that already supported the REMI system
were leveraged to support rapid, multiregional surveillance. The project demonstrated that remote CDC specialists could use aberration analysis on remote REMI data. Ideally, such data should be quality-checked locally before analysis.

Rapid dissemination and updating of the screening form was enabled by ACEP's membership e-mail list and Internet tools. Because SARS-SP anticipated rapid evolution of case definitions, clinicians were encouraged to subscribe for
updates. However, not surprisingly, busy clinicians often failed to return for updated forms after downloading the original form. Ideally, REMI-networked clinical information systems would automatically incorporate updates and eliminate
outdated tools from the point of service.

EDs in 12 urban areas expressed willingness to submit syndromic surveillance information to public health authorities, but only four health departments participated. The Council of State and Territorial Epidemiologists and the National Association of County and City Health Officials did not promote the project among their members because it lacked formal CDC endorsement. Such endorsement might be a precondition to participation, particularly in a fast-moving emergency
with competing time demands.

Although this was a successful proof of concept of multiregional REMI-enabled surveillance, it had limitations.
First, sensitivity and specificity of the triage screening and reporting cannot be calculated without SARS cases. Second, data were not validated by chart review. Third, ED records do not routinely record all information (e.g., travel) solicited. Finally, the system emphasized sensitivity over specificity.

With a sufficient proportion of EDs involved, a sharp or sustained increase in community incidence of febrile and
respiratory illness would likely be detected. Stamping and storing completed screening forms simplified rapid public health investigation. Because all four health departments reported being satisfied that they had participated in the surveillance project, a low positive predictive value for SARS appears to have been practically manageable. Surveillance did not exhaust the patience of either EDs or public health agencies in springtime, but the outcome might have been different had the incidence rate
of influenza and other common respiratory viruses been rising rather than falling.

Conclusion

SARS syndromic surveillance was rapidly established under emergency conditions by a loose network of
collaborators using the tools available. It was handicapped by the lack of a legal or practical framework for sharing surveillance information across jurisdictions, and resources did not allow rigorous evaluation of the system's performance. Nevertheless, the ability to share surveillance tools across communities in a rapidly evolving outbreak illustrates how networked tools (e.g., REMI),
which now reach >18% of the nation's EDs, have become practical instruments for agile surveillance across multiple regions. This
is enhanced when clinicians and public health agencies are familiar with the applications from regular use. State and
federal public health involvement might elicit participation by more agencies and could exploit untapped potential of
these applications, such as integrating data across multiple
regions and employing more sophisticated aberration
algorithms.

Acknowledgments

Lori Hutwagner, National Center for Infectious Diseases, CDC, provided advice and EARS analysis; American College of
Emergency Physicians publicized the screening tool and promoted participation in surveillance; Infinity HealthCare, Inc., developed and
deployed
new surveillance screens, accounts, and technical assistance without charge; and the staffs of participating hospitals and
health departments established local procedures to transmit and review surveillance data. Milwaukee Health Department staff were supported
in part by Lynde Uihlein through the Milwaukee Center for Emergency Public Health Preparedness, and by the Wisconsin Division
of Public Health/CDC Cooperative Agreement #U90/CCU517002-03-02.

* EMSystem is an Internet-served REMI that allows restricted viewing of Internet screens protected by standard Secure Sockets Layer (SSL) with 128-bit encryption, and can alert participants using text mail messages. EMSystem and similar networked REMI applications were developed to
improve situational awareness of emergency departments regarding ambulance diversions, mass casualty events, and other emergency medical services system changes. They have since been used for other functions including public health alerting, monitoring health-care utilization and readiness, and syndromic surveillance (3).

Use of trade names and commercial sources is for identification only and does not imply endorsement by the U.S. Department of Health and Human Services.
