
Surveying Graduate and Professional Students' Perspectives on
Library Services, Facilities and Collections at the University of Illinois at Urbana-Champaign:
Does Subject Discipline Continue to Influence Library Use?

Abstract

The University Library at the University of Illinois at Urbana-Champaign
(UIUC) Services Advisory Committee has established a program to create a regular
rotation of patron surveys. The program is an effort to answer basic questions
about attitudes towards the library's services, facilities and collections.
Modeled directly after those surveys designed and carried out at the University
of Washington Libraries, the UIUC plan calls for an annual rotation of three
surveys, each focused on a different user group. The first group surveyed
(spring 2004) consisted of graduate and professional students, to be followed
by undergraduate students (spring 2005) and faculty (spring 2006). Surveys
will be conducted annually thereafter so that each user group will be surveyed
every three years. Results from the first UIUC web survey, with 1,400 respondents,
revealed that graduate and professional students are very satisfied with their
library experiences, and that the pattern of responses by subject discipline corresponds
with the findings of Hiller (2002) at the University of Washington, but with
some predicted shifts. "How-tos" for the web survey are discussed
and included, along with a link to two sets of data analyses. In addition,
lessons learned from the survey experience are discussed and suggestions are
made for the presentation of data to successfully communicate survey results
to decision makers.

The original web questionnaire is found at:
http://g118.grainger.uiuc.edu/gradsurvey/
Survey results by UIUC department affiliation are found at:
http://g118.grainger.uiuc.edu/gradsurvey/analysis/
Survey results by academic discipline, grouped in divisions, are found at:
http://g118.grainger.uiuc.edu/gradsurvey/division/

Introduction

After long admiring the longitudinal patron survey program established
at the University of Washington Libraries (Hiller 2001; 2002), the Services
Advisory Committee at the University of Illinois at Urbana-Champaign initiated
a similar program in fall 2004. The purpose of the program is to establish
communication with our patrons, offering them a simple way to provide feedback
and ultimately to both influence decisions and provide direction concerning
the library's services, facilities and collections.

Graduate and professional students at UIUC number about 10,000 and account
for approximately 41% of all library book circulation, by far the group with
the highest checkout-per-person rate (Kruger 2001). UIUC's decentralized library
structure was established over 100 years ago to bring library service into
the departments. This is a research library model that includes service to
the upper division undergraduate student, but which is primarily geared to
graduate students and faculty. Because of the heavy use of library materials
and services in this user group, graduate and professional students were selected
to be the first group surveyed. A "survey of the whole," rather
than a sampling method, was selected. The survey was considered easier to conduct
via the web since all graduate and professional students could be contacted
with a single email distribution and it allowed us to easily market the survey
to a single group. It is acknowledged that web-based surveys often result
in lower response rates than printed/mailed surveys (Kwak & Radler 2002).
Because of this possibility, an examination of respondents and their representation
of the graduate and professional student population at UIUC was conducted
as part of this study.

The UIUC Library has conducted a number of other user surveys over the
past few years, including web surveys in 1998, 1999
and 2000 (Schmidt & Searing 1998; 1999; 2001) and LibQUAL+ surveys in
2001 and 2002. The earlier surveys, conducted 1998-2000, were anecdotal in
nature and open to anyone on campus. LibQUAL+ results, while interesting,
were not locally focused enough to provide specific answers about individual
departmental libraries. Although the LibQUAL+ data showed high satisfaction
levels with UIUC libraries, it is important to better define how users perceive
the library and to collect their opinions about library priorities for the
future. The new survey is designed to collect data related to the primary
and secondary library used by the survey respondents in order to provide specific
feedback to these units.

Previous Research

User surveys are widely used tools, often reported in the library literature,
as summarized by Hiller (2001; 2002). SPEC Kit 280 (Diamond 2004) reports
the results of a survey of ARL member libraries on the use of library user
surveys. That survey found that library user surveys vary widely in process,
scope, and method of execution, and that the use of web-based surveys is
increasing.

Hiller and Self (2002) described and compared a separate series of library
user surveys administered at the University of Virginia and the University
of Washington (UW). They concluded that user surveys are valuable tools, while
acknowledging their limitations, and stated that user survey results are most
useful when combined with other data. Hiller (2001) compared University of
Washington Libraries' surveys with results from UW participants in the ARL-sponsored
SERVQUAL (LibQual+) pilot. Hiller concluded that both LibQual+ and locally
administered surveys have value; however, local institutional issues are probably
more effectively addressed by locally administered surveys.

Hiller (2002) used the UW survey data to compare library use, priorities,
and information needs by academic discipline. Comparisons were made by broad
academic area (Health Sciences, Humanities-Social Sciences, Science-Engineering,
and Other), faculty vs. graduate students, and year of survey (1998 &
2001). Survey results revealed significant differences between academic disciplines.
Areas of similarity were also noted, as well as changing use patterns and
priorities. Hiller (2004) used the 2001 survey data to compare how geoscientists
and scientists in other fields (Chemistry, Mathematics, Physics, Psychology,
Zoology, and Engineering) find information and use libraries. He concluded
that, in 2001, the UW earth science faculty were more dependent on print resources
than other scientists, but predicted that, due to the addition of many online
resources in the earth sciences, the trend would be toward increased use of
electronic resources and decreased physical visits to the library.

Recent research conducted by the University of California Libraries (Schottlaender
et al. 2004) measured the use of print journals placed in storage that were
also available electronically. This study also reported results of a User
Preference Survey conducted among faculty, graduate and undergraduate students,
staff and health professionals by broad academic discipline, and concluded
that electronic journals are "popular, extensively used, and pervasive",
although to a lesser extent in the arts and humanities and among undergraduate
students. However, the print format remains important. Problems with electronic
journals include the unavailability of some older (and some recent) issues,
the omission of some types of content, usability and printing barriers,
and access problems.

Survey Expectations

Based on the long-standing perception that academic disciplines use library
materials differently and also on the results previously reported at the University
of Washington showing the research differences between academic fields (Hiller
2002), the authors expected UIUC survey results to mirror these findings and
show that research material use and library use continue to differ between,
for example, the sciences and the humanities. It was expected that humanities
disciplines would report more reliance on print-based materials, and therefore
higher in-library use, while science disciplines would report higher use of
online resources and remote library use.

Although differences were expected in how the disciplines use research
material, the authors also believed the data would begin to show a blurring
of these lines as more research material in every discipline becomes
available online and Internet use becomes a more integral part of every
discipline's research model. The widespread use of Internet search engines
may be influencing this generation of library users.

UIUC Methodology

The surveys previously conducted at the University of Washington (UW)
provided the UIUC Library Services
Advisory Committee with a series of sample surveys on which to base the questionnaire.
A subcommittee was formed to draft a survey; very early in this process the
decision was made to conduct a purely electronic, web-based survey. With the
UW's consent, it was also decided to replicate some of the University of Washington
Libraries' questions in order to compare the results from the two institutions
because UIUC and the UW libraries serve similar groups of users.

Because human subjects were involved, the University of Illinois Research
Board required an application and approval before the survey could be conducted.
Email notification to each graduate or professional student on the UIUC campus
was determined to be the best way to solicit participation. After submitting
our application and agreeing to the required security and privacy regulations,
approval to survey this group of 10,172 individuals was granted by the UIUC
Research Board. A graduate research assistant was then hired to program the
web survey and database to collect the responses.

Pre-testing the Survey

After completion of the first draft, the web survey was pre-tested with
a group of graduate student assistants working in the UIUC Grainger Engineering
Library. This group was selected for the pre-test due to their library experience
as students in the Graduate School of Library and Information Science and
because they met regularly as a group for reference meetings. This group was
expected to be more critical than the larger population since they had library
backgrounds and felt comfortable with the survey authors, who administer other
science library units. They were asked to make comments about the survey in
focus group sessions following their completion of the survey. The first "live"
test session resulted in numerous changes. Pre-testing revealed that the
survey was perceived as overly long, although it took only about 15 minutes
on average to complete. In addition, the group disliked questions that asked
them to rate multiple facets in a single answer, such as importance and
satisfaction. After changes were made based on the pre-test, the group was
asked to take the survey a second time a week later. Only a few small changes
followed the second pre-test; because questions from the first iteration had
been simplified or eliminated, the streamlined survey took an average of ten
minutes to complete.

Comments from the pre-tests were very helpful in creating a web-based
survey. Pre-test participants felt that a huge difference
exists between a paper survey and a web survey. Many felt a web survey was
inappropriate for questions that took long, thoughtful answers or that asked
for complex relationships to be explained (for example, satisfaction and importance).
A web survey, from the perspective of the pre-testing group, needs to be succinct,
clearly formatted, and timed to be completed in less than ten minutes. These
characteristics are not unlike those outlined in a Rand publication discovered
after most of our web survey was completed, but which has been very helpful
in documenting mistakes to avoid in the next survey (Schonlau et al. 2001).

Survey Costs

Total cost of the survey, not including librarian time, was $1,750.00.
Approximately 100 hours of graduate research assistant work went into programming,
database creation, and analysis, for a total of $1,300.00. Additional costs
were the campus email distribution charge of $50.00 and gift certificates to
the Illini Union Bookstore, which were awarded as incentives for participation.
A total of $400.00 was budgeted for incentives: two $100.00 gift certificates
and four $50.00 gift certificates. Participants were encouraged to enter a
drawing for the certificates as a thank-you for taking the survey. Future
surveys are expected to be less expensive since the initial web survey
has been created and will only be modified as changes are needed, although
campus fees for mass email distribution to undergraduate students are higher
($150.00 per mass mailing) due to the larger size of the population surveyed,
and a second email reminder may be sent during future surveys.

Approximately six librarians contributed time to the survey: five served
on the Library's Services Advisory Committee and one supervised computer/web
work. Librarian hours spent on the survey were estimated to total no more
than 20-30 hours. Because we relied heavily on previous work done at the University
of Washington Libraries, it was felt that there was no need to hire a consultant
or to conduct further background work. Our motto became "just do it!"

Confidentiality and Security

Like most research institutions, the University of Illinois is concerned
with providing security and protecting the confidentiality of its students.
Before gaining the permission of the UIUC Research Board to implement the
survey, the Library agreed to take deliberate steps to secure our web site
and to remove any links created between a student's survey responses and their
identity. In addition, a separate database was set up to receive the email
addresses of any respondent who wished to be considered for the prize drawing,
again to separate them from survey responses. Bluestem, a method for providing
reliable client identification for database applications that was already in
place on UIUC servers, was used to authenticate users taking the survey. It
also provided a level of security by admitting authorized users only. Users
who failed to complete the survey in a single session could re-enter it on
the survey's host server. This feature proved very useful when the first
day's onslaught of respondents brought down our initial server site: affected
users could return at another time to complete the survey. Once finished,
respondents could not enter the site again.
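The decoupling described above (answers keyed to an anonymous token, prize-drawing emails held in a separate store, resumption allowed only until completion) can be sketched as follows. This is a hypothetical illustration, not the Library's actual implementation: the table and function names are ours, and Bluestem authentication is assumed to happen upstream.

```python
import sqlite3
import secrets

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE responses (      -- answers keyed only by an anonymous token
    token     TEXT PRIMARY KEY,
    completed INTEGER DEFAULT 0,
    answers   TEXT
);
CREATE TABLE prize_entries (  -- drawing emails kept apart from answers
    email TEXT PRIMARY KEY
);
""")

def start_or_resume(token=None):
    """Issue a fresh anonymous token, or resume an unfinished session."""
    if token is None:
        token = secrets.token_hex(16)
        conn.execute("INSERT INTO responses (token) VALUES (?)", (token,))
        return token
    row = conn.execute(
        "SELECT completed FROM responses WHERE token = ?", (token,)).fetchone()
    if row is None or row[0]:
        raise PermissionError("finished surveys cannot be re-entered")
    return token

def finish(token, answers, email=None):
    """Record final answers; an optional prize entry is stored separately."""
    conn.execute(
        "UPDATE responses SET completed = 1, answers = ? WHERE token = ?",
        (answers, token))
    if email is not None:
        conn.execute(
            "INSERT OR IGNORE INTO prize_entries (email) VALUES (?)", (email,))
```

Because the two tables share no key, the prize entries can never be joined back to a respondent's answers, which is the property the Research Board required.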

Initial Problems

A few problems arose during the first few hours of the survey. The email
invitation to participate in the survey was distributed late Sunday evening
by automated campus delivery. Early Monday morning, as students read the notice,
they literally stopped the survey by crashing the server when 800 persons
attempted to log on at the same time. We clearly underestimated the number
of concurrent respondents, and this database failure resulted in some turn-aways.
On the survey's second day it was switched to a more robust SQL server. No
further problems were encountered, but this lesson is well-learned since our
next survey will include the undergraduate population at UIUC which is estimated
at about 26,000 students. It will be critical to either stagger the email
announcements about the survey or to test the database and server for hundreds
of simultaneous users before launching the survey.
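One way to implement the staggering suggested above is simply to partition the recipient list and hand one batch at a time to the campus mass-mail service. A minimal sketch; the batch size and addresses are our own assumptions, not campus policy:

```python
def stagger(recipients, batch_size):
    """Partition a recipient list into batches to be mailed at intervals,
    so announcement traffic arrives in waves rather than all at once."""
    return [recipients[i:i + batch_size]
            for i in range(0, len(recipients), batch_size)]

# e.g. ~26,000 undergraduate addresses in waves of 2,000 -> 13 mailings
addresses = [f"student{n}@example.edu" for n in range(26000)]
batches = stagger(addresses, 2000)
```

Spacing the mailings by even an hour would cap the number of first-day concurrent respondents at roughly one batch's worth, rather than the whole population.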

One surprising result of this server crash was the number of respondents
who took the time to tell us they could not complete the survey. Email addresses
of library personnel to contact with questions or problems were included throughout
the survey. Many respondents emailed to describe their problems and expressed
a willingness to continue the survey once the computer problems were solved.
A few individuals, mostly computer science majors, even offered opinions on
what went wrong with our design and how they would fix it. For the most part,
very thoughtful comments were received and demonstrated the goodwill the library
has among this constituency.

Presentation of Results

The successful presentation of survey results is as important as the data
collection itself. Since the survey data were collected to inform decision
makers, it was essential to provide the data in an extremely accessible and
approachable format. At UIUC, the departmental library structure allows for
decision making at many levels, including at the local departmental library
level. Since the data were collected, at minimum, to inform unit managers
about impressions and opinions concerning their libraries, the data needed
to be easily manipulated and viewed. This objective was achieved by programming
a web site hosting survey analysis data.

Some attrition took place during the survey. Each question analysis reports
the number of survey respondents for that question. The attrition can be
attributed either to the first day's system failure, after which some respondents
did not return to finish the survey, or to general drop-off due to busy schedules
or disinterest in the survey.

A popular feature of the results web site is the respondents' comment text boxes. All comments can be viewed by the library used most frequently or second-most frequently, but with some viewing experience it is clear that respondents use many libraries and their comments were not always focused on a single library. Reading text box comments from the survey is like eating jelly beans -- you just can't stop once you start. The comments are real and often pointed ("Chemistry library does a good job of getting me the materials I need to do my research. However, it could use a facelift. It is very unpleasant to study there and I have gotten splinters from some of the tables"), but just as often laudatory about the importance of libraries ("UIUC would be nothing without its library.").

While conducting analyses for this paper, the data were grouped by division. "Sub-question" results were created, and these too have been made readily available to all web site viewers. Questions marked "second analysis" are those which needed further tweaking in order to compare with UW data. It is possible to continue to mine the data in order to tease out needed responses. The results are many-layered and as simple or complex as desired. The primary goal was to provide data that fit the needs of many viewers in order to facilitate use of the data. A secondary goal was to ensure the data presentation was neither unwieldy nor overly complex, so as not to discourage use.

Results and Discussion

Survey Respondents

A total of 1,400 responses were received during the two weeks the survey was
advertised and the web link was open. An announcement about the survey and an
invitation to participate was sent from University Librarian Paula Kaufman via
email to 10,172 graduate and professional students on the UIUC campus. While
1,400 respondents represent only a 14% response rate, this number is larger
than the total population responding to the University of Washington Libraries'
most recent surveys (which were print surveys mailed to sample groups, not
web-based, whole-population surveys), which found 563 graduate students responding
in 2001 (a 40.4% sample return rate) and 457 graduate students responding in
1998 (a 45.7% sample return rate) (Hiller 2002). There are approximately 10,000
graduate and professional students at the University of Washington (Hiller
2002). In hindsight, a longer survey period with more aggressive marketing
might have collected a higher response rate at UIUC. For a first-time effort,
1400 replies provide a good baseline from which to build.

A literature review by Kwak and Radler (2002) indicates that the response
rates of electronic surveys have been lower than for traditional mail surveys,
"with the advantage of mail survey over email or web surveys ranging
from 8% to 37.2%." The response rates for the first wave of their comparison
study were 24.2% for the mailed survey and 18.1% for the web survey. Schonlau
et al. (2004) discuss response rates for web and mail surveys. In Chapter
3, their overview noted that "response rates range from 7 to 44 percent
for Web surveys and from 6 to 68 percent for e-mail surveys" (Schonlau,
p. 20). In addition, direct comparison studies between postal mail surveys
and email surveys show that, "in most studies, the mail response rate
was higher by as much as 21 percent" (Schonlau, p. 21). Libraries conducting
web-based or email surveys might expect to have lower response rates than
traditional, postal mail surveys, although the number of respondents might
be higher than a postal mail, sample survey.
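The response-rate arithmetic above can be checked directly, using only figures reported in this paper:

```python
# UIUC web survey: 1,400 responses from 10,172 invited students
respondents, invited = 1400, 10172
rate = 100 * respondents / invited  # about 13.8%, reported as "a 14% response rate"

# UW's 2001 mailed sample survey had a higher rate (40.4%) but fewer respondents
uw_2001_respondents = 563
```

This illustrates the trade-off the paragraph describes: a whole-population web survey yields a lower rate but a larger absolute pool of respondents than a sampled mail survey.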

Figure 1 shows the respondents' primary department and degree program by broad discipline. The UIUC departmental libraries number over 40 and are primarily geared to subject disciplines. The departmental libraries are grouped by "division," and this library structure was used to determine how respondents' departments would be categorized. For example, persons replying that their primary department was related to chemistry, math, engineering, physics or geology are considered to belong to the physical sciences or engineering category at UIUC, since these libraries comprise the Physical Sciences and Engineering Division. Because the number of respondents from Area Studies was low, this division could not be statistically considered in the divisional analysis; these departmental affiliations were moved into the Social Sciences Division. Full disclosure showing which departments were included in each division or broad discipline is found at http://g118.grainger.uiuc.edu/gradsurvey/division/DEPARTMENTS.doc. The UIUC Library also has a Central Public Services Division, a Special Collections Division and a Technical Services Division, which were not included in this analysis because they do not have a subject focus.

Because it was important to determine how those responding to the survey
by discipline corresponded to the total populations of these areas, Figure
2 was created to show the departmental distribution of graduate and professional
students at UIUC for fiscal year 2004 by division. These departmental enrollment
numbers were grouped by division using the same criteria as for Figure 1.
The respondents are shown to be reasonably representative of their populations.

Figure 2. Distribution of graduate and professional students at
UIUC 2003-2004 by library division.

While the data are able to provide representative results by broad (divisional)
discipline, there were often not enough respondents choosing a specific primary
departmental library to make a reliable analysis of the data at that level.
Even so, the anecdotal opinions and written comments about departmental
libraries have proved interesting and worthwhile.

Five of the thirteen questions allowed respondents to provide written comments, and those five questions generated 1,336 replies. A few participants emailed complaints about the survey, unhappy that the web comment boxes limited responses to 255 characters; some respondents had much more to tell us. With many more written comments than expected, these short text messages about library services and collections are fascinating to read. In general, comments are positive and provide a window on graduate students' perspectives of the library. All comments can be read online, grouped by question (Questions 4, 9, 10, 11 and 13), at http://g118.grainger.uiuc.edu/gradsurvey/division/.

The Findings

Overall, graduate and professional students are highly satisfied with
UIUC library services and collections, as shown in Table 1. Satisfaction with
library services is highest among Physical Sciences and Engineering graduate
students, while Arts and Humanities students report slightly higher satisfaction
with library collections than other disciplines. It is also important to note that
nearly all respondents were library users. Question 2 asked, "Have you
used the University Library's services or collections (either in-person
or remotely) during this academic year?" Only 2.4% of respondents answered
no, with 97.6% answering positively. While the sample may be self-selected
since this was advertised as a library survey, clearly library use is important
among graduate and professional students.

Table 1. Number and percentage of graduate and professional students choosing
4 or 5 on a scale of 1-5, with 5 being "very satisfied" and 1 being
"not satisfied," when asked, "How satisfied are you with
the University libraries?"*

|                            | Arts & Humanities | Life Sciences | Physical Sciences & Engineering | Social Sciences | Unspecified | ALL DIVISIONS |
| Library services           | 118 (90.1%) | 196 (88.7%) | 441 (93.4%) | 391 (87.9%) | 6 (66.7%) | 1,152 (90.1%) |
| Library collections        | 122 (93.1%) | 194 (87.8%) | 432 (91.5%) | 395 (88.8%) | 6 (66.7%) | 1,149 (89.9%) |
| Overall Satisfaction Level | 122 (93.1%) | 197 (89.1%) | 442 (93.6%) | 397 (89.2%) | 6 (66.7%) | 1,164 (91.1%) |
| Total Divisional Responses | 131 (10.3%) | 221 (17.3%) | 472 (36.9%) | 445 (34.8%) | 9 (0.7%) | 1,278 (100%) |

*Note: Although 1,400 persons started the survey, by question 13, shown
in Table 1, only 1,278 persons responded. Attrition was evident throughout
the survey and the results web pages reflect the loss of participants with
each successive question.
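Each percentage in Table 1 appears to be the count of respondents choosing 4 or 5 divided by that division's total responses (bottom row). A brief arithmetic check, using only figures from the table; the helper name is ours:

```python
def pct(count, total):
    """Share of a division's respondents choosing 4 or 5, to one decimal."""
    return round(100 * count / total, 1)

# Arts & Humanities "Library services" cell: 118 of 131 divisional responses
services_ah = pct(118, 131)
# ALL DIVISIONS "Library services" cell: 1,152 of 1,278 respondents
services_all = pct(1152, 1278)
```

Both work out to 90.1%, matching the cells as printed.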

Table 2 shows the types of library use conducted at least weekly in UIUC
libraries by graduate and professional students, by discipline. Science graduate
students (life sciences, physical sciences, and engineering) visit the libraries
in person least often as a percentage by broad subject discipline, but report
higher use of libraries through office computers, most likely due to having access to
online serials, their primary source of research material. Arts and Humanities
graduate students report using libraries in person at a higher percentage
than other disciplines, demonstrating their continuing need for access to
print materials or other library services, including library computers and
study space. Humanists also report having the highest percentage of library
use from home computers. It may be that they are the group least likely to
have an office computer available to them. Most users are reporting more frequent
use of the library remotely rather than in person on a weekly or more frequent
basis.

Table 2. Number and percentage of respondents by division reporting type
of library use conducted at least weekly.

|                                 | Arts & Humanities | Life Sciences | Physical Sciences & Engineering | Social Sciences | Unspecified | ALL DIVISIONS |
| Visit in person                 | 99 (69.7%) | 92 (38.7%) | 196 (39.0%) | 242 (49.4%) | 3 (30.0%) | 632 (45.7%) |
| Use office computer             | 77 (54.2%) | 174 (73.1%) | 365 (72.7%) | 280 (57.1%) | 4 (40.0%) | 900 (65.1%) |
| Use home computer               | 96 (67.6%) | 111 (46.6%) | 184 (36.7%) | 315 (64.3%) | 4 (40.0%) | 710 (51.4%) |
| Communicate via email/web/phone | 28 (19.7%) | 27 (11.3%) | 49 (9.8%) | 74 (15.1%) | 1 (10.0%) | 179 (13.0%) |
| Use on behalf of someone else   | 8 (5.6%) | 7 (2.9%) | 2 (0.4%) | 43 (8.8%) | 0 (0.0%) | 60 (4.3%) |
| Total Divisional Responses      | 142 (10.3%) | 238 (17.2%) | 502 (36.3%) | 490 (35.5%) | 10 (0.7%) | 1,382 (100%) |

*Note: Although 1,400 persons started the survey, by question 3b, shown
in Table 2, only 1,382 persons responded. Attrition was evident throughout
the survey and the web pages reflect the loss of participants with each successive
question.

The data in Table 3 show what graduate students report they do while visiting
the libraries in person. The question reported in Table 3 is, "Why do
you visit the University libraries in person? Mark how often you do the activities
listed below in the primary library you use." A marked difference in
both types of materials used and frequency of use is demonstrated in this
table. For example, 58.3% of Arts and Humanities students report looking for
books in the library at least weekly, while only 24.7% of Physical Science
and Engineering students perform this activity at this rate. These data reflect
the nearly classic depiction of the disciplines that says humanists read books
and scientists read articles. Today this might be restated that humanists
read books in print and scientists read articles online. While stereotypical,
this assumption garners some support from the data in Table 3: in terms of
percentages, scientists look for books at least weekly in the library at less
than half the rate of humanists. Numbers of users (rather than percentages)
offer another useful way to view the data. Because there are more science graduate
students on the UIUC campus, this group visits libraries to look for books
in larger numbers than humanists do.

The most often reported reason for visiting libraries in person in three
of the four broad disciplines is "individual study/work." This
is the second most common reason among humanists, who reported "look
for a book (print)" as their top in-library activity. These data support
the "libraries as place" phenomenon and demonstrate the continued
need for libraries as meeting, study, and discussion places, a separate identity
from the need for library collections and services.

Table 3. Type of library use while visiting the library by number and percentage
of graduate and professional students (by discipline) who responded that these
activities were conducted weekly or more frequently.

|                                  | Arts & Humanities | Life Sciences | Physical Sciences & Engineering | Social Sciences | Unspecified | ALL DIVISIONS |
| Look for an article (print)      | 45 (32.4%) | 62 (27.1%) | 117 (23.9%) | 144 (30.4%) | 0 (0.0%) | 368 (27.5%) |
| Look for an article (electronic) | 31 (22.3%) | 59 (25.8%) | 101 (20.7%) | 116 (24.5%) | 1 (11.1%) | 308 (23.0%) |
| Look for a book (print)          | 81 (58.3%) | 57 (24.9%) | 121 (24.7%) | 167 (35.3%) | 1 (11.1%) | 427 (31.9%) |
| Look for a book (electronic)     | 26 (18.7%) | 26 (11.4%) | 42 (8.6%) | 67 (14.2%) | 0 (0.0%) | 161 (12.0%) |
| Look for other material (e.g. maps, microforms, photos) | 22 (15.8%) | 3 (1.3%) | 5 (1.0%) | 16 (3.4%) | 0 (0.0%) | 46 (3.4%) |
| Review newly arrived items       | 10 (7.2%) | 12 (5.2%) | 23 (4.7%) | 30 (6.3%) | 0 (0.0%) | 75 (5.6%) |
| Consult library staff            | 17 (12.2%) | 7 (3.1%) | 13 (2.7%) | 46 (9.7%) | 0 (0.0%) | 83 (6.2%) |
| Photocopy                        | 50 (36.0%) | 36 (15.7%) | 52 (10.6%) | 99 (20.9%) | 0 (0.0%) | 237 (17.7%) |
| Use library computers            | 52 (37.4%) | 52 (22.7%) | 70 (14.3%) | 139 (29.4%) | 2 (22.2%) | 315 (23.5%) |
| Individual study/work            | 55 (39.6%) | 70 (30.6%) | 126 (25.8%) | 187 (39.5%) | 3 (33.3%) | 441 (32.9%) |
| Group study/work                 | 15 (10.8%) | 36 (15.7%) | 75 (15.3%) | 70 (14.8%) | 2 (22.2%) | 198 (14.8%) |
| Browse print journals            | 33 (23.7%) | 27 (11.8%) | 58 (11.9%) | 86 (18.2%) | 0 (0.0%) | 204 (15.2%) |
| Browse the shelves for books     | 39 (28.1%) | 19 (8.3%) | 58 (11.9%) | 85 (18.0%) | 1 (11.1%) | 202 (15.1%) |
| Total Divisional Responses       | 476 | 466 | 861 | 1,252 | 10 | 3,065 |

*Note: Although 1,400 persons started the survey, by question 6, shown in
Table 3, only 1,339 persons responded. Attrition was evident throughout the
survey and the web pages reflect the loss of participants with each successive
question.

Table 4 shows the library activities respondents reported doing remotely.
Respondents reported 2.97 uses per person for this question, compared to
2.28 uses per person for the "in-house use" question (Table 3 total
responses = 3,065, respondents = 1,339; Table 4 total responses = 3,927, respondents
= 1,320), even though more selections were possible for in-house use.
This possibly indicates a greater amount of remote use in general. Of Arts
and Humanities graduate students, 58.1% reported using electronic article
indexes at least weekly, very close to the 61.9% of Physical Science and Engineering
graduate students who reported using electronically available indexes at least
weekly. This can be attributed to the comparable number of online databases
now available in these fields and the fact that, due to low use and budget
constraints, many print indexes have been cancelled in recent years --
online may now be the only way to use many subject-specific indexes. Use of
online journals remains higher in the sciences, most likely due to the larger
number of journals in that format available to science graduate students.
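The per-person averages quoted above come straight from dividing each table's total responses by its number of respondents. A small, purely illustrative check in Python (the survey itself involved no such code):

```python
# Recompute the per-person activity averages from the totals in Tables 3 and 4.
in_house_responses, in_house_respondents = 3065, 1339   # Table 3 (in-house use)
remote_responses, remote_respondents = 3927, 1320       # Table 4 (remote use)

in_house_avg = in_house_responses / in_house_respondents
remote_avg = remote_responses / remote_respondents

print(f"In-house: {in_house_avg:.2f} activities per respondent")  # 2.29
print(f"Remote:   {remote_avg:.2f} activities per respondent")    # 2.97
```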

It is important to note that nearly 70% of all graduate and professional
students report searching the UIUC online catalog at least weekly (Table 4),
while only approximately 32% report looking for a print book in the library
at least weekly (Table 3). Our "front door" for this constituency
is the Library Gateway or unit web pages, not physical libraries. This is
critical information when designing any type of service or choosing collections
to support this group of users. It will be interesting to see whether, in
upcoming surveys, undergraduates and faculty also report greater online use
of libraries compared to on-site visits.

Table 4. Type of remote library use by number and percentage of graduate
and professional students by discipline who responded that these activities
were conducted "weekly or more often than weekly."

Activity | Arts & Humanities | Life Sciences | Physical Sciences & Engineering | Social Sciences | Unspecified | ALL DIVISIONS
Search the UIUC Online Library Catalog | 112 (82.4%) | 156 (68.7%) | 332 (68.5%) | 315 (68.0%) | 4 (44.4%) | 919 (69.6%)
Search library-provided electronic article indexes | 79 (58.1%) | 162 (71.4%) | 300 (61.9%) | 282 (60.9%) | 2 (22.2%) | 825 (62.5%)
Look for full-text journal articles | 64 (47.1%) | 182 (80.2%) | 334 (68.9%) | 296 (63.9%) | 1 (11.1%) | 877 (66.4%)
Look for other full-text (e.g. reserves, reference works) | 49 (36.0%) | 93 (41.0%) | 172 (35.5%) | 247 (53.3%) | 2 (22.2%) | 563 (42.7%)
Read an electronic book | 5 (3.7%) | 12 (5.3%) | 35 (7.2%) | 37 (8.0%) | 1 (11.1%) | 90 (6.8%)
… | 2 (1.5%) | 8 (3.5%) | 30 (6.2%) | 32 (6.9%) | 1 (11.1%) | 73 (5.5%)
Look for images (e.g. photographs) | 10 (7.4%) | 15 (6.6%) | 14 (2.9%) | 22 (4.8%) | 1 (11.1%) | 62 (4.7%)
Renew or request materials | 53 (39.0%) | 37 (16.3%) | 94 (19.4%) | 126 (27.2%) | 2 (22.2%) | 312 (23.6%)
Look for information about the Libraries (e.g. hours) | 29 (21.3%) | 20 (8.8%) | 54 (11.1%) | 67 (14.5%) | 0 (0.0%) | 170 (12.9%)
Contact "Ask a Librarian" (the email/chat question service) | 1 (0.7%) | 5 (2.2%) | 11 (2.3%) | 19 (4.1%) | 0 (0.0%) | 36 (2.7%)
TOTAL NUMBER OF RESPONSES | 404 | 690 | 1,376 | 1,443 | 14 | 3,927

*Note: Although 1,400 persons started the survey, by question 7, shown
in Table 4, only 1,320 persons responded. Attrition was evident throughout
the survey and the web pages reflect the loss of participants with each successive
question.

It was also expected that differences in how the disciplines rank the importance
of research material types (books, print journals, or electronic journals)
would be less pronounced in 2004 than those previously reported by Hiller (2002).
The authors planned to compare the UW's 2004 survey data to UIUC's 2004 data:
both surveys were to cover graduate and professional students only and to
include a question concerning the perceived importance of print journals,
electronic journals, and books. At UIUC, question 9 asked, "Please indicate
the importance of the information resources listed below, based on the primary
library you use." Since the populations of the two institutions are considered
to be similar, the 2004 responses to this question were expected to be similar.
A problem arose when attempting the comparison, since the UW survey did not
repeat this specific question in 2004. Although this lack of comparable data
prohibits definitive conclusions, the comparisons created in Table 5 are
nevertheless noteworthy. In Table 5, a row showing the percent range of
difference between the disciplines' rankings was calculated for each year
and material type. The data show a decrease in the differences between the
disciplines from 1998 to 2004, indicating growing agreement about the
importance of these material types.

For example, in 1998, 27.5% of UW graduate students in the humanities and
social sciences ranked electronic journals "very important," compared with
52.3% of health sciences students and 47.3% of physical science and engineering
students. The range of differences between these rankings ran from 10.6%
(between Life & Health Sciences and Science & Engineering) to 90.2% (between
Life & Health Sciences and Humanities/Social Sciences). In 2001, the range
of differences grew smaller (7% - 56%), meaning the disciplines' views had
become more alike. In 2004, at UIUC, the gap is smallest, at 5.3% - 20.4%.

The data from Table 5 are graphically displayed in Figure 3 and show some
interesting trends. Most obvious are the parallel trends in the perceived
importance of print journals (trending downward) and electronic journals
(trending upward). The disciplines appear to be converging in their perceptions
of the importance of print and electronic journals, and they are moving in
the same directions.

Although the range of difference apparently is shrinking, and is smallest
at UIUC in 2004, the percentage of graduate students at UIUC ranking books
as "very important" is higher than at UW. This could be due to something as
simple as the definition of "books" on each campus, which could include not
only monographs but also book series, dissertations, and conference
proceedings, or to the relative size of the monograph collections at each
institution. Although the adage links humanists with books and scientists
with journals, at UIUC in 2004 these disciplines in fact held very similar
opinions concerning the importance of books as research materials. Further
research is needed to determine why books are ranked higher at UIUC than
at the UW.

Table 5. Data showing the importance of different material types from the UW Libraries' surveys of 1998 and 2001 (Hiller (2002), Table 7), shown with data from the 2004 UIUC study ({http://g118.grainger.uiuc.edu/gradsurvey/division/}). Percentages are based on the respondents from each subject area selecting print journals, electronic journals, or books as "very important" for their research (selecting 5 on a scale where 1 is "not important" and 5 is "very important"). The "% Difference Range" row shows the differences between the disciplines' rankings of the importance of these types of materials.

PRINT JOURNALS | 1998 (UW) | 2001 (UW) | 2004 (UIUC)
Humanities/Social Sciences | 73.5% | 67% | 58.2%
Life & Health Sciences | 88.3% | 73.1% | 58.9%
Science & Engineering | 88.4% | 68.8% | 51.2%
% Difference Range | 0.11% - 20.3% | 2.7% - 9.1% | 1.2% - 15%

ELECTRONIC JOURNALS | 1998 (UW) | 2001 (UW) | 2004 (UIUC)
Humanities/Social Sciences | 27.5% | 45.2% | 74.9%
Life & Health Sciences | 52.3% | 70.5% | 90.2%
Science & Engineering | 47.3% | 65.9% | 85.7%
% Difference Range | 10.6% - 90.2% | 7% - 56% | 5.3% - 20.4%

BOOKS | 1998 (UW) | 2001 (UW) | 2004 (UIUC)
Humanities/Social Sciences | 68% | 60.4% | 65.9%
Life & Health Sciences | 31.3% | 30.8% | 45.5%
Science & Engineering | 48.1% | 36.4% | 61%
% Difference Range | 53.7% - 117.3% | 18.2% - 96.1% | 8% - 44.8%

% Difference was calculated for the largest and smallest pairwise differences
using (Q1-Q2)/Q2*100, where Q1 is the larger and Q2 the smaller percentage.
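The range calculation behind Table 5 can be sketched as follows; this Python snippet is purely illustrative (the helper `pct_difference` is our name, not the authors'), using the 1998 UW electronic-journal percentages from the worked example above.

```python
from itertools import combinations

def pct_difference(q1: float, q2: float) -> float:
    """Percent difference between two rankings, (Q1-Q2)/Q2*100,
    taking Q1 as the larger and Q2 as the smaller value."""
    q1, q2 = max(q1, q2), min(q1, q2)
    return (q1 - q2) / q2 * 100

# 1998 UW: share ranking electronic journals "very important," by discipline
rankings = {
    "Humanities/Social Sciences": 27.5,
    "Life & Health Sciences": 52.3,
    "Science & Engineering": 47.3,
}

# All pairwise differences; the reported range is the smallest and largest.
diffs = [pct_difference(a, b) for a, b in combinations(rankings.values(), 2)]
print(f"{min(diffs):.1f}% - {max(diffs):.1f}%")  # 10.6% - 90.2%
```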

Figure 3. Line charts showing data from Table 5

It is not surprising that life and health science fields rank the importance
of print journals slightly above other disciplines, possibly due to the color
plates and detailed diagrams this field depends on. Science and engineering
graduate students at UIUC in 2004 show the lowest percentage of users who
consider print journals to be "very important." This is most likely
due to the fact that the sciences have access to more electronic journals
than non-science fields, and because electronic journal backfiles have been
produced and made available first in science fields, greatly reducing the
need for print journals. Another possibility is that these researchers simply
ignore relevant literature when it is not available online, a trend noted
recently by Blowers and Williams (2004), whose research shows that "online
journals show rapidly increasing citation rates while journals available
only in print are leveling off."

Although not conclusive due to the use of data from two different institutions,
the data in Table 5 indicate that the disciplines are becoming more alike
in their opinions concerning the importance of books, print journals and electronic
journals.

Library Priorities from the Survey

One of the primary reasons to conduct the survey was to develop a sense of user expectations. A "priorities" question was designed, asking, "Please select the items you think the University Library should have as priorities during the next three years." A list of 19 services or collection-related items was given to choose from, and respondents could choose as many priorities as they wanted; an optional comment box was included. Versions of this question using rankings or "what not to do" choices were eliminated during the pre-test as too negative and cumbersome, and restrictions, such as choosing five top priorities, were discarded as too limiting. The results for this question, shown in Table 6, speak volumes about how our users wish us to proceed. With a nearly unified voice, UIUC graduate students from every discipline wish for more electronic resources. The single small departure from this theme is the humanists' top priority, "Maintain the quality of University Library print collections," an option that garnered only three more votes than their second priority (and everyone else's first priority), "Provide electronic access to current articles." When ranked by total votes, as shown in Table 6, the top five priorities all relate to increasing electronic access, including shifting resources from print to electronic.

The priorities question also resulted in 151 comments (select "View Comments" at {http://g118.grainger.uiuc.edu/gradsurvey/division/} to view these in their entirety). Our clientele provided us with substantial reading and much food for thought. Although the respondent's primary and secondary libraries are listed with their written comments, it is interesting to note that most graduate and professional students use numerous libraries and comment on many of them, not only on the primary library they use.

Table 6. Number and percentage of the top (overall) six responses to the question, "Please select the items you think the University Library should have as priorities during the next three years." The question's entire results can be viewed at {http://g118.grainger.uiuc.edu/gradsurvey/division/}. Total n=1,281.

Priority | Arts & Humanities | Life Sciences | Physical Sciences & Engineering | Social Sciences | Unspecified | TOTAL
Provide electronic access to current articles | 91 (69.5%) | 182 (82.4%) | 386 (81.4%) | 343 (76.9%) | 6 (66.7%) | 1,008 (78.7%)
Provide electronic access to older articles | 79 (60.3%) | 183 (82.8%) | 377 (79.5%) | 299 (67.0%) | 5 (55.6%) | 943 (73.6%)
Deliver full-text documents to your computer | 60 (45.8%) | 171 (77.4%) | 347 (73.2%) | 304 (68.2%) | 5 (55.6%) | 887 (69.2%)
Provide electronic access to books and reference tools | 67 (51.1%) | 132 (59.7%) | 280 (59.1%) | 242 (54.3%) | 5 (55.6%) | 726 (56.7%)
Shift funds from print to electronic access to journals | 37 (28.2%) | 131 (59.3%) | 271 (57.2%) | 187 (41.9%) | 1 (11.1%) | 627 (48.9%)
Maintain the quality of University Library print collections | 94 (71.8%) | 99 (44.8%) | 214 (45.1%) | 209 (46.9%) | 3 (33.3%) | 619 (48.3%)

*Note: Although 1,400 persons started the survey, by question 11, shown
in Table 6, only 1,281 persons responded. Attrition was evident throughout
the survey and the web pages reflect the loss of participants with each
successive question.
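The TOTAL column in Table 6 can be cross-checked against the 1,281 respondents to this question; a brief, illustrative verification:

```python
# Each TOTAL percentage in Table 6 should equal votes / respondents (n = 1,281).
n = 1281
totals = {  # priority: total votes, from Table 6
    "Provide electronic access to current articles": 1008,
    "Provide electronic access to older articles": 943,
    "Deliver full-text documents to your computer": 887,
    "Provide electronic access to books and reference tools": 726,
    "Shift funds from print to electronic access to journals": 627,
    "Maintain the quality of University Library print collections": 619,
}
for priority, votes in totals.items():
    print(f"{priority}: {votes / n:.1%}")
```

Running this reproduces the published percentages (78.7%, 73.6%, 69.2%, 56.7%, 48.9%, 48.3%), confirming the table's internal consistency.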

Tips (or, What We Learned)

A great deal was learned from the survey, both about the survey process
itself and about library users. Although other libraries will learn more by
conducting their own surveys, some important points surfaced that might help
along the way or persuade libraries to try a web-based user survey.

· Plan ahead, at least a year in advance. Our survey took approximately
12 months from initial planning to recommendations based on the survey results.
The survey was planned and designed in fall and winter and was initiated during
the two weeks preceding spring break. Survey analysis took place in the summer
and by the following fall, recommendations were made based on the survey's
results. In other words, this is not a quick process, but it is a worthwhile
one.

· Check with your library administration and colleagues to see if
the results are likely to be considered as a basis to create change. Surveys
can yield very valuable information about user expectations and opinions;
however, it is not an effective use of time to conduct a study only to realize
that changes are not on the table for discussion. On the other hand, it may
be necessary to just plow ahead the first year and realize that comparative
data over a few years will make changes happen. As one respondent wrote when
asked about library priorities for the future, "Please don't ask for
this information if you don't plan to do anything about it. I don't really
think there will be any change in priorities, and that is sad." For
users to trust the library, survey results must be reported and acknowledged
with a response.

· Keep the survey design and implementation team small, no more than
3-5 members, with consultation outside the group as needed. Large committees
are generally not successful at creating a focused, tight-knit survey.

· If you are not directly involved in the survey, be supportive and
trust your colleagues. Review the results and consider how they might apply
to your situation or position in the library. Get involved if you want to
see the next survey go in a new direction or ask specific new questions.
People who design and implement surveys are usually very open to input,
since gathering feedback is the main goal of a survey.

· Share the results as widely as possible. Our web site of results
was designed to provide maximum flexibility and to show the data in a variety
of ways. In addition, the full database of results was made available on a
shared drive as an Excel spreadsheet, so anyone in the library could
manipulate it for whatever purpose they wished.

· Provide ample text space for comments on your web survey, even
if you have only one or two questions (but first estimate how much reading
and analysis time you have). The prose responses really resonate and provide
excellent feedback -- often off the topic at hand, but interesting. In our
study, themes developed through the comment boxes, and it was advantageous
to be able to track them by the departmental libraries used.

· Web surveys and print surveys are very different and have different
characteristics; a print survey does not transfer directly to the web. A good
source of information about all aspects of web surveys is Schonlau, et al.
(2001).

· After setting up our surveys to be conducted every year, with a
different type of user each year, we are now considering a different approach.
At the UW, where all three groups are surveyed every three years, data are
comparable among groups since they were collected simultaneously and subject
to the same conditions. This is probably the better approach, if staffing
time and costs allow for it. Questions might need to be changed over time to
assess responses to new services or to follow up on new trends, while still
keeping some questions the same for longitudinal analysis.

· PRETEST! It is an easy step to skip, but missing the chance to
dry-run your survey is a critical mistake. Students love to share their
opinions and like to be in on new things; if you ask, you will get results.
Pretesting is just the first stage of asking.

Conclusions

While tables and figures graphically show us which libraries are used
most frequently or what's on the minds of graduate and professional
students, there are other ramifications to this survey. Data from this UIUC
survey combined with previous research at the University of Washington (Hiller
2002) indicate that, although some differences remain between disciplines
in terms of how libraries are used and what type of research material is needed,
the gap is closing. Stereotypes are changing and with them library models
for research.

In order to keep pace with these changes, it is critical to continue to
ask users what they do in libraries, how they use library materials, and their
perceptions about their library needs in the future. It is even more critical
for libraries to respond and make changes based on those findings. The UIUC
survey results do demonstrate that an appreciation of libraries is shared
among our graduate student constituents, who have very positive feelings concerning
library collections and services. It also appears that, although there was
a small constituency calling for a continued focus on print collections, which
our libraries have no intention of abandoning, the overwhelming majority of
users have demonstrated a substantial interest in electronic resources. In
order to maintain the excellent relationship we have with this clientele,
we must be willing to change with them and continue to support their needs.

The challenge to libraries is how to move our collections from print to
electronic to meet the needs of our users. These changes involve not just
a format change, but modifications to budgeting, cataloging and access,
staffing, facilities, equipment -- nearly every aspect of what we do every
day. Change is hard, and cultural change is very hard, but having survey
results that so clearly point the way goes far toward confirming that we
are on the right path.

Hiller, S. 2002. How different are they? A comparison by academic area of
library use, priorities, and information needs at the University of Washington.
Issues in Science and Technology Librarianship 33. [Online].
Available: http://www.istl.org/02-winter/article1.html [January 13, 2005].

Hiller, S. & Self, J. 2002. A decade of user surveys: utilizing a standard
assessment tool to measure library performance at the University of Virginia
and the University of Washington. In: Proceedings of the 4th Northumbria
International Conference on Performance Measurement in Libraries and
Information Services, August 2001, Pittsburgh, Pennsylvania. Washington, DC:
Association of Research Libraries, pp. 253-261. [Online]. Available:
http://www.libqual.org/documents/admin/hiller1.pdf [January 13, 2005].

Acknowledgements

The authors gratefully acknowledge the consultation and data provided by Steve
Hiller, University of Washington Libraries. In addition, our thanks for the
invaluable assistance provided by Graduate Research Assistant Tim Donohue
who created the programming for the web survey and collated the final data
analysis. Thanks also to Bob Burger and Lisa Hinchliffe, UIUC Services Advisory
Committee, for their reviews of this paper.