Conducting Surveys

So you want to survey the Ohio State community?

Besides the Office of Institutional Research & Planning, many individuals and groups on and off campus conduct surveys of the Ohio State community. The number of invitations to participate in online surveys has increased dramatically, to the point where survey fatigue has become a problem. To see the number of ongoing surveys within the Ohio State community, please visit the survey calendar.

If you are interested in surveying the Ohio State community, please think about the following:

The data we collect are archived and can be used to evaluate a wide variety of questions relevant to campus life and student development. IRP can help you consider ways in which the available data can address your questions. For more information about this, please contact IRP.

You may also want to consider alternative methodologies for collecting relevant data such as focus groups and face-to-face interviews. These approaches have important advantages and may be a better choice than a survey if you are exploring a topic about which relatively little is known (in which case, it would be difficult to develop appropriate survey questions) and/or you would like to elicit candid, in-depth information.

It is easy to type survey questions into web survey software, and bulk emails can be sent with the click of a button. That does not mean, however, that a quality survey can be turned out quickly. Give yourself ample time (four to six months is not unreasonable) to develop a questionnaire that makes sense and is not overly burdensome on the respondent in terms of survey length and difficulty.

Here is a reasonable example of a timeline for an online survey:

Step                                                          Time (at least)
Define constructs/factors to be analyzed                      2 weeks
Develop questions, instrument                                 1 week
Identify sample, obtain sample files                          2 weeks
Submit instrument for review (IRB, SCC, or other reviewers)   1 week
Pretest the instrument and make changes                       1 week
Instrument in the field                                       2 weeks
Collect and clean data                                        1 week
Reporting                                                     4 weeks
Total (at least)                                              14 weeks

Survey instruments should be pre-tested on a small number of subjects from the population of interest and revised on the basis of their feedback. In addition, you will need to allow time for your project to be reviewed by various constituents within the university, for the printing or programming of your instrument, and for the actual dissemination of the survey to its intended respondents. A survey that meets a tight deadline by cutting corners may miss the overall objective of providing useful data.

Surveys should usually not be administered during periods of peak workload, or during vacations or holidays. In general, more surveys are run during the spring semester than during the fall, so it might be advantageous to consider fielding a survey in the fall.

Check the survey calendar to minimize respondent survey fatigue and potential conflict with other surveys.

While this may seem too obvious a step to mention, it is a critical part of the research process. Take the time to clearly define the constructs/factors to be explored. Develop a few research questions you want to address for each construct/factor. Check in with colleagues and relevant decision-makers to elicit their thoughts on important questions to address. Review the literature on your topic to see how questions have been framed by other researchers, and what related issues or themes should be taken into consideration when exploring this topic. The resources available through the University Libraries are a good place to begin a search for relevant literature. Professional associations affiliated with your area of practice also often maintain links to current literature in the field.

The content of the survey stems directly from your research questions. Still, great care is needed to develop survey questions that will effectively and efficiently elicit the information you want. Ideas for specific survey questions can come from existing instruments, colleagues, members of the target population (collected via focus groups or interviews), and your own observations. It is important to balance adequate coverage of your research questions (comprehensiveness) with conciseness. Avoid the temptation to include questions that may provide interesting but not particularly useful results. Also, consider whether some of the data you want are already available through other sources such as institutional databases.

Surveys should begin with a statement that clearly explains:

The purpose of the survey

That participation in the survey is voluntary

That the respondent can skip questions he or she would prefer not to answer

Whether responses provided will be treated as anonymous or confidential data

How information from the survey will be reported and used

When composing survey questions, here are some general guidelines to bear in mind:

Survey questions should not be leading or contain jargon or technical terms that may not be understood by all respondents

Response categories should reflect a comprehensive array of choices, including "not applicable," "don't know," and/or "other" where appropriate

Limit the use of open-ended questions; as much as possible, position these at the end of the survey instrument

Short surveys generate more responses and minimize the imposition on the valuable resource of the respondent's time

Surveys that skip respondents over questions that are not relevant feel shorter and more pertinent to the respondent
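The skip patterns mentioned above amount to simple branching logic: a screening answer routes the respondent past questions that do not apply. A minimal sketch in Python (the question IDs and routing rules here are hypothetical, for illustration only; real survey platforms such as Qualtrics provide this via their own survey-flow settings):

```python
def next_question(current_id, answer):
    """Route a respondent past questions that do not apply.

    SKIP_RULES is a hypothetical (question, answer) -> destination map;
    any pair not listed simply proceeds to the next question in order.
    """
    SKIP_RULES = {
        ("lives_on_campus", "no"): "q_commute",      # skip the dorm-life block
        ("used_library", "never"): "q_demographics",  # skip library follow-ups
    }
    return SKIP_RULES.get((current_id, answer), "next_in_order")

print(next_question("lives_on_campus", "no"))   # 'q_commute'
print(next_question("lives_on_campus", "yes"))  # 'next_in_order'
```

Respondents who answer "no" jump straight to the commuting block, so the survey feels shorter and more pertinent to them.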

Survey methodologists specialize in the construction of survey questions and their response categories. Consider having someone with survey design expertise review your survey instrument, such as the Statistical Consulting Center, IRP, SCC, or CSSL. As mentioned earlier, it is very important to pretest your survey instrument with a subset of your target population. This provides a critical test of the clarity, comprehensiveness, and length of your survey.

Are you planning to conduct a web-based survey? If so, you will need to select an appropriate survey vendor. In addition to the programming and hosting services provided by a potential vendor, you need to know how they safeguard the identity of your participants and the security of the data collected. Ohio State has purchased Qualtrics for institutional use. For more information, review Qualtrics' Privacy Statement, Qualtrics' Security Statement, and Ohio State's Research Policies.

If your survey can be considered research, it needs to be reviewed by the Institutional Review Board. This process may take two weeks or longer.

According to The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research, research generally includes any activity "designed to test an hypothesis, permit conclusions to be drawn, and thereby to develop or contribute to generalizable knowledge (expressed, for example, in theories, principles, and statements of relationships)."

In general, surveys related to academic course requirements (e.g., theses, dissertations) and surveys that produce data that will be presented at or published in off-campus venues will require IRB review and approval. New analyses of existing data may also qualify as research in IRB terms. Contact the IRB for further information.

In general, it is not necessary to survey an entire population in order to obtain valid, generalizable results. A random sample will do the job while minimizing costs, including the cost of survey fatigue. If you need help determining an appropriate sample size for your project, you may contact IRP for assistance. Texts on survey research design provide guidelines for setting sample sizes; sample size calculators are also available on the internet.
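As a rough illustration of why a random sample suffices, a common textbook approach (one of several; verify against a survey research text or a sample size calculator before relying on it) is Cochran's formula with a finite population correction. The population size and defaults below are assumptions chosen for the example:

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Estimate the sample size needed to estimate a proportion.

    Cochran's formula with a finite population correction.
    z=1.96 corresponds to 95% confidence; p=0.5 is the most
    conservative assumption about response variability.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# A hypothetical population of 50,000 students needs far fewer
# than 50,000 completed responses for a 5-point margin of error:
print(sample_size(50_000))  # 382
```

Note that the required sample grows very slowly with population size, which is why surveying everyone mostly adds cost and fatigue rather than precision. Remember to invite more people than this, since only a fraction will respond.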

In your survey planning process, allocate sufficient time to compile the sample you will invite to participate in your research. The Ohio State offices providing the samples might need two to three weeks' notice for a data request.

Many campus surveys offer respondents the chance to be entered into a drawing for prizes. Gift cards and personal electronic devices are common prizes. The literature on survey methodology suggests that these kinds of incentives have a modest impact, increasing the response rate slightly.

Surveys that offer incentives to respondents must track respondent identities in some way. If identifying information (such as names, e-mail addresses, or student identification numbers) is kept with the survey responses and confidentiality is promised to respondents, the study needs a security protocol for keeping the data safe (see next item).

Survey respondents need to be told if their responses will be anonymous, kept confidential, or are entirely non-confidential.

Anonymous data do not include names, addresses, student identification numbers or any other personal information that would make it possible to associate a response with any given individual.

Data that are confidential contain information that may identify an individual respondent. There are significant advantages to collecting identifiers, including the ability to do pre- and post-studies and to eliminate unnecessary demographic questions through linked data files. However, files containing individual identifiers must be stored with great attention to data security and access. If you plan to collect confidential data, contact the Office of Responsible Research Practices for more information on developing a security plan.
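One common pattern behind such linked data files is to replace direct identifiers with a random linkage key, so the key-to-identity map can be stored separately (and more securely) from the response file. A minimal sketch, with hypothetical field names; an actual security plan should be developed with the Office of Responsible Research Practices:

```python
import secrets

def split_identifiers(records):
    """Separate direct identifiers from survey responses.

    Each respondent gets a random linkage key. The key file maps
    keys to identities and is stored apart from the response file,
    which keeps names and e-mail addresses out of the analysis data
    while still allowing pre/post linking.
    """
    key_file, response_file = {}, {}
    for record in records:
        key = secrets.token_hex(8)
        key_file[key] = {"name": record["name"], "email": record["email"]}
        response_file[key] = {k: v for k, v in record.items()
                              if k not in ("name", "email")}
    return key_file, response_file

raw = [{"name": "Pat", "email": "pat@osu.edu", "q1": "agree", "q2": 4}]
keys, responses = split_identifiers(raw)
# `responses` now holds only q1/q2; identities live only in `keys`
```

Access to the key file can then be restricted to the few staff who perform the linkage.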

Before you begin your survey, develop a plan for analyzing the data and reporting the results. How will you use the data you collect? With whom will your results be shared? In what format will results be shared: visual presentations, written or electronic reports?

At a minimum, most reports of survey results provide a full set of frequencies for each question. Cross-tabulations of responses across subsets of respondents are also useful, although care must be taken to protect the privacy of individuals' responses. A general rule of thumb is not to report results for categories containing five or fewer respondents. Remember that survey results cannot be presented or published beyond the Ohio State community without IRB approval.
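The small-cell rule of thumb above is straightforward to apply when tabulating. A minimal sketch (the response data and the "<= 5" display convention are illustrative assumptions, not a prescribed format):

```python
from collections import Counter

SUPPRESSION_THRESHOLD = 5  # do not report cells with 5 or fewer respondents

def tabulate(responses):
    """Frequency table with small cells suppressed.

    Categories containing SUPPRESSION_THRESHOLD or fewer respondents
    are reported as '<= 5' to protect the privacy of individuals.
    """
    counts = Counter(responses)
    return {cat: (n if n > SUPPRESSION_THRESHOLD else "<= 5")
            for cat, n in counts.items()}

answers = ["agree"] * 40 + ["neutral"] * 12 + ["disagree"] * 3
print(tabulate(answers))
# {'agree': 40, 'neutral': 12, 'disagree': '<= 5'}
```

The same check matters even more for cross-tabulations, where subgroup cells shrink quickly.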

Consider how you might share your results with others on campus who are interested in related questions. By sharing your findings widely, you can not only enlighten the campus community about your work, but you may also be able to head off a new data collection effort.

Survey Coordinating Committee

The Survey Coordinating Committee coordinates surveys within the campus community in order to reduce survey burden and improve the quality of the information collected.

The committee is chaired by the Director of Institutional Research and Planning, and includes representatives or appointees from the Office of the University Registrar, Office of Human Resources, Office of Student Life, Statistical Consulting Center, Faculty Council, University Staff Advisory Committee, and Institutional Review Board.

The committee's charge includes developing and maintaining a calendar of surveys regularly administered to faculty, staff, and/or students. The Committee will evaluate and, in consultation with the units administering these surveys, determine appropriate timing and cycle intervals. This does not include Student Evaluations of Teaching.

The committee also reviews and approves all ad hoc surveys, except the following:

Surveys administered in support of academic research conducted by faculty.

Surveys administered to any group within the administering unit. For example, the Department of English could administer a survey to faculty within the Department of English without approval, but would need approval from the Committee to administer a survey to all faculty in the Division of Arts and Humanities.

The College of Education and Human Ecology's Research Methodology Center (RMC) is a fully resourced academic research center charged with advancing the design and conduct of high-quality research in education and human ecology.

The Office of Responsible Research Practices provides administrative services to university researchers that facilitate research, improve review efficiency, and ensure regulatory compliance with research requirements. ORRP staff also support the university committees responsible for review and oversight of research involving animals, human subjects, and biohazards.

The Statistical Consulting Service (SCS) is a team of faculty, staff, and graduate students in the Department of Statistics whose mission is to provide professional statistical consulting support (for a fee) to Ohio State researchers and external clients in business, science, industry, and government.

The Center for Human Resource Research is a multidisciplinary research organization affiliated with the College of Arts and Sciences. They develop survey software, design survey instruments, oversee field work, and generate and disseminate fully documented data sets.

The Ohio Colleges of Medicine Government Resource Center (GRC) staff and expert faculty have a wide range of experience in the management and analysis of large health care databases. GRC staff are expert users of statistical analysis and data mining tools in SAS, SPSS, Stata, and R.