Response Rate

Kata Welsh
25 Mar. 2010
19:169:002
Reaction Paper 3
The design features of a survey can heavily affect its response rate. Many different features should be considered when creating a survey to ensure the greatest efficiency and the highest response rate, and the considerations differ by survey type. The introduction is one feature that can make or break respondent participation. A mail survey's introduction must describe the uses and intentions of the survey in detail, whereas a telephone introduction must be brief and to the point to avoid hang-ups. The timing of the call must also be considered when conducting telephone surveys; generally, between 6 and 9 p.m. on Sunday evenings is acceptable.
Many methods also exist to encourage responses. Prenotification cards and follow-up mailings sent before and during the surveying period can remind and prompt respondents to complete a survey they might otherwise never open or remember to fill out. Incentives also influence response: more easily delivered to mail respondents than to telephone respondents, they typically range from gift certificates and product samples to monetary rewards of under $10.
To keep respondents engaged and willing to finish the survey, techniques such as delaying sensitive questions, keeping to a specific and desirable length, and offering consistent encouragement help to ensure positive reactions. Clear interviewer directions, limited response sets, and attention to language and cultural diversity can avoid confusion and, ideally, yield the most accurate results.
When reporting the information found through a survey, there are six key elements to take into account. The first two are the total sample size, which counts all members asked to participate in the study, and the valid sample size, which counts only those actually available to respond. From these, surveyors can calculate four different rates. The completion rate, estimated beforehand, projects how many will actually complete the survey out of the total sample. The response rate and the refusal rate count who responded and who declined out of the valid sample, while the nonresponse and noncontact rates represent those from the total sample who were unable to participate or be reached.
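These rates can be illustrated with a minimal sketch in Python. The counts below are invented for illustration (they are not from the readings), and the calculation reads the "nonconduct" figure as a noncontact rate; the exact formulas are assumptions based on the distinction between total and valid samples described above.

```python
# Hypothetical counts, invented for illustration only.
total_sample = 1000     # everyone asked to participate in the study
noncontact = 150        # never reached at all
valid_sample = total_sample - noncontact  # those actually available to respond
refused = 200           # reached, but declined
responded = valid_sample - refused        # reached and responded
completed = 600         # responded and finished the whole survey

response_rate = responded / valid_sample     # out of the valid sample
refusal_rate = refused / valid_sample        # out of the valid sample
completion_rate = completed / total_sample   # out of the total sample
noncontact_rate = noncontact / total_sample  # out of the total sample

print(f"response rate:   {response_rate:.1%}")
print(f"refusal rate:    {refusal_rate:.1%}")
print(f"completion rate: {completion_rate:.1%}")
print(f"noncontact rate: {noncontact_rate:.1%}")
```

Note the denominators: the response and refusal rates are computed against the valid sample (people actually reached), while the completion and noncontact rates are computed against the total sample.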
As read in Gawiser and Witt, failing to determine these factors can result in error. Errors arise from missing certain demographics due to nonresponse, from data-processing and computing mistakes, and from bias in question ordering or interviewer interference. The only real option in response to these errors is to adjust for them; methods such as weighting can help make the sample more closely resemble the population.
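The weighting idea can be sketched briefly. The age groups and shares below are hypothetical (census-style figures invented for illustration): each respondent's answers are weighted by the ratio of the group's population share to its share of the actual sample, so under-represented groups count more.

```python
# Hypothetical shares, invented for illustration only.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}  # e.g., from census data
sample_share     = {"18-34": 0.15, "35-54": 0.40, "55+": 0.45}  # who actually responded

# Weight = population share / sample share for each group.
weights = {group: population_share[group] / sample_share[group]
           for group in population_share}

for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")
```

Here younger respondents, who responded at half their population share, would each count double, while over-represented groups are weighted down below 1.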
I can personally identify with the benefits of these methods for improving the likelihood of responses. Each tip and trick makes sense as a way to attract and invite positive responses and to ensure a smooth questioning process. When responding to phone or mail surveys, I would be more likely to participate if the steps above were taken. The only part I take issue with is the continuous reassurance. Most people would be annoyed by constant praise and compliments for participating in a survey, especially when it does not seem honest. That is the other downfall of this approach to increasing response rates: in phone surveys especially, the emotion from the interviewer does not always seem truthful. Most of the time I am not convinced these interviewers are genuinely concerned for my personal well-being, but only for the information they hope to receive from me; it sometimes seems as if I am simply a means to an end for them. It is good to know, however, that these issues are taken into consideration to help ease the process for both interviewer and respondent.
As a real-life example of response rates, The Los Angeles Times describes its polling methodology for its poll with the USC College of Letters, Arts, and Sciences. The Times explains that, as described earlier, the data is weighted to regional and demographic characteristics from the census in order to counter polling error. It also employs bilingual dialers to accommodate both Spanish and English speakers as an added convenience. Its policy on nonresponse in telephone surveys is to call each number up to five times, permitting callbacks to be rescheduled and initial hang-ups to be redialed. The LA Times's policy clearly demonstrates what the two sets of authors tried to explain in today's readings.