Response rate and measurement differences in mixed-mode surveys using mail, telephone, interactive voice response (IVR) and the Internet

https://doi.org/10.1016/j.ssresearch.2008.03.007

Abstract

The potential for improving response rates by changing from one mode of data collection to another mode and the consequences for measurement and nonresponse errors are examined. Data collection from 8999 households was done in two phases. Phase 1 data collection was conducted by telephone interview, mail, interactive voice response, or the Internet, while Phase 2 focused on nonrespondents to Phase 1, and was conducted by a different mode, either telephone or mail. Results from our study suggest that switching to a second mode is an effective means of improving response. We also find that for the satisfaction–dissatisfaction questions asked in this survey, respondents to the aural modes (telephone and IVR) are significantly more likely than are respondents to the visual modes (mail and web) to give extreme positive responses, a difference that cannot be accounted for by a tendency towards recency effects with telephone. In general, switching to a second mode of data collection was not an effective means of reducing nonresponse error based on demographics.

Introduction

One of the major survey trends of the early 21st century is the design and implementation of mixed-mode surveys, in which some people prefer to respond by one survey mode while others prefer a different one. Several factors have encouraged this trend. First, newer survey modes such as the Internet and interactive voice response (IVR) give researchers more choices beyond the traditional telephone, mail, and face-to-face surveys. Second, increases in cell phone use, the corresponding decrease in coverage for random-digit-dial (RDD) surveys, and declining telephone response rates force researchers to consider alternative survey modes for reducing nonresponse error. Finally, previous research has shown that higher response rates can be obtained through the use of mixed modes. For example, de Leeuw (2005) reported that use of a second or even a third mode may improve response rates and may also improve coverage.

However, mixed-mode surveys have potential drawbacks. For example, research has shown that different survey modes often produce different answers to the same questions, such as more positive responses to scale questions on telephone than on web surveys (Dillman and Christian, 2005, Christian et al., 2008). If switching survey modes produces different measurement, then response rate gains may be offset by undesirable changes in measurement.

Our purpose in this paper is to simultaneously evaluate the use of a second survey mode (telephone or mail) to improve the response rates achieved by an initial survey mode (web, IVR, mail, or telephone) and the potential measurement differences between the first and second phases as well as across modes. This allows us to determine the extent to which mixed-mode designs may improve response rates and whether measurement differences result. We also compare demographic differences among respondents to each mode, and between respondents and nonrespondents, to determine whether respondents to a second mode of data collection differ significantly from respondents to the first mode and from the population from which the samples were drawn. The issues addressed here are crucial to the design of quality sample surveys in the 21st century.

Section snippets

Use of a second survey mode to improve response rates

It has long been recognized that some respondents prefer being surveyed by one survey mode, whereas others prefer a different mode. For example, Groves and Kahn (1979) reported that among the respondents to a national telephone interview, 39.4% indicated they would have preferred being surveyed by telephone, 22.7% by face-to-face interview, and 28.1% by mail.

Other studies suggest that giving respondents a choice of which mode to respond to does not necessarily improve response rates. For

Study procedures

Response rate effects are examined for four different initial implementation strategies: a telephone interview, a mail questionnaire, an attempt by telephone to recruit respondents to answer a self-administered IVR survey, and an attempt by telephone to recruit respondents to complete a web survey. After a pause of one month in the data collection effort, nonrespondents to the telephone survey were asked to complete a mail questionnaire, while nonrespondents to the other modes (mail, web and

Response rates

Response rates for each phase of the data collection are reported by treatment in Table 1. Phase 1 response rates varied greatly, from a low of 13% for the web treatment to 28% for IVR, 44% for telephone, and 75% for mail. Such wide variation was not unexpected. Many of the potential respondents contacted by phone in the web survey effort did not have computers and/or Internet access, and some who did have access were unwilling to participate. Completion of the IVR questionnaire
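Because Phase 2 targets only Phase 1 nonrespondents, the overall gain from the mode switch can be read as a simple pooled calculation: completed cases from both phases divided by the original sample. The following is a minimal sketch of that arithmetic, not the authors' code, and the counts are hypothetical illustrations rather than the actual Table 1 figures:

```python
# Minimal sketch: pooling response rates across two phases of data collection.
# All counts are hypothetical illustrations, not the figures from Table 1.

def response_rate(completes: int, sample: int) -> float:
    """Unweighted response rate: completed cases divided by the eligible sample."""
    return completes / sample

sample = 2000                        # hypothetical cases assigned to one treatment
phase1 = int(sample * 0.13)          # e.g., a 13% Phase 1 web response
nonrespondents = sample - phase1
phase2 = int(nonrespondents * 0.40)  # hypothetical 40% of nonrespondents complete the Phase 2 follow-up

print(f"Phase 1 rate:  {response_rate(phase1, sample):.1%}")
print(f"Combined rate: {response_rate(phase1 + phase2, sample):.1%}")
```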

Discussion and conclusions

The use of two or more survey modes in a single data collection effort raises the possibility of improved response rates. However, those improvements may come at the cost of obtaining different answers to the same survey questions across modes. In this study of a national quasi-general public survey of individuals on satisfaction with their long distance service, for whom both telephone and

References (36)

  • Christian, L.M., 2007. How Mixed-Mode Surveys are Transforming Social Research: The Influence of Survey Mode on...
  • Christian, L.M., et al. The effects of mode and format on answers to scalar questions in telephone and web surveys.
  • Christian, L.M., et al., 2004. The influence of symbolic and graphical language manipulations on answers to paper self-administered questionnaires. Public Opinion Quarterly.
  • Cook, C., et al., 2000. A meta-analysis of response rates in web- or Internet-based surveys. Educational and Psychological Measurement.
  • Couper, M.P., 2000. Web surveys: a review of issues and approaches. Public Opinion Quarterly.
  • de Leeuw, E.D., 1992. Data Quality in Mail, Telephone, and Face-to-Face Surveys.
  • de Leeuw, E.D., 2005. To mix or not to mix data collection modes in surveys. Journal of Official Statistics.
  • Dillman, D.A., in press. Some consequences of survey mode changes in longitudinal surveys. In: Lynn, P. (Ed.),...
  • Dillman, D.A., 2007. Mail and Internet Surveys: The Tailored Design Method.
  • Dillman, D.A., Christian, L.M., 2005. Survey mode as a source of instability in responses across surveys. Field Methods.
  • Dillman, D.A., et al. Understanding differences in people’s answers to telephone and mail surveys.
  • Dillman, D.A., et al., 1995. Effects of category order on answers to mail and telephone surveys. Rural Sociology.
  • Dillman, D.A., et al., 1995. Influence of an invitation to answer by telephone on response to census questionnaires. Public Opinion Quarterly.
  • Dillman, D.A., Mason, R.G., 1984. The influence of survey method on question response. Paper presented at the Annual...
  • Griffin, D.H., Obenski, S.M., 2002. Meeting 21st Century Demographic Needs Implementing the American Community Survey:...
  • Groves, R.M., Kahn, R.L., 1979. Surveys by Telephone: A National Comparison with Personal Interviews.
  • Horrigan, J.B., Smith, A., 2007. Home Broadband Adoption, 2007. Pew Internet & American Life Project. Accessed July 10,...
  • Jenkins, C.R., et al. Towards a theory of self-administered questionnaire design.

Financial support for this study was provided by The Gallup Organization. Additional support was provided by the Department of Community and Rural Sociology and the Social and Economic Sciences Research Center at Washington State University. The authors wish to acknowledge with thanks the assistance of many Gallup employees who contributed to the data collection and analysis of these data.
