Why long surveys produce poor quality data

In the past several years, even marketers who had long resisted bringing their key surveys online finally began doing so in large numbers. The problem is that questionnaire designs that served well as telephone surveys (or mail or mall-intercept surveys) do not translate perfectly to the conditions of online surveying. An obvious problem with many “legacy” survey designs is that they tend to be overlong, repetitive, and very often written to be administered by an interviewer in real time, and are therefore ill-suited to the self-administered survey environment of the internet. When surveys like this are placed online, they perform poorly, as seen in low completion rates, evidence of undesirable behaviors such as straight-lining, and other signs of data quality problems. With more than 15 years of online surveying experience behind us, one would think this kind of error would be a rare occurrence, but it isn’t. Rethinking survey design is essential when the mode of administration shifts, and it is perhaps the most effective way of improving data quality.

According to a recent literature review sponsored by AAPOR (summarized in Public Opinion Quarterly, Winter 2010, 74:4),  “The influence of questionnaire design on measurement error has received attention in a number of publications (e.g., Dillman, Smyth, and Christian 2009; Galesic and Bosnjak 2009; Lugtigheid and Rathod 2005; Krosnick 1999; Groves 1989; Tourangeau 1984), and the design of Web questionnaires has introduced a new set of challenges and potential problems. Couper (2008) demonstrates a wide range of response effects due to questionnaire and presentation effects in Web surveys.”

It’s true that marketers tend to have a significant investment in tracking survey designs that may have driven decisions in a firm for many years. There is likely to be resistance to changing, or perhaps dropping, questions that are proven predictors. Mitigation techniques worth considering include careful pretesting to compare response patterns before a full project launch, using matched samples and split administration to reduce respondent burden, and avoiding long batteries that require respondents to make minute distinctions between items.
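As a rough illustration of split administration, the idea can be sketched in a few lines of Python: divide a long item battery into blocks and randomly assign each respondent one block, so that no individual sees the full battery but the matched half-samples together cover it. The function name, data shapes, and block-splitting rule below are hypothetical, not any particular survey platform’s API.

```python
import random

def assign_split_blocks(respondent_ids, question_battery, n_blocks=2, seed=42):
    """Randomly assign each respondent one block of a long battery.

    Hypothetical sketch: each respondent answers only ~1/n_blocks of the
    items, reducing burden while matched samples cover the whole battery.
    """
    rng = random.Random(seed)  # seeded for a reproducible assignment plan
    # Split the battery into n_blocks roughly equal, interleaved blocks.
    blocks = [question_battery[i::n_blocks] for i in range(n_blocks)]
    # Each respondent is randomly assigned exactly one block.
    return {rid: blocks[rng.randrange(n_blocks)] for rid in respondent_ids}

# Example: a 20-item attribute battery split across two matched half-samples.
items = [f"attribute_{i}" for i in range(1, 21)]
plan = assign_split_blocks(["r001", "r002", "r003"], items, n_blocks=2)
```

In practice the assignment would be stratified or quota-balanced so the half-samples stay demographically matched; simple random assignment is shown here only to convey the mechanics.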

The more we learn about how survey length correlates inversely with completion rates and data quality, the more pressure there will be on researchers to keep the research goals for any particular survey concise and precise. We have long experience in quality questionnaire design for the online environment and can help with survey migration issues such as those described here.

— Dr. Cheryl Harris
