Why Panelists Drop Out: The Answer Can Be Found in the Questions
By James Sallows, VP Client Operations, EMEA
Since the dawn of online market research, data providers and data consumers alike have been concerned – and rightly so – about the quality of online panel data. Lightspeed Research has been at the forefront of this issue, developing and instituting a comprehensive range of data quality processes known as Lightspeed RealRespondents and Lightspeed RealResults to ensure that its 3.5 million panelists provide authentic, high quality survey responses.
While we recognize there is still room for improvement, we have made great strides in quality from a panelist and technology perspective. And, looking toward the future, we are working as an industry to find new respondent sources (river, communities, blends) and more efficient recruiting processes. But we must acknowledge that the respondent pool is a finite resource, and the quality of data it provides over time is completely dependent upon the way we conserve that resource – how we keep panelists engaged in our surveys and eager to take more.
Lightspeed Research has found that one of the best and simplest indicators of panelist engagement is the survey incompletion rate. Incompletes damage not only the data from a particular survey but also the panel overall, because respondents who abandon one survey are less likely to take future surveys.
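As a rough illustration of the metric, the incompletion rate can be computed as the share of qualified starters who abandon a survey. The sketch below is a minimal, hypothetical example; the field names and data shape are our own assumptions, not Lightspeed Research's actual data model.

```python
# Hypothetical sketch: computing a survey's incompletion rate from
# respondent status records. Status labels are illustrative assumptions.

def incompletion_rate(statuses):
    """Share of qualified starters who abandoned the survey.

    `statuses` holds one string per respondent who qualified and began
    the survey, e.g. "complete" or "drop_out".
    """
    if not statuses:
        return 0.0
    drop_outs = sum(1 for s in statuses if s == "drop_out")
    return drop_outs / len(statuses)

# Invented example: 3 of 10 qualified starters abandoned the survey.
statuses = ["complete"] * 7 + ["drop_out"] * 3
print(f"{incompletion_rate(statuses):.0%}")  # 30%
```

Tracked wave over wave, a rising incompletion rate is an early warning that a questionnaire is wearing respondents down.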
Responsibility for incompletes rests squarely with the survey itself. Survey length, subject matter, questionnaire design, media download speed, interactivity and use of multimedia are the driving factors. In a recent analysis of more than three million web surveys, Lightspeed Research (part of Kantar) found the following causes of respondent drop-outs:
- 35 percent were not interested in the survey’s subject matter
- 20 percent said the survey was too long
- 20 percent grew impatient waiting for media to download (this was more of a factor for older respondents; the 18-34 age group was more willing to wait)
- 15 percent expressed a dislike for and reluctance to complete grids
- 5 percent expressed a dislike for and reluctance to complete open-ended questions
Long Surveys, Grids and Other Hindrances
Lightspeed Research has published extensive research demonstrating the impact of questionnaire length on respondent engagement and key concept measures (see the Lightspeed Research white papers). In a recent study, Lightspeed Research found that among respondents who qualified for a survey, drop-outs increased significantly when surveys exceeded 25 minutes in length. For the 18-34 demographic, drop-outs began to increase at 15 minutes, even though this group initially demonstrated more patience with media downloads. Incompletes occurred even earlier when the survey’s subject matter was perceived as uninteresting.
Grids were a particularly irksome cause of incompletes. Respondents asked to complete large grids dropped out noticeably at the 10-minute mark, and respondent interest never recovered. Unsurprisingly, too many open-ended questions, too many answer choices, questions that seemed repetitive and questions that were difficult to comprehend all contributed to respondent dissatisfaction with the survey experience and resulted in incompletes.
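The kind of timing analysis described above can be approximated by bucketing each drop-out event by the minutes a respondent spent in the survey before abandoning it. The sketch below is a simplified, hypothetical version of such an analysis; the data values and function name are invented for illustration.

```python
# Hypothetical sketch: locating where in a survey respondents abandon,
# by bucketing drop-out events by minutes elapsed before exit.
from collections import Counter

def drop_outs_by_minute_bucket(drop_out_minutes, bucket_size=5):
    """Count drop-outs per elapsed-time bucket (0-4 min, 5-9 min, ...)."""
    buckets = Counter()
    for minutes in drop_out_minutes:
        start = (minutes // bucket_size) * bucket_size
        buckets[(start, start + bucket_size - 1)] += 1
    return dict(sorted(buckets.items()))

# Invented example: a cluster of abandonments around the 10-minute mark
# (as seen with large grids) shows up as a spike in the 10-14 bucket.
exits = [3, 8, 10, 11, 11, 12, 13, 26, 27]
for (lo, hi), n in drop_outs_by_minute_bucket(exits).items():
    print(f"{lo:2d}-{hi:2d} min: {n}")
```

A spike in a particular bucket points at the page or question block respondents were facing at that point in the interview.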
Improving the Survey Experience
Market research as an industry is slow to change, and nowhere is this more evident than in the questionnaires we use for online panel research. While the technology used to deliver and access surveys has developed and changed at lightning pace, the questionnaires remain stubbornly similar to telephone surveys of decades gone by. If we are to increase engagement, we must address the surveys themselves and better understand how to keep respondents interested.
Over the past year Lightspeed Research has been conducting face-to-face usability tests every quarter to assess new survey features, designs and layouts. We have supplemented this research with eye-tracking software to study how panelists view surveys and what draws their attention. This has led us to rethink how we and our clients can learn from past mistakes to design better surveys.
The following were our key learnings:
We’ve heard from respondents that they don’t like “unanswerable” questions, and that they avoid those they perceive as dull, annoying and repetitive. Questions should not mix time frames and units, and should avoid the use of industry jargon.
Adding interactive elements can reap big results. In one recent survey, we selectively included Flash/Java elements at “key pain” points – questions that had suffered drop-off in past waves. We found that this small change generated a considerable improvement in completion rates. However, it’s important to keep the respondent’s role at the center of the design. Do not use technology just for the sake of it; style over substance tends to turn respondents off.
Based on principles set forth in Steve Krug’s book on web usability, Don’t Make Me Think, we can simplify survey web pages, making them more self-evident. The goal is to eliminate all thought about how the navigation works and how the page is arranged, so that each click is a simple, obvious choice. We should also work to slash the number of words on each page. Respondents don’t read pages, they scan, so we must strive to make each question comprehensible in a moment’s scan. This example, from a recent survey run on our panel, illustrates the kind of question that promotes drop-outs.
Conversely, this question is an example of what can be accomplished through focused attention on respondent engagement.
In this study, Lightspeed Research worked with our client to extensively rework the questionnaire. We removed 140 mouse movements and 70 clicks, disclosed the survey’s length early in the questionnaire, clearly stated what we needed and reduced the overall length by nearly 25 percent. As a result, the incompletion rate dropped from 42 percent to 12 percent and data quality improved significantly across the client’s more than 200 projects.
At Lightspeed Research, we believe the future of our industry – both for providers and clients – is best served by working closely together to provide the best possible experience for respondents. Creating and delivering surveys that keep respondents engaged undoubtedly will improve the vitality and sustainability of online panels, which in turn will enable better insights leading to better business decisions. As an industry, we would be wise to heed the old show business advice, “Always leave them wanting more.”
James Sallows is the VP, Client Operations, EMEA. Based in London, he is responsible for European Client Operations including project management, data processing and survey programming. He is a keen scuba diver and football (soccer) fan.