Use of Progress Indicators in Online Surveys

By Susan Frede, Lightspeed Research and Zoë Dowling, Added Value

There’s been a lot of discussion and debate about progress indicators in online surveys.  Some consider them a form of encouragement toward the goal of completion, others contend they are largely counterproductive, and still others advocate their use in certain circumstances.

Recently, Lightspeed Research, working with the Kantar InTouch Team, fielded research-on-research to investigate the effect of progress indicators in surveys fielded on the MySurvey online panel.  We sought to answer the following questions:

  • Do progress indicators lead to greater survey satisfaction?
  • Do dropout rates decrease when using progress indicators?
  • Are respondents who see a progress indicator more attentive and engaged?
  • What are respondents’ feelings toward progress indicators?
  • Does having an indication of progress impact data (i.e., change business decisions)?
  • Is providing progress intermittently better than providing it on every page?

Currently, the majority of online studies that provide progress feedback employ progress bars.  A progress bar is a graphical block that gradually fills with color or shading as the survey respondent advances through each screen of the survey.  Progress bars are permanently displayed and usually state the percentage of the survey that has been completed.  Previous studies report that respondents become frustrated with progress bars for three reasons:

  • “Watched pot” effect – Respondents exposed to a traditional, percentage-based progress bar displayed on every screen become discouraged when the changes they see from screen to screen are too small.
  • Accuracy – The nature of skips and piping is such that a smoothly moving progress bar is not always possible, and this is at odds with the expectations of respondents.
  • Bad news – Progress bars may deliver bad news to respondents when they learn they are not as far along as they had hoped.
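The accuracy and bad-news problems both stem from how a percentage-based bar is typically computed: the current page count is divided by a total that skip and piping logic can change mid-survey.  A minimal Python sketch (a hypothetical calculation, not any particular vendor's implementation) illustrates the effect:

```python
def percent_complete(pages_answered: int, total_pages: int) -> int:
    """Naive percentage-based progress: pages answered over total pages."""
    return round(100 * pages_answered / total_pages)

# Skip logic changes the effective total mid-survey, so the bar can
# deliver "bad news": after 10 pages of an expected 20, routing adds
# an extra module and the displayed progress drops from 50% to 25%.
print(percent_complete(10, 20))  # 50
print(percent_complete(10, 40))  # 25
```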

Intermittent progress indicators can be used as an alternative to percentage-based progress bars and are the focus of this research-on-research.  These provide a less granular measure of progress, based on the sections of the survey the respondent has completed. Anecdotal evidence suggests intermittent progress indicators may be preferred by respondents.  In addition, they have been found to be a more accurate, practical and cost-effective method on surveys whose programming prevents an accurate percentage-based progress bar.
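A section-based indicator sidesteps the accuracy problem by reporting progress only at the granularity of completed sections.  A brief sketch of the idea (the section names here are invented for illustration):

```python
SECTIONS = ["About you", "Media habits", "Brand ratings", "Wrap-up"]

def section_message(section_index: int) -> str:
    """Coarse progress shown once at the start of each section,
    rather than recalculated on every screen."""
    return f"Section {section_index + 1} of {len(SECTIONS)}: {SECTIONS[section_index]}"

print(section_message(1))  # Section 2 of 4: Media habits
```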

Research Design

This research uses a test-control design to answer the research questions.  Both the type of progress indicator and the length of the survey vary, resulting in a six-cell study:

  • No progress indicator, 10-minute survey (control)
  • No progress indicator, 25-minute survey (control)
  • Permanent progress indicator, 10-minute survey
  • Permanent progress indicator, 25-minute survey
  • Intermittent progress indicator, 10-minute survey
  • Intermittent progress indicator, 25-minute survey
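The six cells are simply the full crossing of the two factors (indicator type by survey length); a quick sketch of that 3 × 2 design:

```python
from itertools import product

indicators = ["no", "permanent", "intermittent"]
lengths = ["10-minute", "25-minute"]

# Each cell pairs one indicator condition with one survey length.
cells = [f"{ind} progress indicator, {length} survey"
         for ind, length in product(indicators, lengths)]
print(len(cells))  # 6
```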

Both the permanent and intermittent indicators provide progress through defined survey sections, NOT the exact percentage-based progress.  The permanent indicator is displayed on every screen while the intermittent indicator is displayed at the beginning of each section.  When dividing the survey into sections, logical breaks in the survey were considered while trying to maintain similar lengths for each section.

Weighting has been applied to each of the six cells so that the age and gender distributions are consistent across cells.
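The article does not detail the weighting scheme; a simple post-stratification sketch (with invented target shares) shows the general idea of forcing each cell to a common age-by-gender distribution:

```python
from collections import Counter

def cell_weights(respondents, targets):
    """Weight each demographic group so its share in the cell
    matches the target share (weight = target / observed)."""
    counts = Counter(respondents)
    n = len(respondents)
    return {group: targets[group] / (counts[group] / n) for group in counts}

cell = ["F18-34"] * 30 + ["F35+"] * 20 + ["M18-34"] * 25 + ["M35+"] * 25
targets = {"F18-34": 0.25, "F35+": 0.25, "M18-34": 0.25, "M35+": 0.25}
weights = cell_weights(cell, targets)
# The over-represented F18-34 group (30% observed vs. a 25% target) is
# down-weighted to 0.25 / 0.30 ≈ 0.83; the F35+ group is up-weighted to 1.25.
```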

Survey Satisfaction

Progress indicators increase survey satisfaction.  This includes a higher level of survey enjoyment as well as higher rates of reporting the survey as interesting, easy to answer, relevant, and less repetitive.  The intermittent progress indicator performs better than the permanent indicator on enjoyment.

Dropout Rates

With the shorter survey, use of an intermittent progress indicator leads to lower dropout rates.  Both progress indicators increase dropout rates with the longer survey, so this approach does not appear to overcome the bad news issues impacting continuous progress bars (i.e., respondents learn they aren’t as far along as they had thought).

Respondent Engagement

Progress indicators increase respondent engagement.  Those seeing an indication of progress spend more time completing the survey than those with no progress indicator.  This is especially true of the longer survey.

Straightlining behavior on grid questions was also examined.  Very little straightlining occurs, and it does not vary based on progress indication.
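Straightlining on a grid is straightforward to flag: a respondent who selects the identical scale point for every row.  A minimal check (the helper name is hypothetical):

```python
def is_straightlined(grid_responses) -> bool:
    """True when every row of a grid question received the same rating."""
    return len(grid_responses) > 1 and len(set(grid_responses)) == 1

print(is_straightlined([3, 3, 3, 3, 3]))  # True
print(is_straightlined([3, 4, 3, 2, 5]))  # False
```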

Feelings Toward Progress Indicators

Based on prior analysis, very few respondents spontaneously mention wanting progress bars.  However, in this research-on-research, when the respondents in the no-indicator cells are asked directly whether they prefer to have some indication of progress, 75 percent say yes.

Respondents who saw a progress indicator were asked if they noticed anything different or new in the survey layout.  Those who answered “yes” were then asked what it was.  Those seeing the intermittent progress indicator are more likely to say they noticed something different, and of those, more indicate that the progress indicator is what they noticed.

Respondents who saw a progress indicator were asked how helpful it was to be kept informed of their progress.  The intermittent progress indicator is seen as more helpful, while the permanent one is less likely to have been noticed.

Impact on Data

The presence or absence of a progress indicator does not impact data nor change business decisions.  There are some statistical differences across the cells, but there are no consistent patterns to those differences.


Conclusions

Results are mixed on the use of progress indicators.  Progress indicators do increase survey satisfaction and respondent engagement.  When asked directly if they want some indication of progress, most respondents say they do.  In addition, including a progress indicator neither impacts the data nor changes business decisions.

Dropout rates are higher when progress indicators are used in longer surveys, which likely results from respondents learning they aren't as far along as they had thought and deciding to drop out.

The intermittent progress indicator has advantages over the permanent indicator.  Survey satisfaction is higher with the intermittent progress indicator.  With short surveys, the intermittent progress indicator leads to lower dropout rates.  Respondents are more likely to have noticed something new in the survey layout with the intermittent progress indicator and find that it is helpful to be kept informed of their progress.


Recommendations

For shorter surveys (less than 20 minutes), we recommend the use of an intermittent progress indicator, as it has been found to lead to higher survey satisfaction and lower dropout rates.  This research-on-research has shown that respondents are likely to find the intermittent progress indicator helpful in keeping them informed of progress, thus improving their survey experience.

Furthermore, this research provides additional evidence that surveys need to be kept as short as possible.  The shorter survey is always more enjoyable than the longer survey.  In addition, adding a progress indicator to a long survey leads to higher dropout rates.

About Kantar InTouch & Authors

InTouch is Kantar’s respondent engagement program, aiming to continually reinvent the research experience in the face of the ever-changing wider digital experience of consumers.  By ensuring continued engagement, the objective is to secure high-quality data, lower dropout rates, and healthy panels.  InTouch focuses on three areas:  Web Surveys, Mobile, and Social Media.  The goal with Web Surveys is to provide tools, best practices and research-on-research on questionnaire language, questionnaire design, interactivity and look/feel.

Susan Frede is the VP of Research at Lightspeed Research. She has worked in the research field for 24 years, has published numerous research-on-research papers and is a well-respected speaker at key industry events. Some of the topics she has recently explored include questionnaire length, best practices for online research, suspicious and professional respondents and data stability. You can contact Susan at

Zoë Dowling is a Vice President at Added Value where she heads up R&D and Offer Innovation projects covering a variety of areas for both the North American business, and the AV Group. Zoë also leads the Web Surveys team for Kantar InTouch. Her doctoral research investigated web data collection for government surveys. You can contact Zoë at
