Here's what I learnt about creating an online survey and its UX

I analysed over 2,000 surveys and this is what I learnt about creating an online survey and its UX

To learn more about the UX and the science behind creating an online survey, I analysed over 2,000 surveys.

Firstly, I established some concepts I wanted to test such as:

- The number of survey questions

- Progress bars in a survey

- Matrix questions

- Requiring individuals to complete all of the questions

- Agree/disagree scales.

Then, I looked into the correlation between online survey results and the user experience (like completion rate).

Let’s dive right into the results.

Here is a summary of the key findings:

- Participants take more care when answering shorter surveys and rush through longer ones.

- Longer surveys increase the likelihood of participant dropouts

- Progress bars are useful to include when creating an online survey

- Matrix questions should have a maximum of 5 columns and rows

- Look out for straight-lining and other such tricks.

 

Before we start, it’s important to note that you can download the results from this blog post via – RESEARCHGEEK SURVEY ANALYSIS

 

On average, participants take more time per question when responding to a shorter online survey.

A good survey should balance your need for data with the capabilities and experience of your respondents, and it should collect complete, accurate and reliable data. The relationship between the number of questions in an online survey and the time spent answering each one is not linear: the more questions you ask, the less time your respondents spend, on average, answering each question. In other words, the more you ask of your participants, the faster they rush through each question, potentially reducing the quality and reliability of your data.
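If you track how long respondents spend on each survey, you can check this relationship in your own data. Here is a minimal sketch, assuming a hypothetical pandas DataFrame with made-up survey_length and total_seconds columns; none of the numbers come from my analysis.

```python
# Minimal sketch: average time per question by survey length.
# The DataFrame and its values are illustrative only.
import pandas as pd

responses = pd.DataFrame({
    "survey_length": [5, 5, 10, 10, 20, 20],          # number of questions
    "total_seconds": [210, 190, 310, 290, 420, 400],  # time to complete
})

# Time spent per question for each response.
responses["seconds_per_question"] = (
    responses["total_seconds"] / responses["survey_length"]
)

# Average per-question time grouped by survey length:
# shorter surveys tend to show more seconds per question.
per_question = responses.groupby("survey_length")["seconds_per_question"].mean()
print(per_question)
```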

 

Time spent on online surveys (@ResearchGeek)

 

On average, participants took over a minute to answer the first question in a survey, including any time spent reading through the instructions. They then spent, on average, five minutes in total answering a 10-question survey. However, as you can see from the table above, participants take more time per question when responding to shorter surveys than to longer ones.

Longer surveys increase the likelihood of participant dropouts.

We can’t assume that longer surveys contain less thorough answers, as that depends on the type of survey, the audience and the participants’ relationship to it, as well as other factors such as participants’ lifestyles and motivations.

However, data shows that individuals spend less time answering longer surveys.

In addition, in the analysis of the 2,000 surveys, abandon rates (participants who quit the survey before completing it) increased for surveys that took more than 8 minutes to complete, with completion rates dropping by anywhere from 5% to 25%. Patience for lengthier surveys is greater in education and workplace settings, but it decreases when the survey is customer-facing.
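Here is a rough sketch of how you could compare abandon rates for surveys above and below the 8-minute mark in your own data; the counts of started and completed responses below are placeholders, not figures from the 2,000-survey analysis.

```python
# Illustrative sketch: abandon rate for surveys under vs. over 8 minutes.
import pandas as pd

surveys = pd.DataFrame({
    "median_minutes": [4, 6, 9, 12, 15],           # typical completion time
    "started":        [500, 480, 450, 400, 380],   # respondents who began
    "completed":      [470, 440, 380, 320, 285],   # respondents who finished
})

# Abandon rate = share of respondents who started but did not finish.
surveys["abandon_rate"] = 1 - surveys["completed"] / surveys["started"]

long_mask = surveys["median_minutes"] > 8
print(surveys.loc[~long_mask, "abandon_rate"].mean())  # surveys under 8 minutes
print(surveys.loc[long_mask, "abandon_rate"].mean())   # surveys over 8 minutes
```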

 

What this means when you are creating an online survey

 

Take survey completion time into consideration as you create your online survey. Make sure you balance your participants' profiles and your online survey goals against the total number of questions.

 

Progress bars are useful to include when creating an online survey

 

I compared the completion rates of online surveys with and without progress bars to understand whether they affect results.

All surveys were ten pages long and asked the same 13 questions in the same order; the only difference between them was the placement and type of progress bar. Over 20 surveys were tested, collecting over 500 responses.

Once the surveys were closed, I compared the completion rates of the different progress-bar variants.

The most positive results came from moving the progress bar to the bottom of the page, and all in all, bottom progress bars with visual scales showed the most consistently positive results. Granted, these are small differences of only a few percentage points, but every little bit counts when you are trying to keep completion rates high and reduce bias.
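If you want to run a similar comparison on your own surveys, the sketch below shows one way to compare completion rates across progress-bar variants, with a chi-square test as a quick sanity check that the differences are larger than chance alone would explain. The variant names and counts are invented for illustration, not my actual results.

```python
# Illustrative sketch: completion rates per progress-bar variant.
from scipy.stats import chi2_contingency

variants = {
    # variant: (completed, abandoned) -- made-up counts
    "no_bar":            (112, 22),
    "top_bar":           (115, 20),
    "bottom_bar_visual": (124, 13),
}

for name, (done, dropped) in variants.items():
    print(name, round(done / (done + dropped), 3))  # completion rate

# Chi-square test of independence across the variants.
table = [list(counts) for counts in variants.values()]
chi2, p, dof, _ = chi2_contingency(table)
print("p-value:", round(p, 3))
```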

Progress bars are a valuable tool and are best placed at the bottom of the survey, with the visual scale displayed alone, without page numbers.

Progress bars can act as a coach, encouraging people to keep on track and reach that finish line.

 

The analysis found that matrix questions should include a maximum of five columns and rows.

When you want to ask many questions using the same response options, you most likely need to turn to the mighty matrix question.

This question format places similar questions in a single grid instead of presenting them one by one.

This helps to organise your survey conceptually, but it’s also easy to go too far with questions and response options.

 

I asked respondents the same 20 questions on an agree-disagree scale. Some saw only three response options, some saw five, and some saw seven.

 

Additionally, some saw questions with a maximum of five rows per page (meaning they had four pages for the 20 questions), some had ten rows (two pages with ten questions per page), and some had 20 rows (all 20 questions on one page).

 

By the survey's end, all respondents were asked to rate the survey’s layout and difficulty.

 

Even though everyone had the same questions, the layout affected some of the outcomes. In particular, with a smaller matrix of only five rows per page, compared with a larger matrix of 10 or 20 rows, fewer people dropped out and respondents rated the survey as easier to take. There were no differences in reliability or completion time.

 

Now, this will of course depend on the exact number of questions you need and the kind of response options you have, but your best bet is, again, to go small. Keep your matrix to no more than five rows per page, and your response options to the fewest needed to get reasonable answers. Most importantly, always think of the respondents’ experience when creating your survey.
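As a small illustration of the five-rows-per-page rule, here is a sketch that splits a list of matrix items into pages of at most five rows; the statement texts are placeholders rather than items from the study.

```python
# Illustrative sketch: paginate matrix rows, five per page.
def paginate(items, rows_per_page=5):
    """Yield successive pages of at most `rows_per_page` matrix rows."""
    for start in range(0, len(items), rows_per_page):
        yield items[start:start + rows_per_page]

questions = [f"Statement {i + 1}" for i in range(20)]
pages = list(paginate(questions))
print(len(pages))   # 4 pages of 5 rows each for 20 questions
print(pages[0])     # first page of five statements
```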

Should you require participants to complete every question?

You have designed your survey and you’re ready to launch—congrats!

 

But one nagging worry remains: I’ve put all this work into writing my survey, so what if people skip some of my questions?

 

Well, not so fast. According to researchers, there are some definite downsides to requiring answers. If you require every question in a survey, you might end up with fewer responses overall.

In addition to getting fewer overall responses, an even bigger potential problem with requiring questions is getting wrong responses.

Only require questions when necessary.

For example, it’s a good idea to make a question required when you need it to weight responses or to segment your data. The good news is that most people don’t skip questions even if they can.

AND THIS HAS BEEN TESTED!

We tested two different types of surveys (1 – all questions required, 2 – no questions required). And there were some stark differences.

The surveys where all questions were required received a lower response rate over time, whilst the surveys where respondents were free to answer or skip were completed more often. The required surveys also appeared to be answered incorrectly at times, because participants were forced to give an answer to every question.
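If you make questions optional and want to see how often respondents actually skip them, a quick check like the one below works, assuming answers sit in a pandas DataFrame where a skipped question is recorded as NaN; the column names and values are made up for illustration.

```python
# Illustrative sketch: per-question skip rates in an optional-answer survey.
import numpy as np
import pandas as pd

responses = pd.DataFrame({
    "q1": [4, 5, 3, np.nan, 4],
    "q2": [2, np.nan, 3, 3, 2],
    "q3": [5, 5, np.nan, 4, 4],
})

# Share of respondents who left each question blank.
skip_rate = responses.isna().mean()
print(skip_rate.round(2))
```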

 

The more someone cares about a survey, the more time and effort they will put into completing the survey.

 

An alternative is to require all questions but include a “Don’t know” or “Prefer not to answer” option.

 

Should you include agree/disagree survey scales?

 

Even in tough situations, people generally try to be positive and agreeable. While that’s often a virtue in life, it isn’t when it comes to surveys: it can introduce bias into your results.

And I am going to show you several ways agree/disagree questions can cause your respondents to answer in a way that doesn’t always reflect their true opinions.

 

There are two major types of problems that can arise from using agree/disagree questions:

 

Acquiescence bias: People tend to say they like things, to say “yes” to things, or to agree with things—even if they don’t feel that way.

 

Straight-lining: Another way agree/disagree questions cause problems is by making it easy to go through many questions in a survey and select the same answer every time without actually reading the options.
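Straight-lining is easy to spot in your data. The sketch below flags respondents who pick the identical option on every row of an agree/disagree grid; the grid values and column names are invented for illustration.

```python
# Illustrative sketch: flag straight-lining in an agree/disagree grid.
import pandas as pd

grid = pd.DataFrame({
    "item_1": [5, 3, 4, 4],
    "item_2": [5, 2, 4, 5],
    "item_3": [5, 4, 4, 3],
    "item_4": [5, 3, 4, 2],
})

# nunique(axis=1) == 1 means the respondent chose the same option on every item.
straight_liners = grid.nunique(axis=1) == 1
print(straight_liners)         # True for respondents 0 and 2
print(straight_liners.mean())  # share of straight-liners in the sample
```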

 

If your survey bores your respondents by asking them a series of agree/ disagree questions, this research demonstrates that they aren’t going to be particularly focused or conscientious when they’re responding. And what’s the point of asking a question if you know your respondents aren't going to try to answer it well?

 

Item-specific questions require a little more effort to answer than agree/ disagree questions do.

 

But don’t worry—the benefits outweigh the negatives. More effort means more concentration and more care while responding, which in turn leads to more well-considered responses and more accurate data.
