Without years of hands-on experience in survey design, it’s easy to overlook the elements that determine a survey’s success or failure. You create your questionnaire, send it out to what seems like the right audience, and wait for the magic to happen.

But it doesn’t always work out that way. You may find yourself realizing your survey has left you with more questions than answers (pun intended).

You start asking yourself:

“What am I doing wrong?”

“Why can’t I encourage my audience to open up?”

“Why did the answers create confusion?”


These are all valid questions, and we’re going to help you tackle the challenges that are behind each and every one.

 

How to approach survey design

Ever heard of Ben Horowitz’s famous silver and lead bullet story? Well, it’s as relevant to survey design as it was to the famous Netscape vs. Microsoft server battle.

While there’s no magical silver bullet that will make your feedback collection effortless, there is a silver lining:

The wealth of wisdom you can learn from.

In the checklist below, we’re going to provide you with a solid handful of lead bullets for the most important strategic areas of survey design. We’ll walk you through the pros and cons behind the most important feedback collection decisions.

So, cross off all the elements on the list and see your survey quality rise exponentially!


Survey Design Checklist:

 

Section 1: Survey Question Design

Should I ask open-ended or closed-ended questions?

Oftentimes, novice survey creators are reluctant to ask open-ended questions. Why would you make your respondents answer in their own words when you can present them with choices you can neatly analyze and visualize?

There are a number of reasons.

While making sense of descriptive feedback is a whole other process, it offers you more detail than closed-ended questions do. Your audience can respond outside the box, and maybe even share their pain points or admiration in their own emotional terms.


So how does this fit into your product/service?

  • Would triggering an emotional, descriptive response bring you more value than a numerical or categorical question would?
  • Do you want to hear all about how your customer rates his/her stay at your countryside hotel? Or do you just want a “yes/no” or 1-10 answer?

 

Sometimes closed-ended questions can leave your respondents feeling unfulfilled, as they’re not being given space to provide those “it all depends”, descriptive answers.

 

Puzzled man sitting in front of a questionnaire

 

Some user researchers even suggest running your first couple of surveys with open-ended questions only.

Why?

Susan Farrell, UX Research and Design Strategist, suggests that open-ended questions will let you in on your audience’s recurring answers and feedback patterns. These, in turn, can serve as future answer lists for closed-ended questions, which are easy to quantify.

What about the case for using closed-ended questions?

On the other hand, if your audience is business-oriented and comes to your service/product with a straightforward jobs-to-be-done (JTBD) approach, open-ended questions might exhaust them and backfire if used without moderation.

In the words of Steve Krug, they might expect that your survey aligns with a “don’t make me think” mindset. Sometimes, people want a solution that requires as little time or complexity as possible.

If your audience’s goals or your marketing/user personas point towards the latter, use open-ended questions sparingly.

That being said, the choice between open-ended and closed-ended questions isn’t binary. You can also go for a mixed strategy and use open-ended questions for follow-ups or comments (which we speak more about in the next section).

Which brings us to another important issue…

 

How do I make sure I’m not suggesting any answers?

Regardless of the type of question you choose, it is crucial that you make sure you’re not leading your respondents to answer in a certain way. An erroneously designed question may, for instance, suggest what answer is expected of them or what your attitude towards a given issue is. This is a common survey design trap you might fall into if you want to validate your hypothesis a little too much.

Amy Schade, former Director of UX at Nielsen Norman Group, argues that leading questions put survey participants in an awkward spot when they want to express an opinion other than the one that is evidently expected.

So what is a non-leading question like?


Here’s an example:

 

Leading: Do you think our new logo would look good in green?

Leading: Do you think our new logo should be red or green?

Non-leading: What color would you find most fitting for our new logo?

 

Quite simple once we put it this way, right?

For more information on what it takes to formulate non-leading questions, click here.

 

Let’s carry on to…

 

Should I ask follow-up questions?

Short answer? Yes. Let us quickly convince you why (that is, if you’re hesitant).

If you’re looking to send out a CSAT survey or NPS survey, then determining the percentage of your dissatisfied respondents or detractors might not be an actionable enough insight on its own.

Let’s say the results are in and you’re presented with a prominent number of detractors.

Do you already know why they’re this unhappy?

Have you absolutely, thoroughly, whole-heartedly considered where the problem lies?

Unless you know their motivation, you couldn’t have.

 

With a tool like Survicate, displaying live follow-up questions is extremely easy. Our skip logic allows you to wrap up the very same survey for your promoters while setting relevant follow-up questions or calls to action for passives and detractors. All so you can quickly understand how to undo any harm or delight your hesitant audience.
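To make the skip logic idea more concrete, here is a minimal sketch in TypeScript of how branching on an NPS answer can work. The function names and follow-up wording are hypothetical illustrations, not Survicate’s actual API; the promoter/passive/detractor thresholds are the standard NPS buckets.

    // Standard NPS buckets: 9-10 promoters, 7-8 passives, 0-6 detractors.
    type NpsSegment = "promoter" | "passive" | "detractor";

    function classifyNps(score: number): NpsSegment {
      if (score >= 9) return "promoter";
      if (score >= 7) return "passive";
      return "detractor";
    }

    // Hypothetical skip logic: promoters finish the survey right away,
    // while passives and detractors see a relevant follow-up question.
    function nextStep(score: number): { done: boolean; followUp?: string } {
      const segment = classifyNps(score);
      if (segment === "promoter") {
        return { done: true }; // thank-you screen, the survey ends here
      }
      if (segment === "passive") {
        return { done: false, followUp: "What would make your experience a 10?" };
      }
      return { done: false, followUp: "What disappointed you, and how can we fix it?" };
    }

The point of the branch is simply that the survey ends early for the people you already understand and digs deeper exactly where an explanation is most valuable.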

 

Survey design hack – adding follow-up questions to your survey

So, to sum up:

Don’t leave addressing your unhappy audience for a whole other round of questioning.

Wouldn’t you want to be given an outlet for your opinion the first time around?

Follow up and react fast. Your audience expects you to.


Speaking of determining the right questions for the right people, let’s carry on to:

 

Section 2: Segmenting your audience / determining survey sample size

Should I send my survey to everyone or just a segment of my audience?


Not so long ago, pollsters and opinion centers had limited outreach options and relied mostly on modestly-sized samples of respondents they found representative of the population.

As we all know, collecting feedback from thousands of people no longer sounds like an excerpt from a sci-fi novel. But that doesn’t mean that sample-oriented surveys are put out to pasture.

So how and where are they applicable?

Let’s focus on two elements that somewhat overlap, but are not to be confused with one another.

Firstly, as a customer opinion researcher, you should put in the effort to segment your users and prequalify them as future respondents. For instance, Survicate is compatible with an abundance of CRM and marketing automation tools that can provide multiple user tags and attributes for quick customer segmentation.

This, in turn, allows you to address bigger or smaller chunks of users to get only the most relevant answers.

Secondly, you can also take advantage of decades-old survey sampling techniques to complement your widely targeted questionnaires. Depending on your needs, you can derive a sample either from the entire audience or from a chosen segment (for example, a group of active users).
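As a rough sketch of what segmentation plus sampling can look like in practice, the TypeScript below filters a user list down to a segment using CRM-style attributes and then draws a random sample from it. The attribute names are invented for illustration, and the shuffle is a plain Fisher-Yates; your CRM or survey tool will have its own equivalents.

    interface User {
      id: string;
      email: string;
      attributes: Record<string, string>; // e.g. tags synced from a CRM
    }

    // Keep only the segment we care about, e.g. active users on a paid plan.
    // The attribute keys and values here are hypothetical examples.
    function activePaidSegment(users: User[]): User[] {
      return users.filter(
        (u) => u.attributes["status"] === "active" && u.attributes["plan"] === "paid"
      );
    }

    // Draw a random sample of the requested size (Fisher-Yates shuffle, then slice).
    function sample<T>(items: T[], size: number): T[] {
      const copy = [...items];
      for (let i = copy.length - 1; i > 0; i--) {
        const j = Math.floor(Math.random() * (i + 1));
        [copy[i], copy[j]] = [copy[j], copy[i]];
      }
      return copy.slice(0, Math.min(size, copy.length));
    }

    // Usage: survey a random sample of 300 active paid users.
    // const recipients = sample(activePaidSegment(allUsers), 300);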

 

Let’s focus on sample sizes a bit more.

What are some good use cases?

You may find it especially effective if you’re aiming to get to know your audience better – for instance, if you want to create or validate your marketing personas. This will prove useful for insights such as your users’ age, gender, or urban/rural lifestyle, to name a few.

You’ll also feel comfortable with this approach if you worry about over-surveying your audience. You can reach out to your users frequently without asking the same people several times in a row.

When is it not recommended?

If you’re planning to ask your audience for crucial feedback that shapes your product/service goals or public perception, you want to ask as many people as applicable – especially for data such as NPS, where the bigger the number of respondents, the more accurate your action plan will be.
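If you are wondering how big a sample actually needs to be, the usual starting point is Cochran’s formula with a finite-population correction. The sketch below assumes the common defaults of a 95% confidence level (z ≈ 1.96), a ±5% margin of error, and maximum variability (p = 0.5); swap in your own audience size and tolerances.

    // Cochran's sample size formula with a finite-population correction.
    // z: z-score for the confidence level (1.96 for 95%)
    // p: expected proportion (0.5 is the most conservative choice)
    // e: margin of error (0.05 = ±5%)
    function requiredSampleSize(population: number, z = 1.96, p = 0.5, e = 0.05): number {
      const n0 = (z * z * p * (1 - p)) / (e * e); // ≈ 384 with the defaults
      const n = n0 / (1 + (n0 - 1) / population); // shrink for a finite audience
      return Math.ceil(n);
    }

    // Example: an audience of 2,000 users needs roughly 323 responses
    // for a ±5% margin of error at 95% confidence.
    // requiredSampleSize(2000) === 323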

Which brings us to…

 

Section 3: Survey relevance

How do I make sure my surveys are well-received?

Preventing survey fatigue has a lot to do with proper determination of your target audience and frequency of contact. But that’s not all.

Your respondents can feel overwhelmed if you ask questions that are too lengthy (i.e. both hard to read and exhausting) or too intrusive.

Naturally, sometimes long questions are a necessity and you can’t cut them down if you want to stay accurate. But all other times, we recommend you hit that backspace button as much as it takes – especially if the survey is longer than one question.

In all honesty, which question is more convenient to respond to?

An example of a short survey question

An example of a survey question that is too long

In addition to keeping things short, if you use survey targeting techniques well (as described in the previous point), you won’t even have to go into much detail to provide the respondent with context, the way the lengthy second example does.

Last, but not least, think of potential eye-strain and the previously mentioned “don’t make me think” advice. Be quick, get to the point, and encourage your respondents to speak.

 

How do I make sure my audience feels listened to?

It’s one thing to fight for a high survey response rate in your initial survey design efforts. It’s a whole other thing to keep response rates steady after you’ve already been reaching out to users for a significant time.

See a decline in answers? Or maybe your audience no longer responds to open-ended questions or abandons surveys halfway?

One of the reasons why your respondents stop answering your questions, or stop doing so thoroughly, is that they don’t see the point.

Think about it.

Let’s say a concert venue sends out a “Help us make Venue X a better place!” survey. Respondents spend 10-15 minutes telling them to finally fix the AC and hinting that the beer tastes terrible. Then they go to another concert and once again end up sweaty, quenching their thirst with the same bland drink. They have every right to be annoyed. And since they feel ignored and robbed of their time, they start ignoring your requests, too.

Here’s where you might also ask yourself:

 

Should I offer anything in return for completing the survey?

This, in itself, is a topic that deserves a separate post, as the potential pros and cons of offering incentives depend on various factors. Still, there are some key points that will let you see whether it’s worth considering.

The big question is:

What makes your audience feel rewarded?

Will your respondents be satisfied by simply seeing product/service changes inspired by their responses, or would they appreciate a more material gift?

For example, “Head-of-Marketing Stacy” may or may not care about incentives from SaaS companies at work, but appreciates those offered by her favorite Asian cosmetics store where she shops after hours.

So, here are some things to consider:

  • what does your business offer/sell (for example, fast-moving consumer goods vs. business software)?
  • is your audience made up of private individuals or enterprises?
  • what do your users’ goals or marketing personas say about how they use your product? Would giving them a freebie boost your sales?
  • would your service/product benefit from pairing up with a partner? For example, your audience could receive a 10% discount or trial of a service that nicely complements your own.

All of the above will influence your audience’s reaction to survey incentives.

Want to hear a pro hint?

If you’re still unsure, take that feedback incentive for a test drive and send out a survey to see how many users are lured by the offer (needless to say, even though it’s just an experiment, only promise something you can actually deliver to those who respond).

You can also perform A/B testing with and without an incentive and see which works better. The options are there, so take a minute to consider whether it’s worth a try.
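For the statistically inclined, one simple way to judge such an A/B test is a two-proportion z-test on the response rates. The sketch below is a generic textbook calculation rather than a feature of any particular survey tool; as a rule of thumb, a |z| above roughly 1.96 means the difference is significant at the 95% level.

    // Two-proportion z-test: did the incentive variant get a significantly
    // higher response rate than the control?
    function responseRateZ(
      sentA: number, respondedA: number, // control (no incentive)
      sentB: number, respondedB: number  // variant (with incentive)
    ): number {
      const pA = respondedA / sentA;
      const pB = respondedB / sentB;
      const pooled = (respondedA + respondedB) / (sentA + sentB);
      const se = Math.sqrt(pooled * (1 - pooled) * (1 / sentA + 1 / sentB));
      return (pB - pA) / se;
    }

    // Example: 18% vs. 24% response rate on 500 recipients each.
    // const z = responseRateZ(500, 90, 500, 120); // ≈ 2.33, significant at 95%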

 

Section 4: Increasing open and survey response rate

Once you’ve decided whom, how, and where to target to raise survey accuracy, you can also work on raising your audience’s interest and establishing your brand voice through feedback. Here are two tactics you may want to consider:

Embedding the first question in an email/website/app

In the fast-paced digital world, users’ attention is being fought for by hundreds of companies, not to mention social media. If a member of your audience gets around to opening an email from you or visits your site, you should act fast.

Plus, if you don’t include a sneak peek of what the email/message is about and simply ask the recipient to click a link, you risk a low survey response rate.

Here’s what an embedded question looks like:

An example of effective survey design. Image of email survey and website survey with first question asked in the body.

As you can see, the recipient immediately sees what the message is about and can act upon it in an instant. It’s highly recommended for your most crucial questions.

Only once they respond to the first question are they taken to the rest of the survey.
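Under the hood, an embedded email question is usually just a set of answer choices rendered as links, with the chosen answer encoded in the URL so it can be recorded the moment the recipient clicks. The snippet below is a generic illustration of that pattern; the domain, path, and query parameters are made up for the example and are not Survicate’s actual format.

    // Render each answer option as a link that records the choice on click
    // and drops the recipient into the rest of the survey.
    // The URL and query parameters below are hypothetical.
    function embeddedQuestionHtml(surveyId: string, recipientId: string): string {
      const question = "How satisfied are you with your last order?";
      const options = ["Very satisfied", "Somewhat satisfied", "Not satisfied"];
      const links = options.map((answer) => {
        const url =
          `https://surveys.example.com/${surveyId}` +
          `?recipient=${encodeURIComponent(recipientId)}` +
          `&q1=${encodeURIComponent(answer)}`;
        return `<a href="${url}">${answer}</a>`;
      });
      return `<p>${question}</p>${links.join("<br/>")}`;
    }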

An insider tip?

With Survicate, answers are saved for each question. Worst case scenario, even if your respondent answers only the embedded question and discards the rest, you’ll be one step ahead in learning his/her opinion.

 

[Extra credit] User-friendly email subject line

Small confession on my side… I’ve got a little crush on InVision (OK, more like a big-time crush).

There are several reasons for my admiration, one of which is their killer email subject lines for product newsletters.

You see, a while back, InVision started rehashing famous pop music lyrics that relate to respective newsletter topics.

Just take a look (and try not to smile):

Inspiration from InVision email subject lines for survey design best practices

Point being, brands often miss the opportunities hidden behind seemingly small things.

Getting your email surveys noticed (or, if you play it out right, making your audience await each incoming message) is definitely something you’d like to achieve. Just think of the response rates!

 

The action point here is:

Does your brand have a cohesive voice? How can you extend it to the way you ask questions and create survey subject lines?

Don’t worry if your brand voice has not yet been established – here are some effective subject line hacks that will put you on the right track.

And last, but not least…

 

Section 5: Good visual design

Looks matter as much in survey design as they do in other areas of the digital ecosystem. If you want your audience to respond positively to your website, mobile, or email surveys, you need to make sure they’re interactive and visually pleasing.

This is where you can go a long way with Survicate. You can choose from our predefined themes, use our built-in customization tool to set colors that are cohesive with your branding within seconds, or take things a step further with custom CSS.

Here are just a couple of directions you can go with:

 

Types of targeted website survey design

Regardless of the tool you currently use, make sure your surveys look great and work like a charm.

Mastering survey design

Hopefully, you were able to cross off each element on our list of survey design best practices, and your feedback strategy now seems much clearer and more organized.

Now, time to put things into play!

Try Survicate as a tool that lives and breathes survey design. You’ll love our effective, proven survey templates and other features that will take your feedback efforts to a whole new level.