This article was published on November 8, 2011

Online Surveys: Why to use them and how to make them great

As 2011 draws to a close, many businesses are busy planning next year’s budgets, sales, marketing, operations plans, product lineups, etc. If you’re not the corporate type, chances are you’re at least giving some thought to what’s on tap for next year. And as we all know, social media can be a great way to garner a ton of useful information from your customers and/or fans; however, I always find it to be a bit “here and now” and rarely given the proper, long-term analysis it’s due. So, with only 53 days left in 2011, wouldn’t now be a good time to survey your audience and see what’s on their mind?

Creating a user survey can seem like a daunting task. With so many possible questions and data sets, it can be difficult to narrow down just what you want to know. The trick is to construct a survey that not only brings you valuable, actionable data when properly distilled, but is also presented in a manner that brings in the highest number of valid responses possible.

Where we goin’?

Before you begin crafting the actual survey, it’s time to play with the whiteboard. What information are you looking for from your customers?

Have you noticed a few users here and there using your product in an unexpected manner? You might have a trend on your hands here – ask your users about it!

Are you looking to test the waters on which of the features you’ve introduced have worked well (or not so well) with your audience? Or how about those feature requests that keep piling up? Perhaps it’s time to list them out in a survey and see what ranks at the top.

How satisfied are your customers, both in the short term (relative satisfaction) and the long term (loyalty measurement)?

In addition to the overall plan and the data you want from the survey, you’ll also want to consider frequency. With the above examples, you might want to measure loyalty and spot trends on an annual basis, whereas customer/user satisfaction could be measured twice a year. Obviously, you’re after as much actionable data as possible, but you also need to keep user fatigue in mind. Remember, your users are generally happy to provide feedback, but only if you ask them at the right time and in the right way.

Short and Sweet

Building upon the point above, resist the urge to throw in everything but the kitchen sink. A quick test for this is to ask yourself, “What am I going to do with the data gathered from this answer?” If you don’t have a solid action point behind it, leave it out.

If, during your testing phase (see below), it takes any respondent more than 10 minutes to complete your survey, then edit, edit, edit: consider shaving off a few questions and repackaging them into others, or saving them for a completely different survey altogether.

Don’t be afraid to let other surveys serve as an example. Think about the last online survey you completed. Did it take any longer than 5 minutes? I’d bet that if you can actually remember that survey, it was no longer than 5 minutes.

Finally, let your respondents know where they’re at. A number of popular online survey building tools already offer this as a built-in feature. Each page of your survey should show an “X out of Y” progress indicator, giving users a gauge of just how much of their time your survey is going to require.

Bonus: Time your test sessions, determine the average length of time it takes to complete, and build this into the survey title, e.g. Dan Taylor’s super-awesome 6-minute survey!

K.I.S.S.

One of the toughest challenges for survey drafters is keeping things simple enough that the largest swath of respondents will understand the questions, while not appearing condescending to power users.

Avoid industry jargon and overly complex questions. If you’re asking for a “Yes” or “No” answer format, read, re-read, and then re-read those questions again to ensure that your grandmother could provide an answer. If you’re after a write-in response, make sure that your target audience clearly understands the question.

Try to avoid too many questions aimed squarely at the power-user-and-up crowd. Sure, there are probably a few items on your list that will only apply to a certain percentage of your audience, but that doesn’t mean you need to test your nerd cred. Phrase these questions properly and you can gather the feedback you want from this group while, perhaps, making a few converts from “casual” to “power” user along the way.

CISSP, PMP, ROI, KPI. Uh-huh. There’s absolutely no need for acronyms within your user survey. Yes, there will be those who know them all, backwards and forwards, but that’s really a small percentage of your audience. If a user has to Google an acronym’s meaning…well, they’ve already opened a new browser tab. What’s to say they’re coming back?

Stay on Target

Remember the last bar-room conversation you had with someone you knew, but not that well? The one that started out great, but 15 minutes in you were wondering where exactly it was headed and what their point was? Yeah. Kinda like that.

Keep your survey on target throughout the question cycle. Repeat after me:

Thou shalt not ask open-ended questions that require specific answers

…and are subsequently difficult (and speculative) to analyze.

That’s not to say that opinions, thoughts, memory recall, etc. don’t have a place in your survey, but remember, you’re going to get exactly that: opinions. Ideally, your survey should contain Yes/No or multiple-choice answers to provide crystal-clear data during the analysis phase. Going beyond these two options, many popular online survey tools offer ranking and matrix formats, but keep in mind that the survey should still take no more than 10 minutes to complete (see Short and Sweet above).

Logical Progression

Books are organized into chapters that provide a logical progression of developments. Your email is ordered by date received, and depending on your settings, conversations are grouped in a chronological order, providing a natural flow of conversation in a written format. The same should be true for your online survey.

If you’re spotlighting a series of user features, start at the beginning and work your way through the feature set. Opening questions should be easy for respondents to answer, as well as interesting enough to get them to sink their teeth into the rest of the survey. Remember, as we’re working with a logical progression, introduce one main topic and follow through with all related questions before moving on to a second main topic. In this scenario, it’s helpful to use an individual page for each topic, with its subtopics listed below.

And since we’re discussing logic, don’t forget about using conditional branches and/or skip logic. In a nutshell, conditional branching will present users with a different “B” question based on their response to “A”. Skip logic will forward respondents to another part of the survey (“D”) or carry on to the next question “C” based on their response. There’s no better way to make a survey more relevant to the user than by employing a few of these methods.
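
To make this concrete, here’s a minimal sketch of how a question flow with conditional branching and skip logic might be modelled. It’s written in Python purely for illustration; the question IDs, wording, and answer options are hypothetical, and it isn’t tied to the feature set of any particular survey tool.

    # Hypothetical question flow: each answer maps to the id of the next question.
    # "*" is a catch-all branch, and None marks the end of the survey.
    QUESTIONS = {
        "A": {"text": "Do you use our mobile app?",
              "next": {"Yes": "B", "No": "D"}},  # branch on the answer to "A"
        "B": {"text": "Which feature do you use most?", "next": {"*": "C"}},
        "C": {"text": "How satisfied are you with it?", "next": {"*": "D"}},
        "D": {"text": "Any final comments?", "next": {"*": None}},
    }

    def next_question(current_id, answer):
        """Return the id of the next question to show, given the respondent's answer."""
        branches = QUESTIONS[current_id]["next"]
        return branches.get(answer, branches.get("*"))

    print(next_question("A", "Yes"))  # -> B: app users get the app questions
    print(next_question("A", "No"))   # -> D: everyone else skips straight to the end

The point is simply that each answer determines which question (if any) the respondent sees next; the branching and skip-logic features in popular survey tools let you set up exactly this kind of mapping without writing any code.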

Last, but not least, marketers love all kinds of demographic information. However, hold these questions until the very end of your survey. As with any handing over of information, trust needs to be established, and users are far more likely to hand over their age, income, etc. if they believe in your survey and understand why you’re asking. Putting these questions too close to the beginning of your survey is akin to asking for someone’s phone number before buying them a drink.

Design Matters

No, you do not need a professionally styled survey that will win multiple design awards. However, you do need to keep a few items in mind when it comes to the look and feel of your survey.

  • Ease their mind. Humans are, by nature, infinitely inquisitive. If you’re going to conduct an online survey, be upfront with your respondents and let them know what you’re going to use the data for. Do not introduce leading statements here; simply explain how you will use the data. For example, there’s a big difference between “We’re looking forward to hearing how much you enjoy our product” and “We’ll be using this data to provide you with a better experience.” This intro page is also a good time to inform respondents approximately how long the survey should take to complete.
  • Consistency. This should go without saying, but if you’ve set your ranking system to 1 – 5, 5 being the best, make sure that you’re sticking to the same scale across the board. Do not offer users a 1 – 5 on question number 3, but 1 – 10 on questions 7, 8, and 9.
  • Maximize the data. If you’re asking multiple-choice questions, or even Yes/No, favor radio buttons over drop-down menus. Radio buttons are far easier to read, whereas with a drop-down a twitchy mouse finger can easily click or drag to the wrong answer while the other options are quickly hidden from view.

It’s also a good idea to offer “Not Applicable” or “Do Not Know” buttons, as these provide further information about a specific question and give you data rather than a field left blank. Remember, any and all data can be analyzed; an empty field is a wasted field.

Test it!

Once you’ve defined what you’d like to achieve from the survey and formulated your short, specific questions in a logical, engaging, and visually pleasing manner (phew…got all that?), it’s time to test it.

As with anything that goes before the public, a dedicated round of testing needs to be carried out. A two-step approach to publishing your survey should include:

  • A small internal team. This will ensure that you’re not missing any essential bits of information that internal teams are interested in. It also helps with the readability and phrasing of your questions. With that said…see K.I.S.S. above.
  • A test sample. Got some active voices in your support forum? How about a group of your top customers? These are ideal groups to pilot your survey with first, as they’re your most active users, good or bad. Be sure to get permission from these individuals before bombarding them with a survey; their responses will help you further fine-tune your questions.

After either (or both) of these groups have completed the survey, consider having a quick chat (for internal teams) or a light email exchange with the sample group. Specific things you should be looking for:

  • Do respondents understand the point of the survey?
  • Is the wording/phrasing of the survey clear?
  • Were there any questions that could not easily or readily be answered?
  • Are the answers collected relevant to the data you’re seeking?
  • Are the answers collected diverse enough?
  • Does the survey take too long?
  • Are there any key issues missed?

I cannot stress enough how important the testing phase is to producing a winning survey. Those working closely on the survey project often cannot see the forest for the trees, so having an outside audience is crucial to ensuring that you’re asking the right questions, in just the right way.

Prime the pump

OK. You’ve done all the major steps in getting your survey in order. You’ve done multiple in-house and small group tests, refined your questions, and have your sights set on what information you’d like to acquire and what you’ll do with it. So now what?

Now is the time to determine an acceptable response rate. By definition, a response rate is the number of completed surveys divided by the number of participants invited. Various resources suggest that a “good” online survey response rate sits anywhere between 25% and 30%.
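
As a quick worked example of that definition (the numbers below are made up purely for illustration):

    # Response rate = completed surveys / participants invited
    invited = 1200    # hypothetical number of invitations sent
    completed = 340   # hypothetical number of completed surveys

    response_rate = completed / invited
    print("Response rate: {:.1%}".format(response_rate))  # -> Response rate: 28.3%

A result like that would sit comfortably inside the 25%–30% range mentioned above.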

But that’s a lot of work to only get a 30% response, isn’t it? So what can you do to up these numbers?

  • Let your users know about the survey long before it goes live – This informs users and lets them work it into their schedule, not yours
  • Two week maximum – Give your survey defined open and close dates. SuperSurvey found (PDF) that half of their respondents replied within one day, and 96.5% replied within two weeks.
  • Ease of use – Covered above, but the simpler and easier you can make your survey, while still capturing the data you require, the higher your response rate will be.
  • Remind them – We’re all running at 1,000 mph and don’t always remember everything we’d like to. At the end of the first week, send your users a reminder that your survey is underway. One day before the survey closes, send a final reminder. If you have a way to track which users have responded and which haven’t, filter out those who already have (see the sketch after this list).
  • Entice them – C’mon, admit it…we all love free stuff (or even the chance to win free stuff). Your users are no different. Want to know about their Android preferences? Hook them up with the latest tablet. Want to learn about their general thoughts? Amazon is your friend. Don’t want to spend out-of-pocket? Who ever said no to a product discount?
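
On the “Remind them” point above, here’s a minimal sketch of the filtering step – again in Python and purely illustrative, with made-up addresses – showing the idea of sending reminders only to invitees who haven’t yet responded:

    # Hypothetical invitee and respondent lists; in practice these would come
    # from your survey tool's export or your mailing list.
    invited = {"ana@example.com", "ben@example.com", "cho@example.com", "dee@example.com"}
    responded = {"ben@example.com", "dee@example.com"}

    still_pending = invited - responded  # set difference: only non-respondents

    for email in sorted(still_pending):
        print("Send reminder to", email)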

Bonus: Share it!

Now that your users have made their way through a short and sweet survey and picked up their ‘thank you’, what’s next? Well, there’s a whole lot of analysis to do on your end, and I’m no statistician, so I won’t go into the technicalities of what you should be doing with the data you’re collecting. But you should think about how and what you’re going to share with your users. Obviously, some data should be guarded, but there are a number of things that you could, and should, share with them.

If your survey asked for feature requests, put them on display and let your users know how and what you’ll be doing over the next couple of months. If you asked about mobile usage habits, platforms, etc., let your users know that you’re looking for an Android developer and that they should expect a beta in the near future.

If you need or want further information about specific topics, this is a great time to ask. If you came up short on iOS habits, share your Android data and ask users if they wouldn’t mind filling in a follow-up survey.

Warning: Treat this with extreme caution, as you’re now asking users for another action.

One of the fundamentals of online and social media marketing is having a conversation with your audience. Often overlooked, the annual, bi-annual, etc. survey can be a great way to (semi) formalize that conversation and ask your users directly what they think and where they’d like to see your organization go. Apply these rules to your next user survey and you’ll not only maximize your return, but also open up a whole new avenue of back-and-forth between you and your users.
