
IT terminology applied to surveys

James Murray is principal of Seattle IT Edge, a strategic consultancy that melds the technology of IT with the business issues that drive IT solutions. When James gave me a list of things that are central for IT professionals, I thought it might be fun (and hopefully useful) to connect these terms with online surveys for market research.

[Warning: if you are a technical type interested in surveys, you might find this interesting. But if you aren’t in that category, I won’t be offended if you stop reading.]

Scalability

The obvious interpretation of scalability for IT applies to online surveys too. Make sure the survey tool you use is capable of handling the current and predicted usage for your online surveys.

  • If you use a SaaS service, such as SurveyGizmo or QuestionPro, does your subscription level allow you to collect enough completed surveys? This isn’t likely to be an issue if you host your own surveys (perhaps with an open-source tool like LimeSurvey) as you’ll have your own database.
  • Do you have enough bandwidth to deliver the survey pages, including any images, audio or video that you need? Bandwidth may be more of an issue with self-hosted surveys. Bandwidth might fit more into availability, but in any case think about how your needs may change and whether that would impact your choice of tools.
  • How many invitations can you send out? This applies when you use a list (perhaps a customer list or CRM database), but isn’t going to matter when you use an online panel or other invitation method. There are benefits to sending invitations through a survey service (including easy tracking for reminders), but there may be a limit on the number of invitations you can send out per month, depending on your subscription level. You can use a separate mailing service (iContact for example), and some are closely integrated with the survey tool. Perhaps the owner of the customer list wants to send out the invitations, in which case the volume is their concern but you’ll have to worry about integration. Most market researchers should be concentrating on the survey, so setting up their own mail server isn’t the right approach; leave it to the specialists to worry about blacklisting and SPF records.
  • Do you have enough staff (in your company or your vendors) to build and support your surveys? That’s one reason why 5 Circles Research uses survey services for most of our work. Dedicated (in both senses) support teams make sure we can deliver on time, and we know that they’ll increase staff as needed.

Perhaps it’s a stretch, but I’d also like to mention scales for research. Should you use a 5-point, 7-point, 10-point or 11-point scale? Are the scales fully anchored (definitely disagree, somewhat disagree, neutral, somewhat agree, definitely agree)? Or do you anchor just the end points? IT professionals are numbers-oriented, so this is just a reminder to consider your scales. There is plenty of literature on the topic, but few definitive answers.

Usability

Usability is a hot topic for online surveys right now. Researchers agree that making surveys clear and engaging is beneficial to gathering good data that supports high quality insights. However, there isn’t all that much agreement on some of the newer approaches. This is a huge area, so here are just a few points for consideration:

  • Shorter surveys are (almost) always better. The longer a survey takes, the less likely it is to yield good results. People drop out before the end, or give less thoughtful responses (lie?) just to get through the survey. The only reason for the “almost” qualifier is that sometimes survey administrators have to send out multiple surveys because they didn’t include some key questions originally. More often the problem is the reverse: the survey is overloaded with extra questions that aren’t relevant to the study.
  • Be respectful of the survey taker. Explain what the survey is all about, and why they are helping you. Tell them how long it will take – really! Give them context for where they are, both in the form of textual cues and, if possible, progress bars (but watch out for confusing progress bars that don’t reflect reality). Use survey logic and piping to simplify and shorten the survey; if someone says they aren’t using Windows, they probably shouldn’t see questions about System Restore (see the sketch after this list).
  • Take enough time to develop and test questions that are appropriate for the audience and the topic. This isn’t just a matter of using survey logic, but of writing the questionnaire correctly in the first place. Although online survey data collection is faster than telephone, it takes longer to develop and test the questionnaire.
  • Gamification of surveys is much talked about, but not usually done well. For a practical, business-oriented survey taker, questions that aren’t straightforward may be a deterrent. On the other hand, a gaming audience may greatly appreciate a survey that appears more attuned to them. Some research is even being conducted within games themselves, but that’s beyond the scope of this article.
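
To make the skip logic and piping points concrete, here is a minimal sketch in Python. The questions and the ask() helper are hypothetical stand-ins for whatever your survey tool provides, not a real API:

    def ask(question):
        # Stand-in for however the survey tool displays a question.
        return input(question + " ")

    def run_survey():
        answers = {}
        answers["os"] = ask("Which operating system do you use? (Windows/Mac/Linux)")
        # Skip logic: only Windows users see the System Restore question.
        if answers["os"].strip().lower() == "windows":
            answers["restore"] = ask("Have you used System Restore? (yes/no)")
        # Piping: reuse the earlier answer in a later question's text.
        answers["satisfaction"] = ask(
            "How satisfied are you with " + answers["os"] + " overall? (1-5)")
        return answers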

Reliability

One aspect of reliability is uptime of the server hosting the survey tool. Perhaps more relevant to survey research are matters related to survey and questionnaire design:

  • Representativeness of the sample within the target population is important for quality results, but the target depends on the purpose of the research. If you want to find out whether a new version of your product will appeal to a new set of prospects, you can’t just survey existing customers. A well-managed online panel sample is generally regarded as reasonably representative of the broader market.
  • How you invite people to take the survey also affects how representative the sample is. Self selection bias is a common issue; an invitation posted on the website is unlikely to work well for a general survey, but may have some value if you just need to hear from those with problems. Survey invitations via email are generally more representative, but poor writing can destroy the benefit.
  • As well as who you include and how you invite them, the number of participants is important. Assuming other requirements are met, a sample of 400 yields results that are within ±5% at 95% reliability. The confidence interval (±5%) means that the results from the sample will be within that range of the true population’s results. For the numerically oriented, that’s a worst-case number, true for a midpoint response; statistical testing takes this into account. The reliability number (95%) means that the results will fall within the confidence interval 19 times out of 20. You can adjust the sample size, or accept different levels of confidence and reliability. For example, a business survey may use a sample of 200 (for cost reasons) that yields results within ±7% at 95% reliability. The arithmetic behind these figures is sketched after this list.
  • Another aspect of reliability comes from the questionnaire design. This is a deep and complex subject, but for now let’s keep it high-level. Make sure that the question text reflects the objective of the question, and that the options are mutually exclusive, express single thoughts, and are exhaustive (with ‘don’t know’, ‘none of the above’, or ‘other/specify’ as appropriate).
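
For those who want to check the arithmetic, here is a minimal Python sketch of the worst-case margin of error calculation behind those figures; p = 0.5 is the midpoint response mentioned above, and 1.96 is the standard z-value for 95% confidence:

    import math

    def margin_of_error(n, z=1.96, p=0.5):
        # Worst-case margin of error for a proportion from a sample of n.
        # p = 0.5 maximizes p*(1-p); z = 1.96 corresponds to 95% confidence.
        return z * math.sqrt(p * (1 - p) / n)

    print("n=400: +/-{:.1%}".format(margin_of_error(400)))  # about +/-5%
    print("n=200: +/-{:.1%}".format(margin_of_error(200)))  # about +/-7%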

Security

Considerations for survey security are similar to those for general IT security, with a couple of extra twists.

  • Is your data secure on the server? Does your provider (or do you if you are hosting your own surveys) take appropriate precautions to make sure that the data is backed up properly and is guarded against being hacked into?
  • Does the connection between the survey taker and the survey tool need to be protected? Most surveys use HTTP, but SSL capabilities are available for most survey tools.
  • Are you taking appropriate measures to minimize survey fraud (ballot stuffing)? What’s needed varies with the type of survey and invitation process, but can include cookies, personalized invitations, and password protection (see the token sketch after this list).
  • Are you handling the data properly once it is exported from the survey tool? You need to be concerned with the overall data in the same way that the survey tool vendor is. But you also need to look after personally identifiable information (PII) if you are capturing any. You may have PII from the customer list you used for invitations, or you may be asking for this information for a sweepstakes. If the survey is for research purposes, ethical standards require that this private information is not misused. ESOMAR’s policy is simple: market researchers shall never allow personal data they collect in a market research project to be used for any purpose other than market research. This typically means eliminating these fields from the file supplied to the client. If the project has a dual purpose, and the survey taker is offered the opportunity for follow-up, this fact must be made clear.
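
As one illustration of the personalized-invitation idea, here is a minimal Python sketch that issues a single-use token per invitee. The in-memory storage and the integration with the survey link are assumptions for illustration, not a description of any particular survey service:

    import secrets

    # A hard-to-guess, single-use token per invitee guards against ballot
    # stuffing. A real deployment would persist tokens in a database and
    # embed them in each survey invitation link (both assumed here).

    def issue_tokens(email_list):
        return {secrets.token_urlsafe(16): email for email in email_list}

    def redeem(outstanding, token):
        # Accept a response only if the token is still outstanding,
        # then remove it so the same link can't be used twice.
        return outstanding.pop(token, None) is not None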

Availability

No longer being involved in engineering, I’d have to scratch my head for the distinction between availability and reliability. But as this is about IT terms as they apply to surveys, let’s just consider making surveys available to the people you want to survey.

  • Be careful about question types that may work well on one platform and not another, or may not be consistently understood by the audience. For example, drag and drop ranking questions look good and have a little extra zing, but are problematic on smartphones. Do you tell the survey taker to try again from a different platform (assuming your tool detects the device properly), or use a simpler question type? This issue also relates to accessibility (Section 508 of the Rehabilitation Act, or the British Disability Discrimination Act). Can a screen reader deal with the question types?
  • Regardless of question types, it is probably important to make sure that your survey is going to look reasonable on different devices and browsers. More and more surveys are being filled out on smartphones and iPads. Take care with fancier look and feel elements that aren’t interoperable across browsers. These days you probably don’t have to worry too much about people who don’t have JavaScript available or turned on, but Flash could still be an issue. For most of the surveys we run, Flash video isn’t needed, and in any case isn’t widely supported on mobile devices. HTML5 or other alternatives are becoming more commonly used.
  • Beyond making web surveys accessible from compatible mobile devices, consider other approaches to surveying. I’m not a proponent of SMS surveys; they are too limited, need multiple transactions, and may cost the survey taker money. But downloaded surveys on an iPad or smartphone have their place in situations where the survey taker isn’t connected to the internet.

I hope that these pointers are meaningful for the IT professional, even with the liberties I’ve taken. There is plenty of information out there but, as you can tell, just as in the IT world there are reasons to get help from a research professional. Let me know what you think!

Idiosyncratically,

Mike Pritchard

Filed Under: Fun, Methodology, Surveys, SurveyTip


Survey Tip: Pay Attention to the Details

Why survey creators need to pay more attention to the details of wording, question types and other matters that affect not only the results but also how customers view the company. A recent survey from Sage Software had quite a few issues, and gives me the opportunity to share some pointers.

The survey was a follow-up satisfaction survey after some time using a new version of ACT!. Call me a dinosaur, but after experiments with various online services, I still prefer a standalone CRM. Still, this post isn’t really about ACT! – I’m just giving a little background to set the stage.

  • The survey title is ACT! Pro 2012 Customer Satisfaction Survey. Yet one of the questions asks the survey taker to compare ACT 2011 with previous versions. How dumb does this look?
    Image: Survey title doesn't match question
  • This same question has a text box for additional comments. The box is too small to be of much use, but also the box can’t be filled with text. All the text boxes in the survey have the same problem.
    Image: Comment boxes should be big enough
  • If you have a question that should be multiple choice, set it up correctly.
    Image: Use multiple choice properly
    Some survey tools may use radio buttons for multiple choice (not a good idea), but this isn’t one of them. This question should either be reworded along the lines of “Which of these is the most important social networking site you use?”, or – probably better – changed to a multiple choice question type.
  • Keep up to date.
    Image: Keep up to date with versions
    What happened to Quickbooks 2008, or more recent versions? It would have been better to simply have Quickbooks as an option (none of the other products had versions). If the version of Quickbooks was important (I know that integration with Quickbooks is a focus for Sage), then a follow-up question asking for the date/version would work, and would make the main question shorter.
  • There were a couple of questions about importance and performance for various features. I could nitpick the importance question (more explanation of the features, or an option something like “I don’t know what this is”, would have been nice), but my real issue is with the performance question. 20 different features were included in both importance and performance. That’s a lot to keep in mind, so it’s good to make the survey taker’s life easier by keeping the order consistent between importance and performance. The problem was that the order of the performance list didn’t match the first. I thought at first that the two lists were randomized separately, instead of randomizing the first list and using the same order for the second (sketched after this list). This is a common mistake, and sometimes the survey software doesn’t support doing it the right way. But after trying the survey again, I discovered that both lists were in fixed orders, different between importance and performance. Be consistent. Note: if your scales are short enough, and if you don’t have a problem with the survey taker adjusting their responses as they think about performance and importance together (a topic of debate among researchers), you might consider showing importance and performance together for each option.
  • Keep up to date – really! The survey asked whether I used a mobile computing device such as a smartphone. But the next question asked about the operating system for the smartphone without including Android. Unbelievable!
    Image: Why not include Android in smart phone OS list?
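
Here is a minimal Python sketch of the consistent approach, assuming your tool (or custom code) lets you control option order; the feature names are invented for illustration. Shuffle the list once per respondent, then present the identical order for both the importance and performance questions:

    import random

    features = ["Contact management", "Calendar", "Email integration",
                "Reporting", "Mobile sync"]  # invented names for illustration

    # Shuffle once per respondent...
    order = random.sample(features, k=len(features))

    # ...then use the identical order for both questions.
    importance_order = order
    performance_order = order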

There were a few other problems that I noted, but they are more related to my knowledge of the product and Sage’s stated directions. But similar issues to those above occur on a wide variety of surveys. Overall, I score this survey 5 out of 10.

These issues make me as a customer wonder about the competence of the people at Sage. A satisfaction survey is designed to learn about customers, but should also create the opportunity to make the customers feel better about the product and the company. However, if you don’t pay attention to the details you may do more harm than good.

Idiosyncratically,

Mike Pritchard

Filed Under: Methodology, Questionnaire, SurveyTip Tagged With: Survey Tips, Surveys


SurveyTip: Think about the number of pages in your survey

Have you seen surveys where every question, no matter how trivial, is on a different page?  Or how about surveys that are just a single long page with many questions?

Neither approach is optimal.  They don’t look great to your primary customer — the survey taker — perhaps reducing your response rate. What’s more, you may be limiting your options for effective survey logic.

Every question on a new page

The survey taker has to click the “Next” button too many times, with each click giving an opportunity to think about quitting.  Each new page requires additional information to be downloaded from the survey host, causing extra delay.  If the survey taker is using dialup, or your survey uses lots of unique graphics, the additional delay is likely to be noticeable, but in any case you create an unnecessary risk of looking stupid.

One reason surveys are created like this is a hangover from the early days of online surveying, when limitations were common; as a result, surveyors may think it is a best practice.  Another possibility is leaving a default setting in the online survey design tool that places each question on a new page.  But rather than just programming without thinking, try to put yourself in the mind of the survey taker, and consider how they might react to the page breaks.

Most surveys have enough short questions that can be easily combined to reduce the page count by 20% or more.

It is generally easy to save clicks at the end of the survey by combining demographic questions, and this is a great way of reducing fatigue and early termination.  However, try hard to make improvements at the beginning too, to minimize annoyances before the survey taker is fully engaged.  If you have several screening questions, there should be opportunities to combine questions early on.

Be careful that combining pages doesn’t cause problems with survey logic.  Inexpensive survey tools often require a new page to use skip patterns.  Even if you are using a tool with the flexibility to show or hide questions based on responses earlier in a page, this usually requires more complex programming.

Everything on one long page

People who create surveys on a single long page seem to be under the impression that they are doing the survey taker a favor, as their invitations generally promote a single page as if that means the survey is short.  Surveys programmed like this tend to look daunting, with little thought given to engaging the survey taker.  There might be issues for low bandwidth users (although these surveys are generally text heavy with few graphics, so page loading time shouldn’t be much of an issue).

Single page surveys rarely use any logic, even when it would be helpful.  As described above, it may be more difficult to use logic on a single page.  I often recommend that survey creators build a document on paper for review before starting programming.  But single page surveys often look like they started with a questionnaire that could have been administered on paper (even down to “if you answered X to question 2, please answer question 3“), which misses the benefits of surveying online.  One benefit of surveying online that isn’t always well understood is being able to pause in the middle of a survey and return to it later.  This feature is helpful when you are sending complex surveys to busy people who might be interrupted, but it only works for pages that have already been submitted.

One of the most extreme examples of overloading questions on pages I’ve seen recently printed out as 9 sheets of paper!  It also included numerous other errors of questionnaire design, but I’ll save them for other posts.

In the case of long pages, consider splitting up the questions to keep just a few logically related questions together.  For some reason, these long page surveys are usually (overly) verbose, so it may be best to use one question per page or, more productively, to have other people review the questionnaire and distill it to the most important elements with clear and concise wording.

To finish on a positive note, one of the best online surveys I’ve seen recently was a long page survey from the Dutch Gardens company.  There were two pages of questions, one with 9 questions and the second with 6, plus a half-page of demographics.  The survey looked similar to a paper questionnaire in being quite dense, but it didn’t look overwhelming because it made effective use of layout and varied question types to keep the interest level high.  None of the questions were mandatory, refreshing in itself.  And the survey was created with SurveyMonkey — it just goes to show what a low-end tool is capable of.  This structure was possible because the survey was designed without needing logic.

I hope that you’ll get some useful ideas from this post to build surveys with page structure that helps increase the rapport with your survey takers.

Idiosyncratically,
Mike Pritchard

Filed Under: Questionnaire, SurveyTip


SurveyTip: Randomizing question answers is generally a good idea

Showing question answers in a random order reduces the risk of bias from position effects.

To understand this, think of what happens when a telephone interviewer reads you a list of choices.  For a single choice question, you might think of the first option as more of a fit, or perhaps the last option is top-of-mind.  The problem is even more acute when the person answering the survey has to comment on each of several attributes, for example when rating how well a company is doing on time taken to answer the phone, courtesy, quality of the answer, etc.  As survey creators, we don’t know exactly how the survey taker will react to the order, so the easiest way is to eliminate the potential for problems by presenting the options in a random order.  Telephone surveys with reasonable sample sizes are almost always administered with question options randomized for this reason, using CATI systems (computer assisted telephone interviewing).

When we create a survey for online delivery, a similar problem exists.  It could be argued that the survey taker can generally see all of the options, so why is a random order needed?  But the fact is that we can’t predict how survey takers will react to the order of the options.  Perhaps they give more weight to the option nearest the question, or perhaps to the one at the bottom.  If they are filling out a long matrix or battery of ratings, perhaps they will change their scheme as they move down the screen.  They might be thinking something like “too many highly rated, that doesn’t seem to fit how I feel overall, so I’ll change, but I don’t want to redo the ones I already did”.  Often there could be an effect from one option being next to another that might be minimized by separating them, which randomizing will do (randomly).  For example, the results would likely be very different depending on whether the first two of these options appear next to each other:

  • Has a good return policy
  • Has good customer service
  • Items are in stock

Some question types and situations are not appropriate for random ordering.  For example:

  • Where the option order is inherent, such as education level or a word based rating question (Likert scale)
  • Where there is an ‘Other’ or ‘Other – please specify’ option.  It is often a good idea to offer an ‘Other’ option for a list of responses such as performance measures, in case the survey taker believes that the list provided isn’t complete, but the ‘Other’ should be the last entry (see the sketch after this list).
  • A very long list, such as a list of stores, where a random order is likely to confuse or annoy the survey taker.
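
Most survey tools provide an ‘anchor’ or ‘do not randomize’ setting for exactly this situation.  As a minimal Python sketch of the idea, here is one way to randomize a list while keeping ‘Other – please specify’ pinned to the bottom:

    import random

    def randomize_keeping_anchored_last(options, anchored=("Other - please specify",)):
        # Shuffle the movable options, but keep anchored entries (such as
        # 'Other') at the end in their original relative order.
        movable = [o for o in options if o not in anchored]
        random.shuffle(movable)
        return movable + [o for o in options if o in anchored]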

As with other aspects of questionnaire development, think about whether randomization will be best for the questions you include.

Idiosyncratically,
Mike Pritchard

Filed Under: Questionnaire, Surveys, SurveyTip


SurveyTip: Get to the point, but be polite

A survey should aim to be like a conversation.  Online surveys don’t have humans involved to listen to how someone feels about the survey, to reword for clarity or to encourage, so you have to work harder to generate comfort.  Although you don’t want to take too long (the number one complaint of survey takers is time), it is still better to work up to the key questions gradually if possible.  Even though it might be the burning issue for you, you risk turning someone off if you launch straight into the most important question. A few preliminary questions should also help put the respondent into the right frame of mind for the topic.

Generally, the best approach is to build up the intensity, starting from less important questions and then moving to the critical questions as quickly as possible, building up the survey taker’s engagement as you go.  Then reduce the intensity with clarifying questions and demographics.  That way, if someone bails out early, you’ll still have the most important information (assuming that your survey tool and/or your sample company allow you to look at partial surveys).

There are exceptions of course, and one comes from the use of online panels, particularly when you set up quotas and pay for completed surveys.  In this case, one or more demographic questions, used for screening, will be placed very early. 

Or sometimes the topic of the survey dictates the order, as with awareness studies where unaided awareness is usually one of the first questions.  You might also order the questions based on the survey logic. 

If you need to include a response from an earlier question in a later question (piping), or if the answer to one question will determine which other questions are asked (skip logic), this may impose a question order. 

For complex surveys, there are likely to be tradeoffs that are best decided by careful review of the questionnaire (as a document) before starting programming.  This is why questionnaire writing is a combination of experience and science with a little bit of guesswork thrown in for good measure.

One example of how a softer start helped was a survey for an organization considering new services.  The original questionnaire launched straight into the questions for the new services after a brief introduction.  Responses trickled in slowly.  When a question about membership in the organization was moved up to the beginning, the response rates jumped and we were able to complete the survey on time.

If you show respect for your survey takers, they’ll appreciate it and are more likely to reward you by completing the entire survey.  Good luck!
Mike

Filed Under: Methodology, SurveyTip
