Dimensional Research

Technology Market Research



Read the Dimensional Research Blog

A Researcher’s Confession

by Diane Hagglund

I admit it.  Although web surveys are one of our most popular research offerings, I strongly prefer qualitative research (focus groups, in-depth interviews).

I can’t help looking at the findings in web surveys and feeling that I want to dig in and ask more questions.  The participants say things that don’t make sense to me, and I want to know why.  Or they make short comments in response to open-ended questions that leave me with a dozen follow-up questions I don’t have the opportunity to ask.

When I present qualitative findings, I know I can answer any question that comes up with complete confidence.  With quantitative findings, I always know that there will be questions where the answer is, “We can’t draw any conclusions based on this survey.”

This came up again earlier this month:  We just completed a series of in-depth interviews where we wanted to understand perceptions about the cost of various alternatives.  All the participants in this study had identified themselves as product decision makers with full visibility into costs – a requirement for the study.   If you had looked only at the first level of answers people gave, you would have thought that one of the tools we were evaluating was very inexpensive compared to the other options.

But because this was an in-depth interview, I got to ask that all-important “why” question. I quickly realized that while all the participants had been educated on the “line 3” costs that were billed directly to their organization, not everyone was aware of the additional “line 10” costs that had to be added to support this different approach.  When you added both of those up, the tool that originally appeared less expensive turned out to have a similar TCO to other options.
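The cost comparison described above comes down to simple arithmetic. Here is a minimal sketch with hypothetical figures (the actual numbers from the study are not disclosed in the post): a tool with low direct "line 3" costs can still match a pricier alternative once the hidden "line 10" support costs are added in.

```python
# Hypothetical cost figures for illustration only – the real study numbers
# are not disclosed in the post.
tools = {
    "Tool A": {"line3": 10_000, "line10": 15_000},  # low direct cost, high hidden support cost
    "Tool B": {"line3": 22_000, "line10": 3_000},   # higher direct cost, little extra support
}

for name, costs in tools.items():
    # Total cost of ownership = directly billed costs + additional support costs
    tco = costs["line3"] + costs["line10"]
    print(f"{name}: line-3 = ${costs['line3']:,}, line-10 = ${costs['line10']:,}, TCO = ${tco:,}")
```

With these assumed numbers, both tools come out at a $25,000 TCO, even though Tool A looks far cheaper if you stop at the first-level "line 3" answer.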

Now, it’s true that we could have found this out by writing a good web survey, but one of the secrets to writing great web surveys is to know the answers to all the questions first.  We continue to recommend web surveys as good vehicles for quantifying concepts that you already know well, when you want to put an accurate percentage next to each option.  This is a valuable thing to do, especially for market sizing, external marketing, and PR purposes.

But to uncover the answers that you don’t really know, start with qualitative research – and by all means follow up with a web survey to put the percentages in place once you know which statements to quantify.

Three Signs That You Are NOT Ready For A Web Survey

by Diane Hagglund

We hope that we’ve made it clear in this blog that Dimensional Research is a big fan of Web surveys.  Web surveys are a great market research tool that makes it easy to get immediate results right at your fingertips.

However, you need a certain level of knowledge before you can run an effective Web survey.  Here are three signs that you are not ready for a Web survey, and that you should do some qualitative research first.

  1. If you write multiple-choice questions and are completely guessing at the answer options, you’re probably not ready for a Web survey.  Of course, you always need an “Other” option to cover the corner cases you just didn’t think of, but your options must capture most of the likely responses to effectively quantify a finding.  Let’s admit it, survey takers will often pick a presented option that isn’t quite right rather than take the time to fill in an open-ended option, so you can’t rely on “Other” to cover your lack of knowledge.
  2. If you’re asking a lot of open-ended questions, you’re probably not ready for a Web survey.  But what is “a lot?”  A good rule of thumb: no more than one open-ended question for every 20 survey questions – not including the important final question, “Is there anything else you’d like to tell us?” that you should put at the end of every survey.
  3. If you get only one shot at an audience, don’t waste it with an uninformed Web survey.  If you don’t completely understand a new market, and you have a participant list that you can use only once, mitigate the risk that you’ve gotten something wrong by interviewing a few list members first, or by running a small trial survey with more open-ended questions that you can use to build a better Web survey.
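The rule of thumb in sign 2 can be expressed as a quick sanity check on a draft survey. This is a minimal sketch, assuming a plain list-of-questions representation (a list of dicts with a `"type"` key) rather than any particular survey tool’s format:

```python
# Check a draft survey against the rule of thumb: at most one open-ended
# question per 20 survey questions, excluding the catch-all final question.
def too_many_open_ended(questions):
    body = questions[:-1]  # exclude the final "anything else?" question
    open_ended = sum(1 for q in body if q["type"] == "open")
    return open_ended * 20 > len(body)

# A 21-question draft: 19 multiple-choice, 1 open-ended, plus the catch-all.
draft = [{"type": "choice"}] * 19 + [{"type": "open"}, {"type": "open"}]
print(too_many_open_ended(draft))  # one open-ended question per 20: within the rule
```

Swap a second body question to `"open"` and the check flips, signaling that the draft leans too heavily on open-ended questions and some qualitative research may be in order first.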

Don’t waste a Web survey. If participants are taking the time to give you feedback, make sure you’re getting the most from them.  Don’t be scared to add some test surveys or interviews to a project schedule – the extra week or two will give dramatically better results.

We just wrapped up a great project that started as a stand-alone Web survey. We needed to quantify some specific purchasing metrics, so the survey was clearly the right methodology.   But as we drilled deeper, we realized that we were making far too many guesses about actual purchase motivations.

We added a series of 15-minute customer interviews prior to the Web survey that gave us some great insights into the  scenarios that were driving behavior, and then developed a much better Web survey that gave crystal clear data about all scenarios. Most importantly, it allowed us to eliminate a group of customers that weren’t motivated by the conditions we were evaluating (although their behavior was similar) and would have skewed our data significantly if we hadn’t excluded their responses for certain questions.

Going into the final presentation, we were very glad that we’d done the interviews.  We knew our stuff cold and had the backup we needed to defend the results and ensure they were taken seriously – a must to influence the business outcome.

Web Surveys vs. Phone Surveys

by Diane Hagglund

One approach to market research that we haven’t talked much about in this blog is phone surveys.

Phone surveys are when a person with a nice voice calls people and asks questions from a script. Answers are recorded in a spreadsheet, and the final result is very similar to the graphs you’d expect out of a Web survey.

Phone surveys are different from in-depth interviews because the script is read exactly as written, unlike an in-depth interview, where a researcher who is a technology expert asks the follow-on questions needed to drill down into answers.

Dimensional Research usually doesn’t recommend phone surveys to our technology clients. Even for consumer marketing, phone surveys are becoming less useful according to Jay Leve of SurveyUSA: “There is [no] future for any form of telephone research that is predicated on the researcher being able to barge in at will and seize the respondent.”

Several more compelling reasons why phone surveys don’t work are outlined very nicely by Jeffrey Henning at Research Live. The ones that relate to market research for technology companies include:

– Expense of dialing. More and more phone surveys are done via cell phones, since more and more people (and this includes many technology startups) don’t use landlines. In the US, by law, you can only use automatic dialing for landlines, not for cellphones. Manual dialing is much more labor-intensive – and expensive.

– Online surveys eliminate the expense of data entry. The respondent to a web survey is, in effect, donating the data entry cost, as they select the appropriate choices and type in their answers. With a phone survey, you are paying a call center representative to transcribe each respondent’s replies.

– The visual medium of Web surveys lets you easily show people visual concepts, such as ads or core messages, and get their response. Surveys that require respondents to react to visual concepts can’t be conducted with phone surveys alone.

– Web surveys have the allure of confidentiality. People today feel more comfortable sharing information on the Web than answering the prying questions of a phone interviewer.

– People prefer Web surveys, because a Web survey can be done at the respondent’s convenience, rather than at the moment the phone interviewer happened to call.

Jeffrey does make a case for phone surveys in certain situations, but almost none of them are relevant to IT market research:

  • Major Account Research – In these cases we recommend in-depth interviews or online customer advisory boards. Why wouldn’t you take the opportunity to have a deep conversation with your biggest customers?
  • The Human Touch – This argument only works with Corporate IT if you are also knowledgeable about technology, so again, it’s better to do in-depth interviews.
  • Some People Aren’t Online – This is obviously not an issue for technology professionals.

Our recommendation, after years of doing market research with Corporate IT, is to avoid phone surveys when doing technology market research. Instead, use Web surveys, in-depth interviews, or a combination of both.

Bad Survey Design – Ouch!

by Diane Hagglund

Whenever we get a chance, we love to participate in market research done by other companies. Doing so gives us the opportunity to think about how we’re answering questions about the stuff we care about. Being on the “other side of the glass” is a great reality check.

We recently upgraded some important business software that we use here at Dimensional Research, and had the opportunity to do a follow-up survey on the experience.  We quickly discovered that it was a very poorly designed survey. The company wasted their time and energy (and ours too!) on a survey that was so badly designed, it will do nothing to help make things better next time.

Here are the things they did wrong:

1) To get the product installed, you first had to register, get a new license key, download the product, then run the installer. Four distinct steps – and we had problems with the first three! But the survey asked only about the step that actually worked – the last one. There was no opportunity to tell them that we had to register three times, that we ended up phoning somebody to get our license key, and that the first site we were sent to for the download didn’t have the right file.

2) They were clearly using the same exact survey for people who were first-time buyers and for those who were upgrading. Even though we purchased three years ago and our reasons for purchasing aren’t relevant anymore, we were required to answer questions about how we chose this vendor, etc.  Even if their lists are so bad that they don’t know we were an upgrade rather than a new purchase, a better designed survey would have given us the opportunity to skip new purchaser questions and continue onto the relevant product-related questions.

3) They did not ask any product-related questions aside from the installation. We’ve been using this product for three years now and they’ve never asked us any questions about the product – only about purchasing and installing it. It definitely left us with the impression that all they care about is getting our renewal dollars.

4) There were no open-ended questions where we could describe our issues so they could fix them, or where they could find out whether we were an isolated case or whether all of their customers have had the same problems.

Today’s fabulous, easy-to-use online survey tools are enabling a lot of bad survey behavior. We trust the readers of this blog will have better judgment!

Three Tips for Crafting Better Online Surveys

by Diane Hagglund

There are many ways to make sure your online survey is efficient and effective. One of them, of course, is to avoid asking bad questions.  

Tip 1: Craft your questions carefully to avoid unwanted results 

This tip was inspired by Seth Godin. He explains that “Every question you ask changes the way your users think. If you ask, ‘which did you hate more…’ then you’ve planted a seed.”

Mr. Godin makes a great point. I recently booked a trip with Travelocity. It was a great trip. I was happy.  I did have a small issue with the airport transfer getting home, so I filed a complaint to see if I could get a refund.  It was a small matter – about $30 credit – so I wasn’t too worried about it. Travelocity’s reply email was pretty typical, asking me for more information (which I had to get by opening up the email THEY sent me, so a bit annoying). 

BUT… Then they sent me a survey asking about my experience with Travelocity.  One of the questions on the survey was “Do you know about Travelocity’s guarantee that your booking will be right, or we’ll work with our partners to make it right, right away?” I actually didn’t know, but this question clearly did NOT describe the experience I just had. 

Travelocity did make it right – it took them about 30 days to do so – but that wasn’t what their guarantee said and their survey pointed that out to me.  By including that question in the survey, they planted a seed that they didn’t want to plant and I ended up being less happy overall than I was before the survey.

Tip 2: Ask at least one question that participants actually WANT to answer

It’s important to ask the questions your customers are actually interested in answering.  Too often, marketing departments are so focused on the company’s newest offerings that they ignore the products their customers have come to depend on.

I use QuickBooks for my business. I have LOTS of feedback for Intuit on the core QuickBooks product, but they never ask me about that.  They constantly survey me, asking if I want to buy checks or do payroll or take credit cards, but there is no “Thanks for your time, is there anything else you’d like to tell us?” that would enable me to give them the feedback I WANT to give them.  

Another example from my own professional life: I use Zoomerang for my surveys. It is a GREAT product – with a few caveats.  One of the problems I have with the product is that they have a horrible interface for “choose the answer that most closely applies.”  It’s a small button that’s almost impossible to see. On the other hand, their interface for “choose all that apply” is great – a nice square with a big check mark.  I want to tell them about this issue, but they keep sending me web surveys about other things and I’ve never had a chance to give them important feedback that I really want to give them. Maybe they’ll read my blog and I’ll get to them that way?  <CORRECTION:  They did in fact just send me a survey this week that allowed me to give that feedback.  I’ll wait and see if they act on the feedback.>

Tip 3: Reward your participants, wisely

This tip is inspired by Patricio Robles: “offer users who respond to a survey a discount, an entry in a drawing for a prize, something of value. It will boost response rates and make them feel like they’re investing their time wisely.”

While I generally agree, I would add that when offering a reward to participants, it’s important to consider your target market. A gift that is too nice motivates people who aren’t qualified to complete the survey. Then you have to wade through junk or set up lots of qualifying questions to weed them out. If the reward is nice enough, some people will game the system and try to guess what you’re looking for, so the filtering questions won’t always work. 

This is especially important for technology web surveys, since uninformed respondents who don’t know your topic can really throw off the results.  If possible, offer a copy of the final report as the incentive – only people who know and care about the topic will respond to that type of reward, keeping your input very clean.

Copyright © 2008 - 2023 · Dimensional Research