Another pitch for text analytics gone awry
Wednesday, April 30, 2008

This recent MarketingProfs article was penned by David Bean, Ph.D., the founder of Attensity, one of the largest text-analytics companies around.
I think it, again, misses the mark. And, again, this is not in any way a criticism of David, his company or text analytic tools in general (I really like these tools). It's a criticism of how they are currently getting pitched.
A good portion of his article argues against traditional research methods in favor of harvesting unstructured conversational data from consumers and customers. There are two major problems with the assumptions behind this argument.
First, traditional research methods, by and large, are not broken. David describes them as struggling to 'make sense of it all', as if they are somehow unfocused. Yes, research practitioners come across unfocused briefs, but they are not the norm. The norm is a clear objective: most research gets commissioned when there are specific issues afoot, and a lot of it solves those issues effectively. It does so without asking consumers questions like 'So what do you think would make this better?', which, in most situations, doesn't get you any good answers.
Yes, there are bad examples of questionnaires and focus groups. But this is usually the practitioner's fault, not the methodology's (if you're having a hard time finding a good research firm, drop me a line and I can point you in the right direction).
The irony in David's article is that after a page of argument against using structured research to solve issues, he pulls out a quote from a structured research study to prove his point! (470 responses from PlanetFeedback.com are not the best source for understanding the 'trust in advertising' issue.)
Second, there is an assumption that unstructured data is more useful for understanding everything from new product ideas to the effectiveness of marketing campaigns to product positioning. This is a tall order. People will write in to companies with emotionally laden stories of how they were helped, liked, loved, or made to feel good, or the opposite. But they rarely say things like 'the brand needs to be slightly more edgy to really make me like you', 'I loved your ads but you need to change the color tones to better suit your brand's style', or 'here's a product idea for you...'. Fanciful stuff.
The myriad stories you receive from customers need a solid layer of interpretation before they become useful. That interpretation typically involves structured research.
The gist of my argument is: don't try to create a cure for a disease that doesn't exist. Text analytics is a great complement to your research and intelligence-gathering efforts, not a replacement for them. I know there is a need to go after an internal budget when selling in a solution like this, and there is pressure to justify the cost. But this argument will be a tough sell.
A good text analytics tool is like a radio. Without a radio, you can't tune into a radio channel - you just get a bunch of meaningless static. Meaningless static is what most companies hear when they try to listen to consumers or customers. A text analytics tool gets you access to the channels you need to listen to. From there, you can decide what you need to know in more depth.
The problem with a radio is that it is one-way communication. You can't direct the conversation. This is a limitation of the medium. It's not a selling point.
Posted by Paul Soldera at 8:43 AM
5 comments:
Paul,
Thanks for another provocative post.
I can understand both David Bean's and your opinions on the matter. I agree, up to a point, with David's argument that structured research is limited in its ability to really elicit the "first person feedback" he describes - where a person explains in personal, anecdotal detail what, how, why, and when something went well, went poorly, worked, didn't work, etc. And in the absence of a prompted question or focus group moderator, people are more likely to bare their souls, emotions, and sentiments without fear of being judged. You often get much more passion (or, to use a text analytics buzzword, "sentiment") associated with events and activities from the purely unstructured content of an email or a web posting.
There is a middle way, however, to get value from both structured and unstructured content and create a 1+1=3 scenario.
Survey methodology has historically required the imposition of highly structured questions with highly structured answers (yes/no, 1/2/3/4/5, L/M/H) that leave little room for passion, qualitative insight, or unsolicited feedback. When a survey asks 10, 20, 30, or more questions, it turns respondents into drones and often sucks the life out of their passions and qualitative insights. Throw in a couple of open-ended questions and maybe you'll get an insight or two, but by that point the respondent is often unwilling to give any.
But check out the survey tabs at AOL, for instance:
https://secure.opinionlab.com/ccc01/comment_card.asp?time1=1209742286314&time2=1209742294592&prev=&referer=http%3A%2F%2Fwww%2Eaol%2Ecom%2F&height=1024&width=1280&custom_var=undefined
AOL's first question is simple:
"Tell us what you think"
The second question is a rating: pick from 5 levels, ranging from "hate it" to "love it".
And finally the survey closes with 3 quick/short questions.
With a survey like this (technically structured, but with one BIG open-ended question) you can elicit huge qualitative insight. Additionally, when you merge the "love it/hate it" structured answer with that qualitative insight, you can start to do some very powerful correlation and causal association of issues and experiences to overall customer satisfaction and happiness (or dissatisfaction and unhappiness).
You find the same structured/unstructured synergy opportunity in many web sources of content - be they review sites (where the true insight is in the open ended text posting, but the correlation or causality can be mapped to structured answers), or in call center verbatims (where the operator may do light coding of interactions but the real insights are in the text).
My view is that there is tremendous untapped value in the unstructured data, but it's often even more powerfully exploited when it's captured, and merged with structured data "framing" through some structured capture methodology.
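To make that structured/unstructured merge concrete, here's a minimal sketch in Python. Everything in it is hypothetical: the five survey responses, the 1-to-5 rating, the tiny keyword lexicon, and the naive_sentiment helper are stand-ins for a real survey export and whatever sentiment scoring a real text-analytics tool would supply.

import pandas as pd

# Hypothetical survey export: a structured 1-5 rating plus an open-ended comment.
responses = pd.DataFrame({
    "rating": [5, 1, 4, 2, 5],
    "comment": [
        "Love the new homepage, search is fast",
        "Hate the pop-up ads, page loads slowly",
        "Mail works well but the layout is cluttered",
        "Too many ads and the video player is broken",
        "Great redesign, easy to find my news",
    ],
})

# Toy lexicon standing in for a real sentiment model.
POSITIVE = {"love", "great", "fast", "easy", "well"}
NEGATIVE = {"hate", "slowly", "broken", "cluttered", "ads"}

def naive_sentiment(text: str) -> int:
    """Crude score: +1 per positive word, -1 per negative word."""
    words = text.lower().replace(",", " ").split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

responses["sentiment"] = responses["comment"].apply(naive_sentiment)

# The "1+1=3" step: each comment now carries both a structured rating and a
# text-derived score, so the two can be correlated and the outliers examined.
print(responses)
print("rating/sentiment correlation:", responses["rating"].corr(responses["sentiment"]))

The scoring here is deliberately crude; the point is the join, not the lexicon. Once the qualitative text and the quantitative rating sit in the same table, correlation and issue-level drill-down become straightforward.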
A final thought about that AOL survey...
AOL says at the end of its survey that "Comments will be read, but AOL regrets that it cannot respond directly."
With Text Mining/Text Analytics, comments can not only be read - they can be categorized, sentiment scored, directed to the appropriate people (marketers, product managers, customer service reps, etc.), and they CAN be responded to directly. That's the real value of text analytics in a survey or customer communication environment: connecting the insight not just to the company, but driving a meaningful response back to the most happy, the most unhappy, the most opinionated and passionate consumers. That is ultimately how you show your customers that you hear them and are acting on their ideas and experiences.
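As a rough illustration of that categorize, score, and route loop, here is a short Python sketch. The categories, keyword lists, inboxes, and urgency rule are all made up for the example; they don't reflect any particular text-analytics product's API.

from dataclasses import dataclass

# Hypothetical categories, keyword lists, and routing table.
CATEGORY_KEYWORDS = {
    "advertising": {"ad", "ads", "campaign", "commercial"},
    "product": {"feature", "search", "player", "homepage"},
    "service": {"support", "billing", "refund", "agent"},
}
ROUTING = {
    "advertising": "marketing@example.com",
    "product": "product-managers@example.com",
    "service": "customer-service@example.com",
}
NEGATIVE = {"hate", "broken", "slow", "angry", "cancel"}

@dataclass
class RoutedComment:
    text: str
    category: str
    inbox: str
    urgent: bool  # very unhappy customers get a direct reply first

def route(text: str) -> RoutedComment:
    """Categorize a comment by keyword, flag strong negative sentiment, and pick an inbox."""
    words = set(text.lower().split())
    category = next((c for c, kw in CATEGORY_KEYWORDS.items() if words & kw), "service")
    return RoutedComment(text=text, category=category, inbox=ROUTING[category],
                         urgent=bool(words & NEGATIVE))

print(route("The new search feature is broken and I want to cancel"))

A real tool would replace the keyword matching with proper classification and sentiment scoring, but the workflow is the same: read, categorize, score, and hand the comment to someone who can actually respond.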
Sid.
Great comment Sid. And I think you are spot on. There are large untapped opportunities for 'semi-structured' research. Research where you direct the respondent with a few structured questions but let them free-form their comments in text - ultimately linking the two together.
I think that's a smarter argument for text analytics on research data than trying to replace structured research altogether. It also makes your SaaS play very logical - you want to be able to deploy a solution like that fast, and with a minimal footprint on the client's end.
I personally believe there is room here to re-write the customer-satisfaction book using text analytics technology. Talk about an opportunity - small, less intrusive surveys with GREATER insight into customer issues.
But surveys they still need to be. Which is why I argued against some of David Bean's more aggressive statements about replacing directed feedback altogether.
One of the reasons I take this stance is that I don't agree that, in the absence of a prompted question or moderator, people are more likely to bare their souls. I think people get that impression from the truly detailed, well-thought-out, and passionate blog posts and reviews that are written online. The problem with looking at those posts and concluding that they are the holy grail of customer insight is a problem of 'silent evidence'. Most people still don't write blogs or customer reviews - in fact, the large majority don't. The large majority of people find it hard to express themselves in written form. Most people find it difficult to talk about emotional situations, and struggle to understand their own emotions and how these link to behaviors. Most people deny having any emotional response at all.
I've seen good focus group moderators get ordinary people to talk at length about issues they would never write about, and frankly couldn't write about.
This is not to say there isn't any good information out there to collect and analyze. It's just that it's probably not the whole story.
I like your 1+1=3 scenario much more.
Thanks for the comment.