
Summary:

The online space has delivered the kind of measurability we only dreamed of before. The problem is that more seems to mean less: the avalanche of stats often reveals little real, reliable information. Technology research reporting is rife with value judgments, omissions and unjustified conclusions.

Back when I was a greenhorn graduate, and the web wasn’t widely available in businesses, let alone in homes, finding good market research involved leafing through musty industry research tomes that resided in libraries.

For marketers, the online space has delivered the kind of market measurability we only dreamed of before. The problem is that these days, more seems to mean less: the avalanche of stats often seems to reveal little real, reliable information. Frequently, online technology research reporting is rife with value judgments, omissions and unjustified conclusions. Rarely does it simply present the facts.

Here are some real-world technology research reports that provide valuable lessons in misinformation — and hints for finding good tech data that you can really rely on.

1. The Misleading Headline

A good headline means more clicks — everyone knows that. But when that tactic is applied to research findings, it can actively mislead us readers. This article, titled “Users of Facebook’s Social Network are Mostly Anti-Social,” is one example. Get past this headline and it turns out that, in fact, 90 percent of the study’s respondents wanted to use Facebook to deepen friendships, though they weren’t sure how to do that through the service, and “Many users appear disappointed by this lack of intimacy.” The respondents don’t sound particularly antisocial to me yet, but let’s read on. “For nearly half of all respondents,” the article says, “Facebook isn’t considered a social network but more a public phone book or search engine.” So what does the research actually tell us? All it says to me is that the perceptions and potential of social networks are still evolving in users’ minds. So much for all those “anti-social” Facebook users.

Lesson: Don’t take research findings from headlines. Access the full study, read, and draw your own conclusions.

2. Infographic Blindness

If web marketers lust after data, we’re certainly having one heady love affair with the infographic. They’re used to visualize everything from carbon emissions to real-time Twitter conversations.

This one, revealing how teens use cell phones, hints at some of the informational problems that can arise when researchers focus on the form, rather than the function of the infographic.

I consider a teen to be someone aged 13-19, but this infographic reflects usage among people aged 12-17. OK. But where are they from? And how was the sample selected? The infographic can’t really give us any indication of the background behind these pretty pictures. For that, we have to access the original Pew Internet study. It turns out the sample included only teens in the continental U.S., so if you were hoping to rely on this infographic to justify marketing efforts in Australia, Estonia, or even Hawaii, you’d better think again.

Lesson: Infographics rarely, if ever, contain all the information you need to actually use the research findings.

3. Baseless Conclusions

Obviously, the reader’s objective in reviewing statistics is to learn something — to draw some conclusions about the research findings. Often, though, the sites that report on those findings are all too eager to throw their value judgments into the equation as a means to supply “added insight” for their sites’ particular readership.

Look again at the Flowtown commentary on teen cell phone usage. That last line about “sexting” is particularly problematic. The comment “sexting seems to becoming [sic] more of an issue” is all implication and value judgment — and no fact. No historical data is provided on the phenomenon, so it’s difficult for the reader to credit the implied increase in sexting or to know at what rate it’s increasing (if indeed it is). Links to other research on the topic would be useful here. Also, the statistics presented in the infographic don’t establish that the behavior itself is an issue, or for whom it might be one. Words like “seems” used in a statistical context always make me wary.

Lesson: Journalists will add value judgments and draw unsupported conclusions in order to make a story. Read beyond their interpretation to find the facts.

4. Useless Statistics

If “useless” sounds like a harsh assessment, take a look at this Mashable report of some Facebook statistics. This research tries to correlate positive attitudes with Facebook’s relationship status indicator. If that sounds like a stretch to you, you’re not alone: even the reporter comments on the irrelevance of the stats.

“In the end,” says the author, “… it’s only scientific insofar as it is a reflection of what people choose to share in their status updates. Obviously, that’s not going to be a strong — or even defining — indicator of how people really feel, regardless of relationship status.”

Right. So there’s no correlation and the “study” proves nothing beyond how many American Facebook users communicate their relationship status. Why is this being reported on at all?

Lesson: Just because someone released some stats doesn’t mean they’re valid, relevant, or informative.

5. Dubious Research Methodology

Understanding the way a given piece of research is conducted is critical to contextualizing its findings. Unsurprisingly, the methodology — and even basic premise — behind some online research is extremely questionable.

That Facebook “research” on happiness is a fabulous example. How does one measure happiness? You and I might struggle with this question, but Facebook has the answer: it “already has a methodology for measuring the overall ‘happiness’ of its users. It basically looks at how many positive words people use in their status updates (for English-speaking users). This results in the USA Gross National Happiness Index.”
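To see just how crude a measure like that can be, here is a minimal, purely hypothetical sketch of a positive-word count over status updates (in Python; the word list, function name and scoring are my assumptions for illustration, not Facebook's actual method):

```python
# Hypothetical illustration only: a naive "happiness" score that counts the
# share of status updates containing at least one word from a hand-picked
# positive-word list. Not Facebook's actual methodology.

POSITIVE_WORDS = {"happy", "great", "awesome", "love", "excited", "wonderful"}

def happiness_score(status_updates):
    """Return the fraction of updates containing at least one 'positive' word."""
    if not status_updates:
        return 0.0
    positive = sum(
        1 for update in status_updates
        if {word.strip(".,!?").lower() for word in update.split()} & POSITIVE_WORDS
    )
    return positive / len(status_updates)

updates = [
    "Oh great, another 4 a.m. conference call. I just love Mondays.",  # sarcasm scores as happy
    "Quiet weekend at the lake. No complaints.",                       # contentment scores as unhappy
]
print(happiness_score(updates))  # 0.5, which tells us very little
```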

One imagines that the more wry, dry forms of humor don’t rate in this scale — and that’s just for starters. To me, Facebook’s self-proclaimed authority on happiness seems baseless. And what about the first piece of research I mentioned in this post, about Facebook users being unsure how to deepen relationships through the service? If that’s true, and Facebook users actually are antisocial (however that may be defined), then how reliable is the language these people use on the site as a reflection of their — or an entire nation’s — happiness?

Lesson: Some organizations will do anything to present themselves as authorities. Look for real credentials plus a sound research premise and methodology in the studies you consider.

These are the main lessons I’ve learned from reports of technology trend research. What warnings can you add to the list?

Image by stock.xchng user cobrasoft.

Comments

  1. Thank you for this post, Georgina! I’m always surprised by how many people take internet stats as gospel.

    As a researcher, this is a great, bulleted check list that I’ll definitely be referencing again.

  2. I think the article is well grounded and relevant. However, it also commits one folly of its own: drawing a conclusion against each tenet being analyzed. Further, it is a general human folly that we need some data and a conclusion :) to facilitate our decision making, regardless of how thin the basis of the decision is. So people will feed those human desires.

  3. Good post.
    I think that’s the price we pay for the ever-increasing speed, availability and reach of information in our lives: we become more superficial, quick to reach conclusions and move on.

    We gain in scope but lose in depth, achieve reach but miss out on intimacy, gain in novelty but lose in permanence.

    This phenomenon worries me. I have written about it here:
    http://innovationimitation.com/2010/03/stop-tweeting-start-talking/
