WSJ.com user survey = FAIL

By Steve Outing

While I do occasionally use the Wall Street Journal iPhone app to look specifically at what WSJ.com has available, on the web I rarely visit WSJ.com as a destination (and I am not currently a paying subscriber to either the newspaper or the website). Instead, articles tend to come to me.

That is, I see links to recommended articles on news aggregator sites like Google News or Digg, or in blogs, or in Twitter posts from those I follow, or in my Facebook Newsfeed as recommended by my Facebook friends.

My normal behavior is to click through and read the interesting-looking article. With Google's recent change to its "First Click Free" program, I can now read up to five more articles on the site, if I choose to click around, before hitting the site's paywall. But it's rare that I go surfing around the rest of the site after reading the article I came for.

This week, I clicked through to a WSJ.com story I saw linked in a Twitter post, and was presented with a pop-up offer to take a survey about my usage of the site. (I think it appeared as I left the site, but I can't recall for sure.) I took the bait, and was shocked to discover, as I answered the questions, that the survey did not allow me to report how I actually use the site. The data the company will get from it is seriously flawed, because the survey made no attempt to capture behavior like mine, which is widespread among experienced Internet users.

If the link above allows you to take the survey, you'll see that if you identify yourself as a non-subscriber to both the Wall Street Journal print edition and WSJ.com, the line of questioning assumes that you are a regular destination visitor to the site and asks things like:

  • What days do you typically visit WSJ.com?
  • What sections do you typically view on the site?
  • How often do you visit the site? … And so on.

By the time I reached the end of the survey, I realized that it had not allowed me to indicate my actual behavior and usage patterns. While I typically read several WSJ.com articles a week, my reading is entirely unpredictable, because I don't have a habit of going to WSJ.com; rather, I get haphazardly referred to specific articles on the site that others (individuals or aggregators) have recommended. One week I might read 20 of its articles, the next zero.

At the end of the survey was a free-form comment field where I explained my actual usage of WSJ.com. But even though I answered every question accurately and truthfully, the data I added to the survey results gives no clue to my actual behavior with the WSJ brand.

It's as though whoever wrote the survey questions, or approved them, did not want the results to show how many web users actually behave. That might work against parent company News Corp.'s plans to shove much of its other news properties' website content behind paywalls. WSJ.com survey = FAIL.

Author: Steve Outing is a Boulder, Colorado-based media futurist, digital-news innovator, consultant, journalist, and educator.

5 Responses to "WSJ.com user survey = FAIL"

  1. ex News Au (8 years ago)

    They do that in their staff engagement surveys too. In the first two years of the surveys, staff had a free text field for additional comments on all questions. Now, we are forced to select from radio buttons next to answers that may not relate at all. News also has a majority share in an opinion polling company in Australia – Newspoll.

  2. Joey Baker (8 years ago)

    I don't know if they do this, but I suspect that the WSJ knows where you enter the site based on JavaScript. From your description, they only offered you the survey after you entered the main site via Twitter. If WSJ's analytics are any good, they've already got the answer to the question – more reliably than asking the user.

    Of course, I’m assuming that the WSJ has a competent web staff. Could be a mistake. :)
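    [Ed.: Joey's point — that a site's analytics can already tell where a reader entered from, without asking — can be sketched in a few lines of client-side JavaScript. The function name and host patterns below are purely illustrative, not anything WSJ actually runs:]

    ```javascript
    // Hypothetical sketch: classify a visit's entry source from the
    // HTTP referrer, the way an analytics script might bucket traffic.
    function classifyReferrer(referrer) {
      if (!referrer) return "direct";                      // bookmark or typed URL
      if (/twitter\.com|t\.co/.test(referrer)) return "twitter";
      if (/news\.google\.|digg\.com/.test(referrer)) return "aggregator";
      return "other";
    }

    // In a browser this would be called with document.referrer, e.g.:
    //   var source = classifyReferrer(document.referrer);
    ```

    A survey script could use such a bucket to skip the "how often do you visit?" questions entirely for referred visitors.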

  3. Steve Outing (8 years ago)

    Joey: Interesting point that they may already know, based on web analytics. But then why would they bother offering surveys to non-paying frequent website visitors like me? My (truthful) answers end up contaminating their data, since they're making the wrong assumption: that everyone online still seeks out their site as a destination. Methinks it's an outdated mindset showing itself in the questions asked, and those not.

  4. Customer Survey Software

    They won't be the first huge organisation to spend lots of money on surveys and forget (or be too incompetent) to ask the right questions. They really should have one questionnaire for those entering the site and another for those leaving (assuming they haven't already filled one in).

  5. Markedsundersøkelser

    I've seen several cases of surveys being poorly planned and put together – but the more money a company makes, the more careless it sometimes seems to be with that money.
