
Were we just talking about different sources of "web analytic" data?

  • Eric Peterson
    Message 1 of 4 , Oct 4, 2005
      ... or was it the WAA's internal thread that someone leaked to me.  I cannot remember.

      Either way, Neil Mason has a nice write up about different sources of data and how they can work together in a complete "quantitative plus qualitative" data environment.  I recently published a piece on the "new usability framework" and found a great number of similarities in our views of the available data.  Too often we forget that the voice of the customer is tremendously important and it's something nearly impossible to mine from the volumes of traffic data.

      http://www.clickz.com/experts/crm/analyze_data/article.php/3553021

      Nice work, Neil!
      --
      Eric T. Peterson
      Author, Web Analytics Demystified and Web Site Measurement Hacks
      www.webanalyticsdemystified.com

      Have you joined the Metrics Discussion Group?  Email webanalytics-subscribe@yahoogroups.com to join today!
    • Jim Novo
      Message 2 of 4 , Oct 4, 2005
> Either way, Neil Mason has a nice write up about different sources of
> data and how they can work together in a complete "quantitative plus
> qualitative" data environment. I recently published a piece on the "new
> usability framework" and found a great number of similarities in our
> views of the available data. Too often we forget that the voice of the
> customer is tremendously important and it's something nearly impossible
> to mine from the volumes of traffic data.

        http://www.clickz.com/experts/crm/analyze_data/article.php/3553021

        I find more value in the "action" of the customer, since action is what
        makes money, not opinions. Once I know the action / non-action, I can then
        gather data from the customer for additional insight. The "voice of the
        customer" doesn't always reflect actual behavior or make economic sense.

        It doesn't work the other way around. I think by not emphasizing this
        pretty significant point, Neil is painting too rosy a picture, assuming
        that the reason you ask people their opinion is that you are going to
        act on it. But if you aren't going to act on the opinions, then it
        really doesn't matter, does it?

        So sure, get the VOC, but first understand the behavior so you have a
        concrete idea of whose "voice" you are really hearing. I don't value the
        opinion of customers who *say they will buy* as much as I do the ones that
        *did buy*.

        Integrate qualitative with quantitative? Sure. But there is a right way
        to do this, and that is not being discussed at all; no direction is
        being given.

        Said another way, try starting with qualitative and crosstab with
        quantitative. It's almost always a mess. Start with quantitative and
        crosstab to qualitative, it almost always makes sense. Actual behavior
        defines opinions, not the other way around.
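
        A tiny sketch of that directionality, with invented names and counts
        (nothing here comes from real data): segment by observed behavior
        first, then tabulate the stated opinions within each behavioral
        segment.

        ```python
        from collections import Counter, defaultdict

        # Hypothetical records: (customer_id, said_they_would_buy, actually_bought)
        records = [
            ("a", True,  True),
            ("b", True,  False),
            ("c", True,  False),
            ("d", False, True),
            ("e", False, False),
            ("f", True,  True),
        ]

        # Quant-first: group by actual behavior, then count stated intent inside
        # each behavioral segment.
        by_behavior = defaultdict(Counter)
        for _, said, did in records:
            segment = "bought" if did else "did_not_buy"
            by_behavior[segment]["said_will_buy" if said else "said_no"] += 1

        for segment, voices in sorted(by_behavior.items()):
            print(segment, dict(voices))
        ```

        In this made-up sample, buyers and non-buyers report the same stated
        intent, which is exactly the point: "say" alone tells you little until
        it is anchored to "did".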

        Said yet another way, the voice of a customer who *says they will take
        action* lacks actual customer experience; this voice lacks credibility.
        What I want to know is what the people who *did take action* (buy, click,
        visit, download, whatever) think; these are the voices of true customers.

        Jim
        jim@...
        http://www.jimnovo.com
      • J Li
        Message 3 of 4 , Oct 5, 2005
          I interpreted some of the points in Neil Mason's article
          [http://www.clickz.com/experts/crm/analyze_data/article.php/3553021]
          a bit differently from Jim Novo (..and I'm interested in others'
          opinions... ).

          The value of using complementary qualitative methods isn't in
          predicting "what will happen". In a Web analytics context,
          qualitative is especially useful in understanding "why something we
          expected to happen didn't happen". Qualitative helps us diagnose
          reasons behind inaction, understand why the customer behaved as
          observed, understand sources of dissatisfaction, or gauge emerging
          awareness that didn't result in a site visit. Maybe it's something
          subtle a competitor has changed. Then use the qual input with quant
          analysis (and all other relevant knowledge you've accumulated) to
          recommend the next tweak. Start the analysis cycle again. Well-run
          qualitative research proceeds only when very specific research
          objectives are defined.

          I agree with Jim that the "action" of the customer is what makes
          money. "Action" is undeniably a strong voice of the customer.
          However, it isn't always possible to set up marketing and sales
          efforts so that every step of the conversion funnel up to the money
          making step is fully trackable quantitatively. And not every
          organization is set up to execute A/B or multivariate testing for
          subsequent changes. Organizational or technical barriers may
          intervene. Or you might be required to back up your next proposed
          online tweak with "a bit more input". In these non-ideal
          conditions, fill the feedback void with other methods.

          Qualitative methods don't compete with quantitative. As Neil says,
          they're complementary. Use qualitative as another diagnostic option
          in your customer insight toolbox.

          My $0.02.

          [On a side note, at an upcoming local usability chapter meeting,
          we'll be discussing how usability practitioners & user experience
          architects can use the quant aspects of Web analytics to help them
          in their work. If anyone's interested in the outcome of this
          session, drop me a note offline at the email id below and I'll send
          you an update post-meeting.]

          June Li
          ClickInsight
          June.Li@...
          http://www.ClickInsight.ca

        • Jim Novo
          Message 4 of 4 , Oct 6, 2005
            > The value of using complementary qualitative methods isn't in
            > predicting "what will happen". In a Web analytics context,
            > qualitative is especially useful in understanding "why something we
            > expected to happen didn't happen". Qualitative helps us diagnose
            > reasons behind inaction, understand why the customer behaved as
            > observed, understand sources of dissatisfaction, or gauge emerging
            > awareness that didn't result in a site visit. Maybe it's something
            > subtle a competitor has changed. Then use the qual input with quant
            > analysis (and all other relevant knowledge you've accumulated) to
            > recommend the next tweak. Start the analysis cycle again. Well-run
            > qualitative research proceeds only when very specific research
            > objectives are defined.

            I don't believe any of this contradicts what I said...action and inaction
            are just two sides of the same coin, and if you are "diagnosing the reasons
            for inaction", you are looking at the behavior first and then linking known
            behavior to qualitative "why", as I suggested was a best practice. I don't
            "hate" qualitative, what I do have problems with is the improper use of it.

            Folks, why were surveys, panels, and focus groups developed in the first
            place? Because for broadcast media, you don't have any data at the
            individual level. All you have is "we reached 2 million people aged 25-34 an
            average of 5 times in this market". That's it. If that is all I had, I'd
            certainly want more. But if I have behavior (including non-behavior), I
            have the core, I have the root.

            > I agree with Jim that the "action" of the customer is what makes
            > money. "Action" is undeniably a strong voice of the customer.
            > However, it isn't always possible to set up marketing and sales
            > efforts so that every step of the conversion funnel up to the money
            > making step is fully trackable quantitatively. And not every
            > organization is set up to execute A/B or multivariate testing for
            > subsequent changes. Organizational or technical barriers may
            > intervene. Or you might be required to back up your next proposed
            > online tweak with "a bit more input". In these non-ideal
            > conditions, fill the feedback void with other methods.

            Agreed. If you don't have anything, having something is better, as is the
            case with broadcast media. But that model is broken, so why follow it?

            > Qualitative methods don't compete with quantitative. As Neil says,
            > they're complementary. Use qualitative as another diagnostic option
            > in your customer insight toolbox.

            I don't disagree with that, but use behavior to segment and then do the
            qualitative by these behavioral segments. That's all I ask. Otherwise,
            you have a long series of disappointments in front of you. I have seen
            this happen hundreds of times, and have posted examples previously. A
            brand new example crossed my desk today:

            Client is a non-profit org in the industrial segment. They do the typical
            things - publish a magazine, provide certifications, train people, etc.

            They did a survey of their "membership" and changed certain policies and
            took certain actions based on this survey of their "members". However,
            since they didn't study the behavior of the members, non-members, and
            ex-members (their segmentation) prior to the survey, they didn't really
            understand who they were talking to in terms of value, or how opinions
            tied to behavior.

            Turns out that ex-members are some of their most valuable customers -
            much more valuable than a "member" on average - due to usage of some
            ancillary services they provide. Customer LifeCycle, folks. For them,
            membership is a churn and burn thing, like cell phones. Best customers are
            outside this pattern.

            Further, the policies they enacted based on a survey of "members" cut the
            average value of an "ex-member" by 50% over a period of 3 years. In other
            words, they *intentionally* tailored their business model to their *least
            valuable* customers and then drove away their best customers - all based on
            a survey.
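
            The arithmetic behind that kind of miss can be sketched in a few
            lines; the segments and dollar figures below are invented for
            illustration, not the client's actual numbers.

            ```python
            from collections import defaultdict

            # Hypothetical customer records: (behavioral_segment, annual_value)
            customers = [
                ("member", 120), ("member", 80), ("member", 100),
                ("ex-member", 400), ("ex-member", 250),  # heavy ancillary-service users
                ("non-member", 40),
            ]

            # Average value per behavioral segment - the step skipped before the survey.
            totals, counts = defaultdict(float), defaultdict(int)
            for segment, value in customers:
                totals[segment] += value
                counts[segment] += 1

            avg_value = {s: totals[s] / counts[s] for s in totals}
            # A survey of "members" alone would steer policy toward the least
            # valuable segment while ignoring the highest-value ex-members.
            ```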

            Just one of hundreds of cases. I have seen this movie before many times.

            In the 1980s, direct marketers became fascinated with "demographic
            enhancements" only to lose millions of dollars and find out through testing
            that these enhancements only made sense if you had years of behavioral
            studies under your belt and very high end modeling for support. In two or
            three years, you will be seeing case studies that prove the same thing is
            true on the web.

            As people strive to integrate quant and qual, the winner of this tug of war
            will just become more and more obvious.

            Segment by behavior first, so you know where the people whose opinion you
            are asking exist in the LifeCycle. Then do the qualitative. That is my
            only message. The two approaches may be "complementary", but I hope nobody
            thinks that means they are "equals".

            Unfortunately, that is the message I keep hearing.

            Jim