Re: [webanalytics] Google Analytics reports that are not easy to take action on

  • Pavan K S
    Message 1 of 6, Jun 27, 2013
      Hi Dave,

      Thanks for the response. I think I phrased my question wrongly.

      My question was: once you do the comparison manually and figure out
      that there is a problem, what do you do after that?

      For example, let's say that the bounce rate on a landing page has
      increased by 10% over the last week. What is the next course of action?
      Whom should you target, what should you change, etc.? I was suggesting
      that we could automate this step. Basically, for every report, have a
      mapping to some sort of "correction" suggestion.

      We could then maybe even hook each report up to a behavioral targeting
      tool and start targeting straight from GA! Let me know if this sounds
      useful.
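
      To make that concrete, here is a minimal sketch of the kind of
      report-to-suggestion mapping I have in mind, in Python. The metric
      names, thresholds, and suggestion texts are purely illustrative and
      not tied to the real GA reports or API:

          # Hypothetical sketch: map a metric that moved the wrong way to a
          # canned "correction" suggestion. Names and thresholds are made up.
          SUGGESTIONS = {
              "bounce_rate": "Review the landing page against the ads and "
                             "links pointing at it; check recent page changes.",
              "exit_rate": "Inspect the top exit pages for broken links or "
                           "missing calls to action.",
          }

          def suggest(metric, last_week, this_week, threshold=0.10):
              """Return a suggestion if the metric worsened by more than threshold.

              For both metrics above, an increase means "worse".
              """
              if last_week == 0:
                  return None  # avoid dividing by zero on brand-new pages
              change = (this_week - last_week) / last_week
              if change > threshold:
                  return SUGGESTIONS.get(metric, "No rule defined for this report.")
              return None

          # Example: bounce rate went from 40% to 46%, a 15% relative increase.
          print(suggest("bounce_rate", last_week=0.40, this_week=0.46))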

      - Pavan




      On Thu, Jun 27, 2013 at 1:06 AM, Wandering Dave Rhee
      <wdaveonline@...> wrote:

      >
      > Hi, Pavan,
      >
      > That's interesting, that you would try to automate interpretation!
      >
      > Generally speaking, I view any report or set of reports as a snapshot in
      > time, and therefore static.
      >
      > Analysis is about looking at changes. You need two snapshots (e.g.,
      > compare this period to the prior period) in order to determine direction.
      >
      > If you know what actions (changes) took place during the period, then you
      > might be able to correlate them. If you are good at hypothesis creation
      > and testing, you might even be able to determine causality.
      >
      > Only if that interpretation is valid can you then begin to predict which
      > actions would cause better numbers.
      >
      > I suppose you could argue that if absolutely no activity were taking
      > place, and your reports generated different numbers, then something
      > should be done -- but it is difficult to imagine a scenario in which
      > absolutely no changes in anything (actions, environment, seasonality,
      > etc.) take place in any given reporting period.
      >
      > Which in turn means it's difficult to imagine a scenario in which automatic
      > actions would be generated from reporting, without contextual analysis or
      > interpretation.
      >
      > For very small sites, traffic is too low-volume and inconsistent (very
      > large standard deviations) to trust generating actions to automated
      > interpretation.
      >
      > For larger sites with enough volume, the investment and expected
      > returns are too large to trust generating actions to automated
      > interpretation.
      >
      > Those are my personal opinions, anyway.
      >
      > WDave
      >
      > On Wed, Jun 26, 2013 at 1:41 PM, Pavan K S <itspanzi@...> wrote:
      >
      > >
      > > Hello,
      > >
      > > I wanted to find out - of all the Google Analytics reports, do you
      > > have 3 reports that stump you? By stump, I mean, you know they are
      > > telling you there's a problem, but it's not immediately obvious what
      > > to do about it?
      > >
      > > For example, for me, Average Visit Duration, Average Number of Pages
      > > per Visit, and Exit Pages are those reports. It's not clear what I
      > > should do when these numbers go bad.
      > >
      > > The reason I am asking is that I have been thinking about the
      > > possibility of automating the process of converting any GA report
      > > into an insight that tells you what needs to be done. Something you
      > > can put into action straight away. Having a laundry list of these
      > > reports would be interesting.
      > >
      > > Thanks!
      > > Pavan
      > >
      > > --
      > > Don't Panic, I am Mostly Harmless
      > >



      --
      Don't Panic, I am Mostly Harmless


    • Wandering Dave Rhee
      Message 2 of 6, Jun 27, 2013
        Hi, Pavan,

        Thanks for clarifying the context -- but I still think it's not a good idea
        to automate the analysis!

        In your instance, the key point is still understanding WHY the bounce rate
        increased over the past week. Something must have changed.

        Either a change was made to the site, in which case you need to change it
        back, or change it again.

        Or a change was made to the behavior of the people coming to the site.
        Understanding this is critical to knowing what to change next.

        For example, if you are a news site, and you made no changes, then visitors
        came and saw old news, and therefore did not bother to click through.
        Recommended action: update the site with topical content.

        Or perhaps traffic increased due to someone linking to the site, but they
        set expectations incorrectly. People came expecting a site filled with
        Topic B, but you had only one blog post on that, and the rest of the
        content focuses mostly on Topic A. So yes, of the new traffic, more will
        bounce. But this is not necessarily bad -- your core traffic that came to
        you for Topic A was still satisfied and didn't bounce higher than usual --
        it was only the additional traffic looking for Topic B that bounced.

        In this case, you should have segmented your bounce traffic by referring
        source, then back-tracked to see what expectations the referrer set. Maybe
        no action is required. Or maybe the appropriate action is to put a comment
        in the referring page (e.g., a blog post that linked to your landing page)
        to clarify what your site offers, so those who click through will not be
        disappointed.
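
        In code terms, that back-tracking starts with a simple group-by on
        referring source. Here is a minimal sketch in Python, assuming you
        have exported per-visit data from GA into a CSV with hypothetical
        columns "referrer" and "bounced" (1 if the visit bounced, else 0):

            # Segment bounce rate by referring source and rank the segments.
            import pandas as pd

            visits = pd.read_csv("visits_export.csv")  # hypothetical export

            by_referrer = (visits.groupby("referrer")["bounced"]
                                 .agg(visits="count", bounce_rate="mean")
                                 .sort_values("bounce_rate", ascending=False))

            # The referrers sending the bounciest traffic. Check what
            # expectations each one sets before touching the page itself.
            print(by_referrer.head(10))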

        Without finding out the cause of the change, I don't see how anyone
        could recommend a change to the site that isn't purely guesswork --
        and guesswork could just as easily make your numbers worse rather
        than better.

        On the other hand, if you have absolutely nothing to go on, then sure, try
        guessing -- if it gets worse, then undo the change, and do the opposite.
        But you're still just guessing. Few of my clients would tolerate that
        sort of approach, especially if "the automated tool" recommends it
        without providing relevant, plausible reasons.

        Avinash and I had a discussion about this many years ago (back when the
        original eMetrics was called E-Metrics, and held in Santa Barbara).
        Essentially, we concluded that any number (metric, KPI, or whatever
        jargon you want to use) is meaningless for action UNLESS you segment
        it. Your "average" bounce rate, in this case, is a useless number.
        Find different ways to segment it, until you find out which segment
        performs differently from the average. Then focus your change
        activities on those actionable segments.
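
        As a sketch, that "find the segment that performs differently from
        the average" step might look like the following, using the same
        hypothetical per-visit export as above (the 5-point margin and the
        minimum segment size are arbitrary choices):

            # Flag referrer segments whose bounce rate deviates from the
            # site-wide average by more than 5 percentage points.
            import pandas as pd

            visits = pd.read_csv("visits_export.csv")  # hypothetical export
            overall = visits["bounced"].mean()

            for referrer, grp in visits.groupby("referrer"):
                if len(grp) < 100:  # skip segments too small to trust
                    continue
                rate = grp["bounced"].mean()
                if abs(rate - overall) > 0.05:
                    print(f"{referrer}: {rate:.0%} vs site average {overall:.0%}")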

        In the example above, I suggested that the bounce rate might be segmented
        by referrer. It could be that you made a change to the landing page, and
        while the overall bounce rate got worse, it was due to one non-target,
        under-converting segment that got much worse, while the target
        highest-converting segments actually got better. This would be a wonderful
        finding -- your site increased conversion for your target segments, and
        didn't waste time trying to convert non-target segments! Send them off to
        a different landing page, and worry about them (focus on their needs and on
        converting them) there instead.

        Here's another way of saying it. "Actionable insights" are insights which
        apply to "actionable segments." A mass audience is not a segment, and
        therefore not actionable. Segments which cannot be targeted are also
        not actionable, and therefore insights about them are not actionable
        either. For example, since "owns a cat v. a dog" is not a browser property
        that is readable by most analytics tools, it does not help you to have an
        insight like, "cat owners convert more when the page background is green,
        while dog owners convert more when the background is blue." Yes, you have
        segments (cat owners v. dog owners), but they are not actionable, so your
        insight is useless.

        The same applies to something like a marketing persona. Sure, it's helpful
        to know that a teenage girl will buy differently from a retired man,
        but unless you can infer that part of their identity, you can't
        actually act on it, can you? So you look for proxies -- if they were
        referred from
        a fashion magazine website, or a luxury automobile collector's magazine,
        you can infer which of those two segments the visitor probably came from,
        and then act on it accordingly.
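
        A sketch of that proxy idea: a lookup from referring domain to an
        inferred segment, which a targeting rule could then act on. The
        domains and segment labels here are entirely made up:

            # Hypothetical referrer-to-persona proxy map. The tool cannot
            # observe "teenage girl" vs. "retired man" directly, but it can
            # see the referring domain, a rough yet actionable proxy.
            PROXY_SEGMENTS = {
                "fashionmag.example.com": "fashion-reader segment",
                "luxuryautos.example.com": "luxury-collector segment",
            }

            def infer_segment(referrer_domain):
                return PROXY_SEGMENTS.get(referrer_domain, "unknown")

            print(infer_segment("fashionmag.example.com"))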

        But if your landing page bounce rate goes up, and you think a tool can make
        a generic recommendation without actionable segmenting, then I think that
        particular automated tool is doomed to fail.

        Of course, I'd be happy to hear opinions from others, especially cases
        where an automated tool has proven to be successful!

        WDave

