
Re: Why do web analysts switch packages?

  • robbinsteif
    Message 1 of 14, Aug 31, 7:58 PM
      What a great analogy, pasta making. (Of course, the tool whore is my
      favorite entry; it is too bad that I am not running a contest.)

      Of course, this would imply that we mess up the first time only
      because we don't understand our atmosphere. So the second time we
      should know what to buy. (This reminds me of a great blog post, but
      everyone has already read it by now.)

      So then we can say, "It makes perfect sense that people change the
      first time." But why do we change twice, and three times, and...?

      It could be that a) people still sit around with their feet up on the
      table, watching the fancy analyses (i.e., the Jon Bovard answer); b)
      someone new comes in and doesn't know what has to get measured (and of
      course, no one else in the organization understands WA), so we go
      through it all again; or c) we really are tool whores.

      Or it could be that the needs of the organization change.

      These were all such great thoughts, yours and everyone else's.

      Robbin

      --- In webanalytics@yahoogroups.com, "Paula Thornton" <iknovate@...>
      wrote:
      >
      > Jon gave a great list. And while we always like to see lists (they
      > help us organize the critical information after the fact), I can tell
      > you that the lists are nearly useless: you can't see going in what you
      > realize coming out.
      >
      > While my involvement in tools has been more from a behavioral
      > perspective than a transactional perspective (e.g. absolutely no
      > interest in SEO or commerce, but in effectiveness of design and
      > assessment of needs), you don't realize the value or the limitations
      > of the tools until you work with the data.
      >
      > People underestimate data. They see it as a collection of inert
      > artifacts. Data has personality -- and it changes. The appropriateness
      > of a tool (and how to tune it) is very specific to the business model
      > and the data that goes with it. You can't intellectualize that on the
      > way in.
      >
      > I can categorically tell you that the things I've found most valuable
      > or least valuable about tools were only discovered when running the
      > data through it.
      >
      > Consider this analogy (one that I don't even have first-hand
      > experience with). Say you're looking for a pasta maker. You do all the
      > research and even look at blogs/discussions. You pick one. It's an
      > absolute disaster. Why? Because you live in a high-altitude, dry
      > environment and the texture of your pasta dough is far different than
      > the other conditions in which the device performed flawlessly.
      >
    • romanojon
      Message 2 of 14, Sep 1, 2007
        Hey Robbin,

        As you know, we use the Omniture grouping of products. We've had it
        for a year now and dealt with the shifting and changing. I admit it
        doesn't surprise me that analysts would switch; it does surprise me
        that it happens much more frequently than I would have anticipated.
        For some reason that just doesn't seem likely in the age of much more
        sophisticated tag-based and hybrid solutions. Maybe you have some
        stats on that. :)

        Omniture SiteCatalyst does exhibit some limitations we would like to
        overcome, but, all in all, I think Paul and I agree that adopting a
        new solution is not in our best interest. Here are my reasons:

        1. The costs of a new solution (implementation, interface training,
        agency adaptation, and re-centering of data) are exorbitant and
        prohibitive. I see it as completely unnecessary in the absence of the
        promise of a superior, insight-driving crystal ball.

        2. Nurturing something like predictive analysis in your process
        relies on being able to seamlessly compare data in an
        apples-to-apples environment, to isolate the events that cause
        movement in your trend lines, and to prepare for them.
        Package-hopping destroys that necessary aggregate reporting, and
        there is no way around it without taking enormous steps at the
        integrated-dashboard level to accommodate the transition (see the
        sketch after this list).

        3. On the macro scale, constant solution-hopping by analysts
        everywhere only leads to an abundance of underdeveloped tools wrought
        from the impetuous complaints of half-baked analytics programs. Think
        about this one the most: if at every advent of a new KPI, or of some
        metric of relative importance, we switched to the vendor who had it
        packaged, or who offered a better-marketed version of something we
        each already had but were unaware of, then the true structure of what
        we're operating in is made of sticks and rice paper.
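
        To make point 2 concrete, here is a minimal sketch of the
        apples-to-apples problem, with entirely made-up numbers (nothing here
        comes from any real tool): two hypothetical packages count the same
        traffic against different baselines, so naively splicing their trend
        lines at the switchover manufactures a level shift that looks like a
        real event, unless you calibrate using a dual-tagged overlap period.

```python
# Hypothetical illustration: two tools measure the same traffic with
# different counting baselines. Splicing their series naively creates a
# spurious jump; calibrating on a dual-tagged overlap period repairs it.
import random

random.seed(42)

# Simulated "true" daily visits for 60 days with a mild upward trend.
true_visits = [1000 + 5 * day + random.randint(-50, 50) for day in range(60)]

# Tool A undercounts by ~8% (say, stricter session rules); Tool B by ~2%.
tool_a = [v * 0.92 for v in true_visits]
tool_b = [v * 0.98 for v in true_visits]

SWITCH_DAY = 30          # day the organization changes packages
OVERLAP = range(25, 30)  # days both tags ran side by side

# Naive splice: the trend line jumps ~6% at the switch with no real change.
naive = tool_a[:SWITCH_DAY] + tool_b[SWITCH_DAY:]

# Calibrated splice: rescale the old series using the dual-tagged overlap.
factor = sum(tool_b[d] for d in OVERLAP) / sum(tool_a[d] for d in OVERLAP)
calibrated = [v * factor for v in tool_a[:SWITCH_DAY]] + tool_b[SWITCH_DAY:]

jump = naive[SWITCH_DAY] / naive[SWITCH_DAY - 1] - 1
print(f"Spurious day-over-day jump in the naive splice: {jump:+.1%}")
print(f"Calibration factor estimated from the overlap:  {factor:.3f}")
```

        Even with calibration, a scaling factor only repairs levels, not
        definitions: if the two tools define a "visit" differently, no single
        factor makes the history truly comparable, which is the heart of the
        aggregate-reporting problem above.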

        In the interest of this not becoming an exhaustive work for a
        Saturday morning, I'll quit there. But I will say: if analysts really
        are switching tools with the frequency you describe, I would become
        critical of it and pay very close attention to the solution-vendor
        market tectonics. Watch as innovation stagnates to meet the impulsive
        demands of the consumer rather than the comprehensive needs of the
        practitioner.

        Sincerely,

        Daniel W. Shields
        Analyst
        CableOrganizer.com

        http://danalytics.blogspot.com



        --- In webanalytics@yahoogroups.com, "robbinsteif" <steif@...> wrote:
        >
        > I have read (maybe in the 2002 Marketing Sherpa WA report? I must
        > be an elephant) that the average life of a WA package is 18 months -
        > companies switch a *lot*.
        >
        > Why? Hosted analytics are so popular, and every time we switch, we
        > lose all our data. It's true that now we have GA, a free resource
        > we can use to benchmark all our data and keep it even when we
        > switch to a second package, but this was the trend (I believe) way
        > back before GA.
        >
        > So why do analysts switch so much?
        >
        >
        > Robbin
        >
      • metronomelabs
        Message 3 of 14, Sep 4, 2007
          In my experience, it is either:
          a) because they outgrow the starter package;
          b) because the features they were sold did not work as expected and
          eventually became a show-stopper; or
          c) because Marketing wanted more detail but IT balked at more and
          more custom tagging, and replaced the data collection with packet
          sniffing that emulates the tag server feeding the same analytics
          package (a sketch of that idea follows below).
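
          To sketch what (c) looks like in practice (the endpoint and
          parameter names below are hypothetical, not any particular
          vendor's API): the sniffer reconstructs each page view from
          captured HTTP traffic and fires the same style of hit the
          JavaScript tag would have sent, so the analytics package keeps
          receiving data it understands.

```python
# Hypothetical sketch of tag-server emulation: rebuild a page-view hit
# from fields a packet sniffer decoded off the wire, and send it to the
# same collection endpoint the page tag would have called.
import urllib.parse
import urllib.request

# Made-up collection endpoint; a real deployment would use the vendor's
# documented tag-server URL and query-parameter names.
COLLECTOR = "https://analytics.example.com/collect"

def emulate_tag_hit(client_ip: str, page: str, referrer: str,
                    user_agent: str) -> str:
    """Build the GET request a page tag would have fired for this view."""
    params = urllib.parse.urlencode({
        "page": page,
        "ref": referrer,
        "ip": client_ip,   # the sniffer sees the real client IP
        "ua": user_agent,
    })
    return f"{COLLECTOR}?{params}"

# One sniffed request, already parsed into fields upstream (a real
# sniffer would decode these from raw packets).
hit = emulate_tag_hit(
    client_ip="203.0.113.7",
    page="/products/cable-ties",
    referrer="https://www.google.com/search?q=cable+ties",
    user_agent="Mozilla/5.0",
)
print(hit)
# urllib.request.urlopen(hit)  # would deliver the hit to the collector
```

          The appeal for IT is that no page templates change; the trade-off
          is that anything only the browser knows (screen size, client-side
          events) is invisible to the sniffer.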

          Doug Watt

          --- In webanalytics@yahoogroups.com, "romanojon" <daniel@...> wrote:
          >
          > [snip]
        • Scribner, Craig (Web Analytics and Testi
          Message 4 of 14, Sep 4, 2007
            In my experience, the most pressure to switch comes either from
            new execs who are used to a different tool, or from buzz floating
            around about a new or different product.

            I went to a vendor's conference three or four years ago, and one
            of the panelists asked us, his fellow users, to raise our hands
            if our implementation was over a year old. To the 10 or 15
            percent of respondents who could, he grinned and said, "Feels
            good, doesn't it?" I don't think that statement was a validation
            of the specific vendor we were all using so much as a commentary
            on our profession. There is tremendous value in sticking with the
            system you've got. Being able to produce baselines and benchmarks
            at any time, for almost any user activity on our site, is
            critical for developing new tests (a quick sketch of how that
            feeds test design follows below). We're tweaking our capture
            methods all the time to bring into focus pieces about which we've
            only had partial, "blurry" data. But thankfully we've never had
            to start over.
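
            As an aside on how a stable baseline feeds test development, here
            is a minimal sketch with hypothetical numbers: estimate the
            baseline conversion rate from history, then size an A/B test to
            detect a given relative lift with the standard two-proportion
            normal approximation.

```python
# Hypothetical example: a historical baseline conversion rate makes it
# possible to size a new A/B test before running it.
from statistics import NormalDist

def sample_size_per_arm(baseline_rate: float, relative_lift: float,
                        alpha: float = 0.05, power: float = 0.80) -> int:
    """Per-arm n for a two-proportion z-test (normal approximation)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    pooled = (p1 + p2) / 2
    n = ((z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
          + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / (p2 - p1) ** 2)
    return int(n) + 1

# Made-up history: 2,400 orders over 120,000 visits -> 2.0% baseline.
baseline = 2400 / 120000
print(sample_size_per_arm(baseline, relative_lift=0.10))  # +10% lift
```

            Without an unbroken history, even the baseline rate input here is
            in doubt, which is exactly why starting over with a new tool
            hurts.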

            One of my most valuable contributions to my current company has been
            dealing with the buzz surrounding an alternate solution by saying,
            "Sounds like a great idea. Let me put it on the shelf for you."

            PS. Please ignore this message if you're a vendor. Give us better tools
            or we'll jump ship this second! :-)