Re: Why do web analysts switch packages?

  • romanojon
    Message 1 of 14, Sep 1, 2007
      Hey Robbin,

      As you know, we use the Omniture grouping of products. We've had it
      for a year now and dealt with the shifting and changing. I admit it
      doesn't surprise me that analysts would switch. It does surprise me
      that you report it happening much more frequently than I would have
      anticipated.
      For some reason it just doesn't seem likely in the age of much more
      sophisticated tag-based and hybrid solutions. Maybe you have some
      stats on that. :)

      Omniture SiteCatalyst does exhibit some limitations which we would
      like to overcome, but, in all, I think Paul and I agree that the
      adoption of a new solution is not in our best interest. Here are my
      reasons:

      1. The costs of a new solution (implementation, interface training,
      agency adaptation, and re-centering of data) are exorbitant and
      prohibitive. I see it as completely unnecessary in the absence of
      the promise of a superior, insight-driving crystal ball.

      2. Nurturing something like predictive analysis in your process
      relies on the ability to seamlessly compare data in an
      apples-to-apples environment and to isolate events which cause
      movement in your trend lines and prepare for them. Package-hopping
      removes that necessary aggregate reporting. There is no way around
      this without taking enormous steps at the integrated dashboard level
      to accommodate the transition.

      3. On the macro scale of this topic, constant solution-hopping from
      analysts everywhere only leads to an abundance of underdeveloped tools
      wrought from the impetuous complaints of half-baked analytics
      programs. Think about this one: if at every advent of a new KPI or
      metric of relative importance we switched to the vendor who had it
      packaged, or who offered some better-marketed version of something
      we each already have but are unaware of, then the true structure of
      what we're operating in is made of sticks and rice paper.

      To keep this from becoming an exhaustive work for a Saturday
      morning, I'll quit there, but I will say: if the trend and propensity of
      analysts to switch tools is occurring with the frequency which you
      describe, I would become critical of it and pay very close attention
      to the solution-vendor market tectonics. Watch as the innovations
      stagnate to meet the impulsive demands of a consumer and not the
      comprehensive needs of the practitioner.

      Sincerely,

      Daniel W. Shields
      Analyst
      CableOrganizer.com

      http://danalytics.blogspot.com



      --- In webanalytics@yahoogroups.com, "robbinsteif" <steif@...> wrote:
      >
      > I have read (maybe in the 2002 Marketing Sherpa WA report? I must be
      > an elephant) that the average life of a WA package is 18 months -
      > companies switch a *lot*
      >
      > Why? Hosted analytics are so popular, and every time we switch, we
      > lose all our data. It's true that now we have GA and a free resource
      > to benchmark all our data and keep even when we switch a second
      > package, but this was the trend (I believe) way back before GA.
      >
      > So why do analysts switch so much?
      >
      >
      > Robbin
      >
    • metronomelabs
      Message 2 of 14, Sep 4, 2007
        In my experience, it is either:
        a) because they outgrow the starter package
        b) because the features they were sold did not work as expected and
        eventually became a show-stopper.
        c) Marketing wanted more detail, but IT baulked at more and more
        custom tagging and replaced the data collection with packet
        sniffing that emulated the tag server, feeding the same analytics
        package.

        Doug Watt

      • Scribner, Craig (Web Analytics and Testi
        Message 3 of 14, Sep 4, 2007
          In my experience, the most pressure to switch comes either from
          new execs who are used to a different tool, or from buzz
          floating around about a new or different product.



          I went to a vendor's conference three or four years ago, and one of
          the panelists asked his fellow users to raise their hands if their
          implementation was over a year old. To the 10 or 15 percent who
          responded, he grinned and said, "Feels good, doesn't it?" I don't
          think that statement was a validation of the specific vendor we were
          all using, but rather a commentary on our profession. There is
          tremendous value in sticking with the system you've got. Being able
          to produce baselines and benchmarks at any time for almost any user
          activity on our site is critical for developing new tests. We're
          tweaking our capture methods all the time to bring into focus pieces
          about which we've only had partial, "blurry" data. But thankfully
          we've never had to start over.



          One of my most valuable contributions to my current company has been
          dealing with the buzz surrounding an alternate solution by saying,
          "Sounds like a great idea. Let me put it on the shelf for you."



          P.S. Please ignore this message if you're a vendor. Give us better
          tools or we'll jump ship this second! :-)




