
Why do web analysts switch packages?

  • robbinsteif
    Message 1 of 14, Aug 29, 2007
      I have read (maybe in the 2002 Marketing Sherpa WA report? I must be
      an elephant) that the average life of a WA package is 18 months -
      companies switch a *lot*.

      Why? Hosted analytics are so popular, and every time we switch, we
      lose all our data. It's true that now we have GA as a free resource
      to benchmark all our data and keep it even when we switch to a second
      package, but this was the trend (I believe) way back before GA.

      So why do analysts switch so much?


      Robbin
    • jay.allen@cutterbuck.com
      Message 2 of 14, Aug 30, 2007
        Some of this is probably upgrading. In particular, I think new companies
        tend to underrate analytics initially, then go and upgrade to a more
        robust system within a year or two. These initial switches probably
        bring the average way down.



        I have no evidence to prove this, just a hunch. But I've seen this
        happen on several occasions.



        Thanks,
        Jay Allen
        Cutter & Buck
        770-410-4700 x4708
      • Adam Berlinger
        Message 3 of 14, Aug 30, 2007
          Frustration with an incumbent solution, a bad customer service experience, or pricing.

          I think a more interesting question is who is losing most of their customers and to whom? In other words, where is all the business going and why?


          Thanks,
          Adam
          http://analyticsbyadam.blogspot.com

        • Debora Geary
          Message 4 of 14, Aug 30, 2007
            >
            > So why do analysts switch so much?
            >


            Great question! Top 3 reasons I see in my world:

            1) Vendor customer service - after 18 months, when the tool is still
            not working as advertised...
            2) New people (turnover, never investing properly in training, etc. -
            the new person has a different favorite tool)
            3) Tools in parallel - is it possible that the MarketingSherpa data
            doesn't account for people adding a second tool? I'm seeing this a lot.

            Interested in what others think!

            Debora
          • Do Duong
            Message 5 of 14, Aug 30, 2007
              My top 5 reasons.

              1) Blame it on the tool/vendor when you aren't generating any positive
              change from it
              2) Consultants who are tool whores
              3) Vendor customer service does suck. However, I think customer service
              should only be treated as customer service and not some kind of tier 1,
              best practices resource. The vendors are being treated unfairly. Go to
              a WA consultancy/agency if you want advanced implementation, business
              recommendation, and eventual revenue lift.
              4) Interdisciplinary field. There's a lot of ramp up to understand the
              web, WA technology, WA reports, and then finally finding insight. So
              blame it on the vendor.
              5) Self-education. This one applies to all organization systems. So
              blame it on the vendor.

              It's the chicken and the egg with #1 and #3 combined.

              I have never worked for a vendor (I have used Webtrends, Hitbox, and
              SiteCatalyst).

              Do

            • miles@milesbennett.co.uk
              Message 6 of 14, Aug 30, 2007
                I also think it is because these companies either 1. are starting up and have limited budgets, or 2. feel that they do not have the additional databases to opt for an enterprise solution like WebTrends or SAS eBIS.

                Reporting tools like IndexTools are fantastic for visually displaying the data and require limited knowledge of web programming.

                The main focus is to give businesses a taste of what is available and then they (in my experience) want more!

                Miles
                www.milesbennett.co.uk

              • Leslie Chacon
                Message 7 of 14, Aug 30, 2007
                  Hi,

                  I think for me, honestly, the experience I have encountered with each tool, and the help from the vendors to make sure that I have created the tool I needed (and not left it on defaults), can play a big part in switching tools. I have worked with a few vendors, and in my experience HBX reps were probably the worst to work with unless your account was "enterprise" (although they do have some really good reports, once you read the manual). I also work with Clicktracks, and I must say that their strategy seems to be that of a boutique company: friendly and timely staff who respond to you without trying to sell you more professional services. Another company that was really great (even for implementation!) was Clickpath.

                  They say a good analyst can pull actionable data from any tool, but I guess as one gets more comfortable in their practice and methodologies, one will want the tools and services they're comfortable using to optimize results.

                • bryan.cristina
                  Message 8 of 14, Aug 31, 2007
                    My take is that they either didn't understand the tool well enough, or
                    they got the wrong tool for the business in the first place.

                    I can see a lot of companies changing their packages after going with
                    something like Google and then realizing that it's just far too
                    limited to do the kinds of things they're looking to do. Of course,
                    that didn't happen right away, it took a year or two of someone really
                    understanding the tool, trends in the industry, and realizing that it
                    just wasn't going to happen with what they had.

                    And then there are the others who just don't know how to configure a
                    tool properly and blame the tool for everything. Sure, every tool
                    sucks in its own ways, but you don't know that until you actually use
                    more than one. In my previous job I basically had exposure to one tool
                    and I had my issues with parts of it, but as I've moved on from that
                    company and have gotten to work with many tools in this field, I've
                    realized that:

                    -Sales people suck, generally, though there are some gems out there
                    -Information about the products on the company's website is worthless
                    -All products are great at something
                    -All products suck at something

                    I think if you make the right choice in the first place as to which
                    tool is a good fit, you can take advantage of the tool's strengths and
                    hopefully not be too bothered by its weaknesses.
                  • jon bovard
                    Message 9 of 14, Aug 31, 2007
                      The main reasons web analytics "fails" are as follows.

                      Typical sequence for a loser organisation:
                      1. Ring vendors
                      2. Organise meetings/demos
                      3. Listen to the salesman, read impressive case-study spam,
                      listen to rhetorical BS
                      4. Have internal meetings, spend time looking at demos
                      5. Perhaps trial a vendor
                      6. Make a decision based on gut feel, coolest reports, funkiest
                      gadgets, best sales pitch, cheapest price, ease of integration,
                      client list, or best reputation
                      7. Put feet up on desk and wait for god or the vendor to sort it out

                      The correct sequence for a winner organisation:
                      1. Define the business model/strategy
                      2. Identify the drivers for success
                      3. Define specific KPIs based on those drivers and the levers
                      that improve those drivers
                      4. Find a tool that reports on those KPIs

                      Main reasons for failure:
                      - NUMBER 1 REASON!! Poor understanding of business drivers and/or
                      the business model, and the inability of commercial staff to
                      define KPIs based on business strategy/business drivers
                      - Inability of the tool to report KPIs
                      - Inability to combine web data with other data
                      - Poor vendor support

                      Define strategy and KPIs, then find the tool. Not the other way
                      around, folks.
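Step 4 of the winner sequence can be pictured as a simple coverage check. The sketch below is purely illustrative; the tool names and KPI sets are invented, not real product capabilities:

```python
# Toy sketch of step 4: check candidate tools against the KPIs
# defined in steps 1-3. Every name here is hypothetical.
REQUIRED_KPIS = {"conversion_rate", "revenue_per_visit", "lead_form_completion"}

TOOL_CAPABILITIES = {
    "Tool A": {"pageviews", "conversion_rate", "revenue_per_visit"},
    "Tool B": {"pageviews", "conversion_rate", "revenue_per_visit",
               "lead_form_completion"},
}

def coverage_gaps(required, tools):
    """For each candidate tool, list the required KPIs it cannot report."""
    return {name: sorted(required - caps) for name, caps in tools.items()}

gaps = coverage_gaps(REQUIRED_KPIS, TOOL_CAPABILITIES)
print(gaps)  # "Tool A" is missing lead_form_completion; "Tool B" has no gaps
```

The point is the ordering: the required-KPI set exists before the first vendor demo is watched, so the comparison is driven by the business rather than the sales pitch.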

                      cheers
                      jon


                    • Paula Thornton
                      Message 10 of 14, Aug 31, 2007
                        Jon gave a great list. And while we always like to see lists (they help us
                        organize the critical information after the fact), I can tell you that the
                        lists are nearly useless: you can't see going in what you realize coming
                        out.

                        While my involvement in tools has been more from a behavioral perspective
                        than a transactional perspective (e.g. absolutely no interest in SEO or
                        commerce, but in effectiveness of design and assessment of needs), you don't
                        realize the value or the limitations of the tools until you work with the
                        data.

                        People underestimate data. They see it as a collection of inert artifacts.
                        Data has personality -- and it changes. The appropriateness of a tool (and
                        how to tune it) are very specific to the business model and the data that
                        goes with it. You can't intellectualize that on the way in.

                        I can categorically tell you that the things I've found most valuable or
                        least valuable about tools were only discovered when running the data
                        through them.

                        Consider this analogy (one that I don't even have first-hand experience
                        with). Say you're looking for a pasta maker. You do all the research and
                        even look at blogs/discussions. You pick one. It's an absolute disaster.
                        Why? Because you live in a high altitude, dry environment and the texture of
                        your pasta dough is far different than the other conditions in which the
                        device performed flawlessly.


                      • robbinsteif
                        Message 11 of 14, Aug 31, 2007
                          What a great analogy, pasta making. (Of course, the tool whore is my
                          favorite entry, it is too bad that I am not running a contest.)

                          Of course, this would imply that we mess up the first time only
                          because we don't understand our atmosphere. So the second time we
                          should know what to buy. (This reminds me of a great blog post, but
                          everyone has already read it by now.)

                          So then we can say, "It makes perfect sense that people change the
                          first time. Why do we change twice and three times and...."??

                          It could be that a) people still sit around with their feet up on
                          the table and watch the fancy analyses, i.e. the Jon Bovard answer;
                          b) someone new comes in and doesn't know what has to get measured
                          (and of course, no one else in the organization understands WA) and
                          we go through it again; or c) we really are tool whores.

                          Or it could be that the needs of the organization change.

                          These were all such great thoughts, yours and everyone else's.

                          Robbin

                        • romanojon
                          Message 12 of 14, Sep 1, 2007
                            Hey Robbin,

                            As you know, we use the Omniture grouping of products. We've had it
                            for a year now and dealt with the shifting and changing. I admit it
                            doesn't surprise me that analysts would switch. It does surprise me
                            that you say it happens much more frequently than I would have
                            anticipated. For some reason it just doesn't seem likely in the age
                            of much more sophisticated tag-based and hybrid solutions. Maybe
                            you have some stats on that. :)

                            Omniture SiteCatalyst does exhibit some limitations which we would
                            like to overcome, but, in all, I think Paul and I agree that the
                            adoption of a new solution is not in our best interest. Here are
                            my reasons:

                            1. The costs of a new solution (implementation, interface training,
                            agency adaptation, and re-centering of data) are exorbitant and
                            prohibitive. I see it as completely unnecessary in the absence of
                            the promise of a superior insight-driving crystal ball.

                            2. The ability to really nurture something like predictive analysis in
                            your process relies on the ability to seamlessly compare data in an
                            apples to apples environment and to isolate events which cause
                            movement in your trend lines and prepare for them. Package-hopping
                            removes that necessary aggregate reporting. There is no way around
                            this without taking enormous steps at the integrated dashboard level
                            to accommodate the transition.

                            3. On the macro scale of this topic, constant solution-hopping by
                            analysts everywhere only leads to an abundance of underdeveloped
                            tools wrought from the impetuous complaints of half-baked analytics
                            programs. Think on this one most: if at every advent of a new KPI
                            or metric of relative importance we switched to the vendor who had
                            it packaged, or who offered some better-marketed version of
                            something we each already have but are unaware of, then the true
                            structure of what we're operating in is made of sticks and
                            rice-paper.

                            Before this becomes an exhaustive work for a Saturday AM, I'll
                            quit there, but I will say: if the trend and propensity of
                            analysts to switch tools is occurring with the frequency you
                            describe, I would become critical of it and pay very close
                            attention to the solution-vendor market tectonics. Watch as the
                            innovations stagnate to meet the impulsive demands of a consumer
                            and not the comprehensive needs of the practitioner.

                            Sincerely,

                            Daniel W. Shields
                            Analyst
                            CableOrganizer.com

                            http://danalytics.blogspot.com



                          • metronomelabs
                            Message 13 of 14, Sep 4, 2007
                              In my experience, it is either:
                              a) because they outgrow the starter package,
                              b) because the features they were sold did not work as expected and
                              eventually became a show-stopper, or
                              c) because Marketing wanted more detail but IT balked at more and
                              more custom tagging, and the data collection was replaced with
                              packet sniffing that emulates the tag server and feeds the same
                              analytics package.

                              Doug Watt

                              --- In webanalytics@yahoogroups.com, "romanojon" <daniel@...> wrote:
                              >
                              > Hey Robbin,
                              >
                              > As you know, we use the Omniture grouping of products. We've had it
                              > for a year now and dealt with the shifting and changing. I admit it
                              > doesn't surprise me that analysts would switch. It does surprise me
                              > that you infer this happens much more frequently than I anticipated.
                              > For some reason it just doesn't seem likely in the age of much more
                              > sophisticated tag-based and hybrid solutions. Maybe you have some
                              > stats on that. :)
                              >
                              > Omniture SiteCatalyst does exhibit some limitations which we would
                              > like to overcome, but, in all, I think Paul and I agree that the
                              > adoption of a new solution is not in our best interest. Here are my
                              > reasons for this:
                              >
                              > 1. The cost of a new solution, implementation, interface training,
                              > agency adaptation and re-centering of data is exorbitant and
                              > prohibitive. I see it as completely unnecessary in the absence of
                              > the promise of a superior insight-driving crystal ball.
                              >
                              > 2. The ability to really nurture something like predictive analysis
                              > in your process relies on the ability to seamlessly compare data in
                              > an apples-to-apples environment and to isolate events which cause
                              > movement in your trend lines and prepare for them. Package-hopping
                              > removes that necessary aggregate reporting. There is no way around
                              > this without taking enormous steps at the integrated dashboard
                              > level to accommodate the transition.
                              >
                              > 3. On the macro scale of this topic, constant solution-hopping by
                              > analysts everywhere only leads to an abundance of underdeveloped
                              > tools wrought from the impetuous complaints of half-baked analytics
                              > programs. Think of this one most... if at every advent of a new KPI
                              > or metric of relative importance we switched to the vendor who had
                              > this packaged or offered some better-marketed version of something
                              > we each already have but are unaware of, the true structure of what
                              > we're operating in is made of sticks and rice-paper.
                              >
                              > In the interest of this not becoming an exhaustive work for a
                              > Saturday AM, I'll quit there. But I will say: if analysts are
                              > switching tools with the frequency you describe, I would become
                              > critical of it and pay very close attention to the solution-vendor
                              > market tectonics. Watch as the innovations stagnate to meet the
                              > impulsive demands of a consumer and not the comprehensive needs of
                              > the practitioner.
                              >
                              > Sincerely,
                              >
                              > Daniel W. Shields
                              > Analyst
                              > CableOrganizer.com
                              >
                              > http://danalytics.blogspot.com
                            • Scribner, Craig (Web Analytics and Testing)
                              Message 14 of 14, Sep 4, 2007
                                In my experience, the most pressure to switch comes either from new
                                execs who are used to a different tool, or from buzz floating
                                around about a new or different product.



                                I went to a vendor's conference three or four years ago, and one of
                                the panelists asked us, his fellow users, to raise our hands if our
                                implementation was over a year old. To those 10 or 15 percent of
                                respondents he grinned and said, "Feels good, doesn't it?" I don't
                                think that statement was a validation of the specific vendor we
                                were all using, but more a commentary on our profession. There is
                                tremendous value in sticking with the system you've got. Being able
                                to produce baselines and benchmarks at any time for almost any user
                                activity on our site is critical for developing new tests. We're
                                tweaking our capture methods all the time to bring into focus
                                pieces about which we've only had partial, "blurry" data. But
                                thankfully we've never had to start over.



                                One of my most valuable contributions to my current company has been
                                dealing with the buzz surrounding an alternate solution by saying,
                                "Sounds like a great idea. Let me put it on the shelf for you."



                                PS. Please ignore this message if you're a vendor. Give us better tools
                                or we'll jump ship this second! :-)




