Mixing Analytics Tools - Great Advice from the Pros

  • Daniel
    Message 1 of 17 , Feb 24, 2010
      I happen to be attending Online Marketing Summit 2010 in San Diego today. Just finished listening to a panel of six experts on Web Analytics. A good part of the conversation covered a topic of frequent concern here.

      The consensus of these folks? If you are using multiple tools, don't even think about trying to get them to give you identical results. The way the tools work, differences in page load times, latency, and all sorts of other variables that are out of your control (and for the most part invisible) all contribute to inevitable disparity. Accuracy is not the watchword.
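
      A rough way to see why the numbers can never line up exactly: if one tag fires at the top of the page and the other only fires once the page has finished loading, visitors who leave in between are counted by the first tool and not the second. The sketch below is a toy simulation of that effect; the 3% abandonment figure and the tag timing are illustrative assumptions, not measured values.

          // Toy simulation: tag A fires immediately, tag B fires after the page
          // finishes loading, and some visitors leave in between. All numbers
          // here are illustrative assumptions only.
          function simulateDay(pageViews: number, abandonBeforeSecondTag: number) {
            let toolA = 0;
            let toolB = 0;
            for (let i = 0; i < pageViews; i++) {
              toolA++;                                      // tag A always fires
              if (Math.random() > abandonBeforeSecondTag) {
                toolB++;                                    // tag B fires only if the visitor stayed
              }
            }
            return { toolA, toolB };
          }

          const { toolA, toolB } = simulateDay(100_000, 0.03); // assume ~3% abandon early
          console.log(`Tool A: ${toolA}, Tool B: ${toolB}, ` +
                      `gap: ${(100 * (toolA - toolB) / toolA).toFixed(1)}%`);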

      Eric Peterson of Web Analytics Demystified said quite strongly that he recommends not using multiple tools. Most such use, he says, arises from not understanding how to use a given tool properly or from having chosen the wrong tool initially.

      Great stuff.
    • Pierre DeBois
      Message 2 of 17 , Feb 25, 2010
        Did Eric offer ideas on how tools should be selected, or did the topic
        mostly focus on why mixing tools is bad? More analytics tools are being
        introduced or are adding features, and I would love to hear more recent
        thoughts on selection.

      • shamel67
        Message 3 of 17 , Feb 25, 2010
          Peterson is right: generally you want to avoid using multiple web analytics tools. But the answer is never as straightforward as that, and my take is posted at http://blog.immeria.net/2010/02/mixing-analytics-tools-my-take.html

          Stéphane Hamel
          http://immeria.net

        • eefsafe
          Message 4 of 17 , Feb 25, 2010
            Hey Dan,

            Thanks for bringing this up ... and what an awesome panel that was yesterday, huh? Lots of people don't know this, but Matt Belkin from Omniture was one of the very, very early contributors to this group, so it was nice to see him again. And as usual, Bill Bruno, Ali Behnam, Amanda Kahlow, and Enrique Gonzales (AARP) had a ton to offer.

            Some small correction on your repeat of my comment, and S. Hamel got this right in his follow-up on this thread: There are clear reasons to have multiple digital measurement tools when you consider the broad set.

            Traditional clickstream analytics, voice of customer (ForeSee, iPerceptions, OpinionLab, hundreds more), and customer experience management are three I typically recommend my clients use concurrently because collectively they broaden the range of questions that can be asked and answered (vs. clickstream alone). If you're interested in how these three tools can be integrated I would refer you to our "Free Whitepapers" section on the Demystified web site where you can download two papers on the "Web Analytics Ecosystem":

            http://www.webanalyticsdemystified.com/content/white-papers.asp
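
            To make the "broader questions" point concrete, here is a minimal sketch of the kind of join that becomes possible once a clickstream tool and a voice-of-customer tool are both in place: segmenting conversion by stated satisfaction. The field names and the shared visitor id are assumptions for illustration, not any vendor's actual export format.

                // Hypothetical join of clickstream visits and survey responses.
                // Field names and the shared visitorId key are illustrative assumptions.
                interface ClickstreamVisit { visitorId: string; pages: number; converted: boolean }
                interface SurveyResponse   { visitorId: string; satisfaction: number } // e.g. 1-10

                function conversionBySatisfaction(visits: ClickstreamVisit[],
                                                  surveys: SurveyResponse[]): Map<number, number> {
                  const scoreByVisitor = new Map(
                    surveys.map(s => [s.visitorId, s.satisfaction] as [string, number]));
                  const total = new Map<number, number>();
                  const converted = new Map<number, number>();
                  for (const v of visits) {
                    const score = scoreByVisitor.get(v.visitorId);
                    if (score === undefined) continue;            // visitor never took the survey
                    total.set(score, (total.get(score) ?? 0) + 1);
                    if (v.converted) converted.set(score, (converted.get(score) ?? 0) + 1);
                  }
                  const rates = new Map<number, number>();
                  for (const [score, n] of total) rates.set(score, (converted.get(score) ?? 0) / n);
                  return rates;                                   // conversion rate per satisfaction score
                }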

            However, and this was the comment I actually made, I am still unconvinced of the value and much of the reasoning behind ** multiple tools deployed that primarily do the same thing. ** For example, Google Analytics and Omniture SiteCatalyst deployed on the same pages.
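
            For readers who have not seen that co-deployment, it looks roughly like the sketch below: the same page view is reported independently to both tools, which is exactly where the two sets of numbers come from. This is a simplified sketch based on the 2010-era ga.js async queue and an s_code.js-style object; the account id and page name are placeholders.

                // Simplified sketch of redundant co-deployment: one page view, two tools.
                // Based on the ga.js async queue and an s_code-style "s" object;
                // the account id and page name below are placeholders.
                declare const _gaq: unknown[][];                    // Google Analytics (classic) queue
                declare const s: { pageName?: string; t(): void };  // SiteCatalyst tracking object

                _gaq.push(['_setAccount', 'UA-XXXXXX-1']);          // placeholder GA account
                _gaq.push(['_trackPageview']);                      // page view, as counted by GA

                s.pageName = 'home';                                // placeholder page name
                s.t();                                              // same page view, as counted by SiteCatalyst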

            While this is unfortunately common, and trust me I have heard all of the reasons, what I often see when I find two (or more) of the same tools being used is a ** breakdown in web analytics process, strategy, and governance. ** Typically "our business users don't like/understand/want to use Solution A so we also deployed Solution B which is easier to use" or "we don't trust Solution A completely so we use Solution B as a backup."

            As I said in the panel, when you co-deploy same solutions you end up with "two copies of the books." I personally have observed people going from one solution to the other looking for "the right answer" which, of course, does not exist. Breakdown in process.

            Does that make sense?

            I actually wrote a long piece on why I think this happens called "The Coming Bifurcation in Web Analytics Tools" a few weeks back that you might be interested in reading. Tons of smart folks commented on the post so make sure to read the comments as well:

            http://bit.ly/9DfLxz

            Anyway, this is a recurring theme in the group, so it will be interesting to see what other people have to offer.

            Eric T. Peterson
            Web Analytics Demystified, Inc.
            http://www.webanalyticsdemystified.com






          • Daniel
            Message 5 of 17 , Feb 25, 2010
              In the session I was in, he didn't. His observation was just that most of the tools work as well as you know how to work with them. Each has advantages. So you start by figuring out your needs, match up with a tool, master it, and stay with it.

              Only if/when you discover a need that legitimately cannot be met with your existing tool do you consider switching, not adding, a tool.

              His site undoubtedly has more.

            • Craig Scribner
              Message 6 of 17 , Feb 25, 2010
                A couple of years ago my company’s PPC Manager requested that I add a second
                tag to our website, and today I deeply regret *not* doing it.



                He wanted Google Analytics tags when we already had Omniture tags, and I
                shot him down with all the usual lines about how we could get all of the
                insights he was asking for with SiteCatalyst, and in my heart I was just
                dreading having to answer people’s questions about why one tool reported X
                visitors and the other one reported Y.
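
                When those questions do come, the discussion tends to go better framed as
                "are the two tools within an agreed tolerance?" rather than "why don't they
                match?" A minimal sketch of that check, where the 5% threshold is an
                arbitrary example, not a standard:

                    // Sketch: flag days where two tools disagree beyond an agreed tolerance.
                    // The 5% default threshold is an arbitrary example, not a standard.
                    interface DailyCount { date: string; visitors: number }

                    function flagDiscrepancies(toolA: DailyCount[], toolB: DailyCount[],
                                               tolerance = 0.05): string[] {
                      const bByDate = new Map(
                        toolB.map(d => [d.date, d.visitors] as [string, number]));
                      const flagged: string[] = [];
                      for (const a of toolA) {
                        const b = bByDate.get(a.date);
                        if (b === undefined) continue;
                        const gap = Math.abs(a.visitors - b) / Math.max(a.visitors, b);
                        if (gap > tolerance) flagged.push(`${a.date}: ${(100 * gap).toFixed(1)}% apart`);
                      }
                      return flagged;
                    }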



                But every tool has strengths and weaknesses, and since I started consulting
                for other companies, there have been many times that having a secondary tool
                available has bailed me out when I hit a roadblock with the primary tool.
                Plus, Google Analytics’ Advanced Segments can deliver reports that I’d
                need Omniture’s Discover to rival.



                Best regards,

                Craig



              • Daniel
                Message 7 of 17 , Feb 25, 2010
                  Eric,

                  Thanks for clearing up my confusing statement. I heard it the way you said it here, but I obviously didn't say it clearly.

                  Your point is well taken and I'm glad you pointed it out. I'll look at the additional sources you pointed to in this post.

                  Dan

                • Stephane Hamel
                  Message 8 of 17 , Feb 26, 2010
                    Thanks for adding and clarifying Eric, and can I say *yes*, we're in total agreement on this one! :)

                    Stéphane Hamel
                    http://immeria.net

                  • Blakeley, Robert
                    Message 9 of 17 , Feb 26, 2010
                      For Omniture customers, if you have not seen the new Idea Exchange it's worth a look even if you do not have product suggestions of your own. It is clear Omniture is paying close attention to this. Smart idea, smartly done.

                      http://ideas.omniture.com


                      Robert Blakeley | Product Manager | BI
                      212.624.3854



                    • David Simmons
                      Message 10 of 17 , Feb 26, 2010
                        Craig,



                        I completely agree. As flexible as SiteCatalyst is, I’m finding very
                        often, also as a consultant, that GA will answer many of the most
                        important questions I need to ask of it in a matter of seconds or
                        minutes, particularly with the Advanced Segments functionality.



                        OTOH, GA’s high-level product approach to event tracking leaves me
                        slightly mystified at times (why, for example, can’t events be used in
                        conversion funnels?! And why are events in a separate section of the
                        report interface?), and further, from a straight feature POV, it leaves
                        quite a lot out.
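
                        For anyone who has not used them, the event call itself is the easy
                        part; it is the reporting side where events sit apart from page-based
                        funnels. A sketch using GA classic syntax, with placeholder category,
                        action, label and value:

                            // GA (classic) event tracking call; the category, action, label and
                            // value below are placeholders. Events land in their own report
                            // section rather than in the page-based funnel reports.
                            declare const _gaq: unknown[][];

                            _gaq.push(['_trackEvent',
                              'Video',          // category (placeholder)
                              'Play',           // action (placeholder)
                              'Homepage hero',  // optional label (placeholder)
                              0                 // optional integer value
                            ]);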



                        But regardless, as much as I like Omniture, and as often as I’ve heard
                        sophisticated people dismiss GA as a product for amateurs at various
                        industry events over the years, I must say that I do think they complement
                        each other extremely well.



                        At my last job, I helped design our own video measurement & analytics tool
                        because none of the analytics tools on the market, with the exception of
                        Visual Sciences (now OI), came remotely close to meeting our requirements. So
                        we used this tool plus GA (mainly for basic analysis of traffic sources &
                        campaigns), which worked well.



                        Obviously the same metric (to the extent that it is the same, of course) as
                        reported in one tool will never match the other, for the usual reasons, but
                        so what?*





                        David



                        *assuming you’ve considered different metric definitions, the tags are OK,
                        time zones are the same, and so forth…



                      • David Simmons
                        Message 11 of 17 , Feb 26, 2010
                          I think Eric’s point about process is tremendously important, especially
                          with respect to tagging problems, which seemed to be last week’s big theme
                          in the forum, to the extent that a lot of people seem to want to dispense
                          with tag-based measurement altogether! :)



                          It seems that in a lot of large organizations, tagging is done by a
                          completely different group that is often not accountable to the group that
                          is responsible for analytics or insights (regardless of whether that group
                          sits in a dedicated analytics department, marketing, market research, or
                          product management, etc.), and there are no easy solutions, because the
                          same formal (and, arguably just as importantly, informal) organizational
                          structure has existed for 10+ years and is impossible to change. This seems
                          to be especially common in companies that have limited experience with
                          iterative marketing, product, or especially IT development cycles.



                          They know how to determine requirements, define scope, build or buy or hire
                          something, test it, put it into production, and then “set it and forget it”
                          to move on to the next project.



                          There simply may not be a process in place to get buy-in and approval for
                          anything ongoing, such as tag change proposals from the analytics staff; to
                          determine the priority of the changes (even if they are bug fixes); to
                          perform QA on the changes in a staging environment; and then to release the
                          changes into a production environment.



                          IMO, what this often may mean in practice is that an analytics tool that
                          requires any degree of customization beyond the initial instrumentation
                          (to measure new site features, to answer analysts’ new and previously
                          unforeseen questions, e.g. by adding new tracking variables or uploading
                          lookup tables to the analytics vendor, etc.) may not be as effective for
                          that company as something like Google Analytics can be now, where more
                          questions can be answered right out of the box, without having to make
                          changes to the instrumentation and thus without involving any other parties
                          in the organization.
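
                          To make the “changes to the instrumentation” point concrete: capturing a
                          new dimension typically means editing the tags on the pages themselves,
                          which is exactly the change that has to go through another team’s process.
                          A hedged sketch using GA classic’s custom variable call, where the slot,
                          name and values are placeholders:

                              // Sketch of the kind of instrumentation change that needs the other
                              // team: capturing a new dimension means editing the on-page tag.
                              // GA classic custom variable call; slot, name and values are placeholders.
                              declare const _gaq: unknown[][];

                              _gaq.push(['_setCustomVar',
                                1,              // slot 1-5 (placeholder)
                                'MemberType',   // dimension name (placeholder)
                                'Registered',   // value (placeholder)
                                2               // scope: 2 = session-level
                              ]);
                              _gaq.push(['_trackPageview']); // custom variable is sent with the next hit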



                          My 2 cents….



                          David



                        • brian_clifton_uk
                          Message 12 of 17 , Feb 26, 2010
                            Wow! Eric Peterson and Matt Belkin in the same room. I would have loved to have seen that ;)

                            Just to point out two solid reasons for using a second WA tool that collects the same data...

                             1] If you are a GA user, data cannot be reprocessed. Say, for example, you set up an exclude filter and make a mistake by excluding all your visitor traffic - you cannot go to Google and ask for a reprocess. The data is lost. To avoid this, run Urchin software alongside GA and simply reprocess your logfiles when needed (Urchin = server-side web analytics software derived from GA)*; a sketch of this reprocessing idea follows after point 2.

                            2] If you are a publisher and require an ABCE audit for proof of advertising rate card purposes, ABCE require the raw visit data. Google do not pass data to *any* third party so that is not going to work. Again use Urchin alongside GA to provide audit data to ABCE or similar.
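
                             On the reprocessing point above, a minimal sketch of why keeping raw logs preserves the option: a botched exclude filter in hosted GA is applied at collection time and the data is gone, whereas with logfiles you simply re-run the corrected filter over the same files. The log format and filter below are simplified assumptions, not Urchin's actual configuration.

                                 // Sketch: with raw logfiles on hand, a bad filter is just re-run with
                                 // the corrected rule; a hosted tool that filters at collection time
                                 // cannot recover the data. Log format and filter are simplified assumptions.
                                 interface LogLine { ip: string; path: string; timestamp: string }

                                 function reprocess(rawLog: LogLine[], excludeIpPrefix: string): LogLine[] {
                                   return rawLog.filter(line => !line.ip.startsWith(excludeIpPrefix));
                                 }

                                 // First pass used an empty prefix by mistake and excluded everything;
                                 // with the raw log the corrected filter can simply be run again.
                                 // const firstPass = reprocess(rawLog, '');        // mistake: drops all traffic
                                 // const corrected = reprocess(rawLog, '10.0.0.'); // excludes internal IPs only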

                             Some useful articles from me:
                             What is Urchin 6
                            http://www.advanced-web-metrics.com/blog/2010/01/29/what-is-urchin-6/

                            Hosted v Software v Hybrid tools
                            http://www.advanced-web-metrics.com/blog/2007/10/07/hosted-v-software-v-hybrid-tools/

                            Backup your Google Analytics data and use Urchin
                            http://www.advanced-web-metrics.com/blog/2007/10/17/backup-your-ga-data-locally/

                            *GA is actually derived from Urchin

                            Best regards, Brian
                            Author, Advanced Web Metrics with Google Analytics
                             2nd edition (complete re-write) launches in 2 weeks!



                          • eefsafe
                            Message 13 of 17 , Feb 28, 2010
                              Brian,

                              You make two great cases ... for using Urchin software. Why would you also use GA if your business has the reprocessing and auditing requirement? Why not just use Urchin and avoid the litany of issues that arise from keeping "two sets of books?"

                              That ABCe thing especially seems to be a pretty strict requirement ... doesn't using GA internally but Urchin for external/audit create reconciliation issues?

                              Seems like yet another case of a lack of process and strategy to me.

                              Eric T. Peterson
                              Web Analytics Demystified, Inc.
                              http://www.webanalyticsdemystified.com

                              P.S. It was nice to see Matt again. Apparently he has left the mother-ship and now lives in sunny San Diego managing Omniture's visitor acquisition products. We didn't get to catch up, but Bill Bruno did put the moderator up to asking "is web analytics easy" at which point we realized ... that nothing ever really changes.

                              Ask me sometime about how he and I answered that question from the panel. Was pretty amusing from my POV. ;-)




                              --- In webanalytics@yahoogroups.com, "brian_clifton_uk" <brian@...> wrote:
                              >
                              > Wow! Eric Peterson and Matt Belkin in the same room. I would have loved to have seen that ;)
                              >
                              > Just to point out two solid reasons for using a second WA tool that collects the same data...
                              >
                              > 1] If you are a GA user, data cannot be reprocessed. Say for example, you setup an exclude filter and make a mistake by excluding all your visitor traffic - you cannot go to Google and ask for a reprocess. The data is lost. To avoid this, run Urchin software alongside GA and simply reprocess your logfiles when needed (Urchin = server-side web analytics software derived from GA)*
                              >
                              > 2] If you are a publisher and require an ABCE audit for proof of advertising rate card purposes, ABCE require the raw visit data. Google do not pass data to *any* third party so that is not going to work. Again use Urchin alongside GA to provide audit data to ABCE or similar.
                              >
                              > Some useful articles form me:
                              > What is Urhin 6
                              > http://www.advanced-web-metrics.com/blog/2010/01/29/what-is-urchin-6/
                              >
                              > Hosted v Software v Hybrid tools
                              > http://www.advanced-web-metrics.com/blog/2007/10/07/hosted-v-software-v-hybrid-tools/
                              >
                              > Backup your Google Analytics data and use Urchin
                              > http://www.advanced-web-metrics.com/blog/2007/10/17/backup-your-ga-data-locally/
                              >
                              > *GA is actually derived from Urchin
                              >
                              > Best regards, Brian
                              > Author, Advanced Web Metrics with Google Analytics
                              > 2nd edition (complete re-write) launched in 2 weeks!
                              >
                              >
                              >
                              > --- In webanalytics@yahoogroups.com, "Daniel" <dan@> wrote:
                              > >
                              > > Eric,
                              > >
                              > > Thanks for clearing up my confusing statement. I heard it the way you said it here, but I obviously didn't say it clearly.
                              > >
                              > > Your point is well taken and I'm glad you pointed it out. I'll look at the additional sources you pointed to in this post.
                              > >
                              > > Dan
                              > >
                              > > --- In webanalytics@yahoogroups.com, "eefsafe" <eric.peterson@> wrote:
                              > > >
                              > > > Hey Dan,
                              > > >
                              > > > Thanks for bringing this up ... and what an awesome panel that was yesterday huh? Lots of people don't know but Matt Belkin from Omniture was one of the very, very early contributors to this group so it was nice to see him again. And as usual, Bill Bruno, Ali Behnam, Amanda Kahlow, and Enrique Gonzales (AARP) had a ton to offer.
                              > > >
                              > > > Some small correction on your repeat of my comment, and S. Hamel got this right in his follow-up on this thread: There are clear reasons to have multiple digital measurement tools when you consider the broad set.
                              > > >
                              > > > Traditional clickstream analytics, voice of customer (ForeSee, iPerceptions, OpinionLab, hundreds more), and customer experience management are three I typically recommend my clients use concurrently because collectively they broaden the range of questions that can be asked and answered (vs. clickstream alone). If you're interested in how these three tools can be integrated I would refer you to our "Free Whitepapers" section on the Demystified web site where you can download two papers on the "Web Analytics Ecosystem":
                              > > >
                              > > > http://www.webanalyticsdemystified.com/content/white-papers.asp
                              > > >
                              > > > However, and this was the comment I actually made, I am still unconvinced of the value and much of the reasoning behind ** multiple tools deployed that primarily do the same thing. ** For example, Google Analytics and Omniture SiteCatalyst deployed on the same pages.
                              > > >
                              > > > While this is unfortunately common, and trust me I have heard all of the reasons, what I often see when I find two (or more) of the same tools being used is a ** breakdown in web analytics process, strategy, and governance. ** Typically "our business users don't like/understand/want to use Solution A so we also deployed Solution B which is easier to use" or "we don't trust Solution A completely so we use Solution B as a backup."
                              > > >
                              > > > As I said in the panel, when you co-deploy same solutions you end up with "two copies of the books." I personally have observed people going from one solution to the other looking for "the right answer" which, of course, does not exist. Breakdown in process.
                              > > >
                              > > > Does that make sense?
                              > > >
                              > > > I actually wrote a long piece on why I think this happens called "The Coming Bifurcation in Web Analytics Tools" a few weeks back that you might be interested in reading. Tons of smart folks commented on the post so make sure to read the comments as well:
                              > > >
                              > > > http://bit.ly/9DfLxz
                              > > >
                              > > > Anyway this is a recurring theme in the group so it will be interesting what other people have to offer.
                              > > >
                              > > > Eric T. Peterson
                              > > > Web Analytics Demystified, Inc.
                              > > > http://www.webanalyticsdemystified.com
                              > > >
                              > > >
                              > > >
                              > > >
                              > > >
                              > > >
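As an illustration of the "ecosystem" point in the message above (clickstream and voice-of-customer tools answer more questions together than either does alone), here is a minimal sketch of joining the two data sets. Everything in it is assumed for the example: two hypothetical CSV exports, visits.csv from the clickstream tool and surveys.csv from the survey tool, that happen to share a visitor_id column. Real integrations depend on whatever identifiers the vendors actually expose.

    import csv
    from collections import defaultdict

    # Hypothetical exports; the column names are assumptions, not any vendor's schema.
    # visits.csv:  visitor_id, visit_start, pages_viewed, converted (0 or 1)
    # surveys.csv: visitor_id, satisfaction (1-10), comment

    def load_visits(path):
        visits = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                visits[row["visitor_id"]].append(row)
        return visits

    def join_surveys_to_visits(visits_path, surveys_path):
        visits = load_visits(visits_path)
        joined = []
        with open(surveys_path, newline="") as f:
            for survey in csv.DictReader(f):
                for visit in visits.get(survey["visitor_id"], []):
                    joined.append({
                        "visitor_id": survey["visitor_id"],
                        "satisfaction": int(survey["satisfaction"]),
                        "pages_viewed": int(visit["pages_viewed"]),
                        "converted": visit["converted"] == "1",
                    })
        return joined

    if __name__ == "__main__":
        rows = join_surveys_to_visits("visits.csv", "surveys.csv")
        # The kind of question neither tool answers alone:
        # do dissatisfied respondents view fewer pages before leaving?
        unhappy = [r["pages_viewed"] for r in rows if r["satisfaction"] <= 4]
        happy = [r["pages_viewed"] for r in rows if r["satisfaction"] >= 8]
        if unhappy and happy:
            print("avg pages viewed - unhappy: %.1f, happy: %.1f"
                  % (sum(unhappy) / len(unhappy), sum(happy) / len(happy)))

The join itself is the easy part; agreeing on an identifier that both tools can legitimately capture is the real work.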
                            • brian_clifton_uk
                              Message 14 of 17 , Mar 4, 2010
                                Eric T.

Yes, my example is GA-Urchin specific. QQ: does anyone know whether other vendors offer this, i.e. have both SaaS and server-side software solutions that can work together? I know WebTrends used to.

                                There are two types of clients that have used my combined recommendation:

1) Publishers that require ABCE audits. ABCE only require high-level numbers - visits, unique visitors and pageviews - so a default install of Urchin is all that is needed. The IT department looks after this and it satisfies the audit requirement, while the marketing team focuses on GA. (A rough sketch of this kind of log-level count appears below, after this message.)

                                Why do Marketers prefer to focus on GA?
                                I just published an article on this very subject - http://www.advanced-web-metrics.com/blog/2010/03/04/how-to-choose-between-urchin-or-google-analytics/


2) Some clients simply like a backup of their data. The second set of "books" is rarely looked at, but it is there for the comfort factor. Who knows, Google may go out of business one day or get eaten by a competitor...

                                Best regards, Brian
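To make the audit point above concrete: the high-level numbers an ABCE-style audit asks for can be recomputed from raw server logs, which is also why keeping the logs around makes tagging mistakes recoverable, since you can rerun the count with a corrected filter. The sketch below is illustrative only: it assumes combined-log-format lines, a cookie-free approximation of a unique visitor (IP plus user agent) and a 30-minute session timeout, none of which matches exactly what Urchin or an auditor actually does.

    import re
    from datetime import datetime, timedelta

    LOG_LINE = re.compile(
        r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"')
    SESSION_TIMEOUT = timedelta(minutes=30)
    # Assumed internal addresses to exclude; correcting this set and
    # re-running the count over the same logs is the "reprocess" step.
    INTERNAL_IPS = {"192.168.0.10"}

    def audit_counts(log_path):
        pageviews = 0
        visits = 0
        visitors = set()
        last_hit = {}  # visitor -> time of last counted request
        with open(log_path) as f:
            for line in f:
                m = LOG_LINE.match(line)
                if not m:
                    continue
                ip, ts, method, path, status, agent = m.groups()
                if ip in INTERNAL_IPS or not status.startswith("2"):
                    continue
                if path.endswith((".css", ".js", ".png", ".gif", ".jpg", ".ico")):
                    continue  # count pages, not page assets
                when = datetime.strptime(ts.split()[0], "%d/%b/%Y:%H:%M:%S")
                visitor = (ip, agent)  # crude stand-in for a visitor cookie
                pageviews += 1
                visitors.add(visitor)
                prev = last_hit.get(visitor)
                if prev is None or when - prev > SESSION_TIMEOUT:
                    visits += 1
                last_hit[visitor] = when
        return pageviews, visits, len(visitors)

    if __name__ == "__main__":
        pv, v, uv = audit_counts("access.log")
        print("pageviews=%d visits=%d unique_visitors=%d" % (pv, v, uv))

Numbers counted this way will not match a tag-based tool, which is exactly the thread's point: decide which set of books is authoritative for which purpose.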



                                --- In webanalytics@yahoogroups.com, "eefsafe" <eric.peterson@...> wrote:
                                >
                                > Brian,
                                >
                                > You make two great cases ... for using Urchin software. Why would you also use GA if your business has the reprocessing and auditing requirement? Why not just use Urchin and avoid the litany of issues that arise from keeping "two sets of books?"
                                >
                                > That ABCe thing especially seems to be a pretty strict requirement ... doesn't using GA internally but Urchin for external/audit create reconciliation issues?
                                >
                                > Seems like yet another case of a lack of process and strategy to me.
                                >
                                > Eric T. Peterson
                                > Web Analytics Demystified, Inc.
                                > http://www.webanalyticsdemystified.com
                                >
                                > P.S. It was nice to see Matt again. Apparently he has left the mother-ship and now lives in sunny San Diego managing Omniture's visitor acquisition products. We didn't get to catch up, but Bill Bruno did put the moderator up to asking "is web analytics easy" at which point we realized ... that nothing ever really changes.
                                >
                                > Ask me sometime about how he and I answered that question from the panel. Was pretty amusing from my POV. ;-)
                                >
                                >
                                >
                                >
                                > --- In webanalytics@yahoogroups.com, "brian_clifton_uk" <brian@> wrote:
                                > >
                                > > Wow! Eric Peterson and Matt Belkin in the same room. I would have loved to have seen that ;)
                                > >
                                > > Just to point out two solid reasons for using a second WA tool that collects the same data...
                                > >
> > 1] If you are a GA user, data cannot be reprocessed. Say, for example, you set up an exclude filter and make a mistake by excluding all your visitor traffic - you cannot go to Google and ask for a reprocess. The data is lost. To avoid this, run Urchin software alongside GA and simply reprocess your logfiles when needed (Urchin = server-side web analytics software derived from GA)*
                                > >
                                > > 2] If you are a publisher and require an ABCE audit for proof of advertising rate card purposes, ABCE require the raw visit data. Google do not pass data to *any* third party so that is not going to work. Again use Urchin alongside GA to provide audit data to ABCE or similar.
                                > >
> > Some useful articles from me:
> > What is Urchin 6
                                > > http://www.advanced-web-metrics.com/blog/2010/01/29/what-is-urchin-6/
                                > >
                                > > Hosted v Software v Hybrid tools
                                > > http://www.advanced-web-metrics.com/blog/2007/10/07/hosted-v-software-v-hybrid-tools/
                                > >
                                > > Backup your Google Analytics data and use Urchin
                                > > http://www.advanced-web-metrics.com/blog/2007/10/17/backup-your-ga-data-locally/
                                > >
                                > > *GA is actually derived from Urchin
                                > >
                                > > Best regards, Brian
                                > > Author, Advanced Web Metrics with Google Analytics
                                > > 2nd edition (complete re-write) launched in 2 weeks!
                                > >
                                > >
                                > >
                                > > --- In webanalytics@yahoogroups.com, "Daniel" <dan@> wrote:
                                > > >
                                > > > Eric,
                                > > >
                                > > > Thanks for clearing up my confusing statement. I heard it the way you said it here, but I obviously didn't say it clearly.
                                > > >
                                > > > Your point is well taken and I'm glad you pointed it out. I'll look at the additional sources you pointed to in this post.
                                > > >
                                > > > Dan
                              • uta635
                                Message 15 of 17 , Mar 5, 2010
                                  Hi:

                                  Unica NetInsight has both SaaS and locally installed versions - we use the latter.

                                  Win Hayes
                                  Burlington, CT

                                  --- In webanalytics@yahoogroups.com, "brian_clifton_uk" <brian@...> wrote:
                                  >
                                  > Eric T.
                                  >
                                  > Yes, my example is GA-Urchin specific. QQ: does anyone know if other vendors are able to offer this i.e have both a SaaS and server-side software solutions that can work together? I know WebTrends used to.
                                  >
                                • Craig Scribner
                                  Message 16 of 17 , Mar 5, 2010
Sometimes we analysts enter the field of battle as mercenaries rather than
field generals. Especially as a consultant, I am often asked to pull reports
on sites that were tagged long before I arrived on the scene. Clients don't
like to hear that their questions can't be answered just because someone
else's tagging was broken or sloppy; they tend to see me not as the
messenger but as the culprit. So I have been grateful, in a few situations,
that a secondary set of tags was available to me when my primary set broke
down.



                                    -Craig
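Craig's fallback scenario, and the reconciliation worry raised earlier in the thread, both reduce to the same routine check: compare the two tools' trends instead of hunting for a single "right" number. A minimal sketch, assuming you have already exported daily visit counts from each tool as date-to-count dictionaries (how you get those exports is tool-specific and not shown), with an arbitrary 15% tolerance:

    def compare_daily_visits(primary, secondary, tolerance=0.15):
        """Flag days where the two tools disagree badly or the primary looks broken.

        primary and secondary map ISO dates to visit counts, e.g.
        {"2010-03-01": 10423, ...}. The tolerance is arbitrary; set it to the
        gap you normally observe between your own tools.
        """
        alerts = []
        for day in sorted(set(primary) & set(secondary)):
            p, s = primary[day], secondary[day]
            if s > 0 and p < 0.1 * s:
                alerts.append((day, "primary near zero - tagging probably broken"))
            elif s > 0 and abs(p - s) / s > tolerance:
                gap = 100.0 * abs(p - s) / s
                alerts.append((day, "gap of %.0f%% - investigate, but do not expect 0%%" % gap))
        return alerts

    if __name__ == "__main__":
        primary = {"2010-03-01": 9800, "2010-03-02": 120, "2010-03-03": 10100}
        secondary = {"2010-03-01": 10400, "2010-03-02": 10150, "2010-03-03": 10900}
        for day, message in compare_daily_visits(primary, secondary):
            print(day, message)

A check like this keeps the second set of books in its proper role: a smoke alarm, not a second source of truth.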



                                  • Tim Leighton-Boyce
                                    Message 17 of 17 , Mar 6, 2010
On 4 March 2010 13:29, brian_clifton_uk <brian@...> wrote:

                                      >
                                      > 2) Some clients simply like a backup of their data. The second set of
                                      > "books" are rarely looked at, but are there for the comfort factor. Who
                                      > knows, Google may go out of business one day or get eaten by a competitor...
                                      >

Another reason for the 'GA + something else for backup' was put forward in a real meeting with a client last week:

                                      It's slightly too easy in GA for anyone with admin access to accidentally or
                                      maliciously delete one or all profiles. I don't know whether they were
                                      paranoid or had direct experience of the 'malicious' aspect!

                                      Tim

