
Tool discrepancy - Troubleshoot process

  • jason6346
    Message 1 of 14, Apr 26, 2012
      Hi there,

      Perhaps a question that has been asked many times in this forum, and probably a common challenge across the analytics industry full stop, so please bear with me.

      We are currently experiencing issues where our two web analytics solutions are telling us two different things. Whilst I am aware it is unlikely for two tools to record the same numbers, I do expect the numbers to at least tell the same story. We are using both SiteCatalyst and Google Analytics and have found that whilst they trend the same, they actually tell us different year-on-year performances: Google shows our traffic is up YoY, whereas SiteCatalyst shows we are down YoY.

      We have used a WASP site audit, but strangely enough it has identified our site as having fewer GA tags than SiteCatalyst tags, which is the opposite of what our numbers are telling us. With SiteCatalyst showing us down, we'd have expected pages to be missing more SiteCatalyst tags than GA tags.

      What I would like to know is whether there is a clear process for troubleshooting this. Has anyone else experienced this, and how did you rectify it?

      Your help will be very much appreciated.

      Thanks

      J
    • jason6346
      Message 2 of 14, May 1, 2012
        Just wondered if anyone had any answers to share on this one?

        Thanks

        J

      • VaBeachKevin
        Message 3 of 14, May 1, 2012
          There are tons of reasons why they could be different.

          Are you using first party cookies with SiteCatalyst?
          Are you doing any custom link tracking with either tool?
          Are both tools implemented in the same section of the page?
          Does the site have any AJAX, popups, or iframed elements?
          Are you tracking video?
          Are you tracking any Flash content?
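A checklist like this can be partly spot-checked mechanically. Below is a minimal sketch of a tag-presence check over a page's HTML; the signature strings are illustrative guesses based on typical 2012-era snippets (s_code.js, _gaq), not an exhaustive audit:

```python
import re

# Illustrative signature patterns only -- real deployments may rename
# or bundle these files, so treat misses as "worth a manual look".
TRACKER_SIGNATURES = {
    "SiteCatalyst": [r"s_code\.js", r"s_account", r"\bs\.t\(\)"],
    "Google Analytics": [r"ga\.js", r"_gaq\.push", r"_trackPageview"],
}

def detect_trackers(html):
    """Return the set of tracker names whose signatures appear in the HTML."""
    found = set()
    for name, patterns in TRACKER_SIGNATURES.items():
        if any(re.search(p, html) for p in patterns):
            found.add(name)
    return found
```

Running this over a crawl of the site and diffing the two sets per page gives a quick cross-check on what WASP reports.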


          Kevin Rogers
          http://webanalyticsland.com
          http://keystonesolutions.com


        • Moczygemba, Mollie
          Message 4 of 14, May 1, 2012
            Hi Jason,

            My guess would be that this is due to Google being a sample-based system if you have over 500,000 visits.

            -Mollie
          • Stephane Hamel
            Message 5 of 14, May 1, 2012
              Being the mad mind behind WASP, I can tell you conducting an audit isn't an easy business. You can use WASP or other tools to get a sense of those discrepancies, but at the end of the day a real independent audit might be the best approach, for many reasons: deep knowledge and experience, keeping the practitioner outside of political risks, an unbiased assessment, etc.

              Audits are one of the services Cardinal Path offers. You can certainly seek help from other agencies too.

              Stéphane Hamel, MBA, CWA
              Director, Strategic Services
              LinkedIn: linkedin.com/in/shamel
              Twitter: @SHamelCP | @CardinalPath
              www.cardinalpath.com

            • Wandering Dave Rhee
              Message 6 of 14, May 1, 2012
                Hi, Jason,

                Unfortunately, no, there is no clear process to troubleshoot discrepancies
                between two vendor tools. However, there are some things you might check,
                that can give you some clues. Every situation is unique, but at least you
                can start by ruling out a few things.

                The tag audit you did is a great idea, and definitely the first thing to
                check. Were any of the untagged pages very likely to have high volumes,
                particularly of bounced visits (e.g., landing pages that would increment a
                unique visitor count for one tool, but not the other)?

                Can you segment the site such that you are looking only at certain subsets
                of it, and compare that between tools? For example, only a brand site,
                omitting the ecommerce site. Or only the ecommerce site -- and in that
                case, see which tool correlates best with your back-end financials, for a
                third tool tiebreaker.

                Otherwise, try narrowing down the trend duration, from year-to-year, down
                to month-to-month, and see if the tools agree, at least directionally.

                In both of the above cases, the idea is to narrow the scope until you see
                some agreement, and then widen it back up until the discrepancy re-emerges.
                You might do the same with certain types of traffic -- for example, look
                only at traffic referred by a certain source, and if it matches, then
                gradually add in more sources until you see a discrepancy.
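The narrow-then-widen approach can be sketched as a simple loop. A minimal illustration on made-up daily visit counts; the 10% tolerance is an arbitrary choice, not a standard:

```python
def first_divergent_window(tool_a, tool_b, tolerance=0.10):
    """Grow the comparison window one day at a time and return the first
    window size (in days) where the two tools' totals differ by more than
    `tolerance` (as a fraction of the larger total), or None if they never do."""
    for window in range(1, len(tool_a) + 1):
        total_a = sum(tool_a[:window])
        total_b = sum(tool_b[:window])
        if abs(total_a - total_b) / max(total_a, total_b) > tolerance:
            return window
    return None
```

Once the divergent window is found, drilling into that specific day or source usually points at the tagging or configuration difference responsible.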

                Of course, one big issue, depending on your volume, will be sampling. Part
                of the value of reducing scope is that you are more likely to be looking at
                entire data sets, whereas if you look at an entire year's worth of traffic,
                your tool may instead give you a sample that is theoretically valid, but in
                practice, garbage. (I'm making this as a general comment, not about any of
                the specific tools you might be using.) If you find that whole data gives
                you valid results, and sampling does not, then you will have to decide if
                the additional costs (not just money, but going back and re-working some or
                all of your historic reporting and analysis) are worth the benefits. I
                hope it's clear what I would recommend.

                There are other issues, but hopefully you have already eliminated them.
                Such as one tool running javascript in the header, and the other tool in
                the footer. If you have a user population that aborts page loads part-way
                through, you will see the first tool register much more traffic than the
                second. Or if you have different tagging methods between the tools, where
                only one of them might be blocked (e.g., a mobile OS disabling selectively).

                If page views are in sync between the tools, but not unique visitors, then
                look at the specific methodology by which deduplication is done. If visits
                are an issue, then check that your session timeouts are the same for each
                tool. If you have weird things happening across day or week boundaries,
                make sure both tools are set to record in the same time zone, and that the
                way they record a session that spans two days (midnight) or two weeks is
                handled identically. (Of course, those last couple won't matter over a
                whole year, but you ought to know what the settings are anyway.)
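To see why the session timeout setting alone can shift visit counts, here is a small sketch (entirely made-up hit timestamps) that sessionizes the same hit stream with two different inactivity timeouts:

```python
def count_visits(hit_times_minutes, timeout_minutes):
    """Count sessions in a sorted list of hit timestamps (in minutes),
    starting a new session whenever the gap since the previous hit
    exceeds the inactivity timeout."""
    visits = 0
    last = None
    for t in hit_times_minutes:
        if last is None or t - last > timeout_minutes:
            visits += 1
        last = t
    return visits
```

The same five hits can register as one visit or three depending purely on the timeout, so any visit-level comparison between tools should start by confirming this setting matches.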

                There's more you can check, but that's the basic approach I would take.
                Reduce your data set until you have reasonably good correlation, then
                expand again until you don't. If you need additional help, I'm sure any
                one of a hundred consultants here would be glad to assist. ;-)

                Do us a favor, though, and check back in and let us know if you ever get
                this resolved! The next person will thank you profusely.

                WDave Rhee
                (Moderator here, but posting these just as my personal opinions)

              • Tim Leighton-Boyce
                Message 7 of 14, May 1, 2012
                  What a remarkably generous reply. That seems to cover it all.

                  +1 to the key tip about starting with a very small date range (the day before yesterday...) and then expanding it.

                  Tim



                  --
                  From my phone - please forgive typos!


                • Keith MacDonald
                  Message 8 of 14, May 1, 2012
                    Hi Jason,

                    I've certainly run into similar agreement problems between SiteCat and GA. The advice from Kevin, Stéphane and Dave is excellent and I'd echo everything they've said.

                    Perhaps easily overlooked: are the pages pushing data into your report suite (defined by RSID) the same as the pages pushing data into your GA account? Looking from the other direction, are all of your tagged pages pushing data to the same RSID and GA account?

                    (I've seen a case where our report suite was collecting from 4 different domains while Google was set up with a separate account per domain, despite being labelled the same.)
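This RSID/account check can be partly automated. A hypothetical sketch: the regexes assume typical 2012-era tag shapes (an `s_account` variable and a `UA-XXXXX-Y` account ID) and are illustrative only:

```python
import re

def extract_ids(html):
    """Pull the SiteCatalyst report suite ID and the GA account ID
    out of a page's HTML; returns (rsid, ua_id), with None for misses."""
    rsid = re.search(r's_account\s*=\s*["\']([^"\']+)["\']', html)
    ua = re.search(r'(UA-\d+-\d+)', html)
    return (rsid.group(1) if rsid else None,
            ua.group(1) if ua else None)

def audit_pages(pages, expected_rsid, expected_ua):
    """Given {url: html}, return the URLs whose tags point at an
    unexpected RSID or GA account."""
    return [url for url, html in pages.items()
            if extract_ids(html) != (expected_rsid, expected_ua)]
```

Any URL this flags is a page whose data is landing in a different report suite or GA account than the one being compared, which produces exactly the kind of diverging trends described here.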

                    Look for reasons the JavaScript might not execute 100% of the time. It's possible that, without JavaScript support in the browser for example, SiteCat will still count something (page view, visit, etc.) while Google will not. (That said, there really shouldn't be a large discrepancy caused by JS support unless you have a large proportion of mobile traffic, and even then...)

                    You could be seeing filtering problems after the data is received. For GA this would be profiles (make sure you're looking at an unfiltered profile). It's a little trickier with SiteCat - you may have VISTA rules running and/or data exclusions by IP address that don't align with GA.

                    (Quick example: your positive YOY trend in GA could be a reflection of increasing activity from your staff while your SiteCat declining YOY trend is a reflection of your customers with "internal" traffic filtered out - hopefully that's not the case!)

                    I can't recall who, but someone at last year's eMetrics conference in Toronto claimed that, after a *lot* of effort, they'd got SiteCat and GA within 2% of each other. That's probably a best-case scenario, but regardless of the actual discrepancy, the trends should align.

                    Keep at it, and please do keep us posted!

                    Keith

                    Keith MacDonald
                    http://unilytics.com
                    @keithmacd


                  • Julien Coquet
                    Message 9 of 14, May 1, 2012
                      Tim,

                      Dave's answers are *always* generous :-)

                      Julien Coquet


                      Le 1 mai 2012 à 21:19, Tim Leighton-Boyce <tim.lboyce@...> a écrit :



                      What a remarkably generous reply. That seems to cover it all.

                      +1 to the key tip about starting with a v small date range (the day before
                      yesterday...) and then expanding it.

                      Tim

                      On Tuesday, May 1, 2012, Wandering Dave Rhee wrote:

                      > **
                      >
                      >
                      > Hi, Jason,
                      >
                      > Unfortunately, no, there is no clear process to troubleshoot discrepancies
                      > between two vendor tools. However, there are some things you might check,
                      > that can give you some clues. Every situation is unique, but at least you
                      > can start by ruling out a few things.
                      >
                      > The tag audit you did is a great idea, and definitely the first thing to
                      > check. Were any of the untagged pages very likely to have high volumes,
                      > particularly of bounced visits (e.g., landing pages that would increment a
                      > unique visitor count for one tool, but not the other)?
                      >
                      > Can you segment the site such that you are looking only at certain subsets
                      > of it, and compare that between tools? For example, only a brand site,
                      > omitting the ecommerce site. Or only the ecommerce site -- and in that
                      > case, see which tool correlates best with your back-end financials, for a
                      > third tool tiebreaker.
                      >
                      > Otherwise, try narrowing down the trend duration, from year-to-year, down
                      > to month-to-month, and see if the tools agree, at least directionally.
                      >
                      > In both of the above cases, the idea is to narrow the scope until you see
                      > some agreement, and then widen it back up until the discrepancy re-emerges.
                      > You might do the same with certain types of traffic -- for example, look
                      > only at traffic referred by a certain source, and if it matches, then
                      > gradually add in more sources until you see a discrepancy.
                      >
                      > Of course, one big issue, depending on your volume, will be sampling. Part
                      > of the value of reducing scope is that you are more likely to be looking at
                      > entire data sets, whereas if you look at an entire year's worth of traffic,
                      > your tool may instead give you a sample that is theoretically valid, but in
                      > practice, garbage. (I'm making this as a general comment, not about any of
                      > the specific tools you might be using.) If you find that whole data gives
                      > you valid results, and sampling does not, then you will have to decide if
                      > the additional costs (not just money, but going back and re-working some or
                      > all of your historic reporting and analysis) are worth the benefits. I
                      > hope it's clear what I would recommend.
                      >
                      > There are other issues, but hopefully you have already eliminated them.
                      > Such as one tool running javascript in the header, and the other tool in
                      > the footer. If you have a user population that aborts page loads part-way
                      > through, you will see the first tool register much more traffic than the
                      > second. Or if you have different tagging methods between the tools, where
                      > only one of them might be blocked (e.g., a mobile OS disabling
                      > selectively).
                      >
                      > If page views are in sync between the tools, but not unique visitors, then
                      > look at the specific methodology by which deduplication is done. If visits
                      > are an issue, then check that your session timeouts are the same for each
                      > tool. If you have weird things happening across day or week boundaries,
                      > make sure both tools are set to record in the same time zone, and that the
                      > way they record a session that spans two days (midnight) or two weeks is
                      > handled identically. (Of course, those last couple won't matter over a
                      > whole year, but you ought to know what the settings are anyway.)
                      >
                      > There's more you can check, but that's the basic approach I would take.
                      > Reduce your data set until you have reasonably good correlation, then
                      > expand again until you don't. If you need additional help, I'm sure any
                      > one of a hundred consultants here would be glad to assist. ;-)
                      >
                      > Do us a favor, though, and check back in and let us know if you ever get
                      > this resolved! The next person will thank you profusely.
                      >
                      > WDave Rhee
                      > (Moderator here, but posting these just as my personal opinions)
                      >
                      > On Tue, May 1, 2012 at 6:23 PM, jason6346 <darkus1@...>
                      > wrote:
                      >
                      > > **
                      > >
                      > >
                      > > Just wondered if anyone had any answers to share on this one?
                      > >
                      > > Thanks
                      > >
                      > > J
                      > >
                      > >
                      > > --- In webanalytics@yahoogroups.com, "jason6346" <darkus1@...> wrote:
                      > > >
                      > > > Hi there,
                      > > >
                      > > > Perhaps a question that has been asked many times within this forum
                      > > > and probably a popular challenge within the Analytics industry full stop,
                      > > > so please bear with me.
                      > > >
                      > > > We are currently experiencing issues where our two web analytics
                      > > > solutions are telling us two different things. Whilst I am aware it is
                      > > > unlikely for two tools to record the same numbers, I do expect the numbers
                      > > > to at least tell the same story. We are using both SiteCatalyst and Google
                      > > > Analytics and have found that whilst they trend the same, they actually
                      > > > tell us different Year on Year performances. Google shows our traffic is
                      > > > up YoY, whereas SiteCatalyst shows we are down YoY.
                      > > >
                      > > > We have used the WASP site audit but strangely enough it has identified
                      > > > our site as having fewer GA tags than SiteCatalyst tags, which is the
                      > > > opposite of what our numbers are telling us. With SiteCatalyst showing us
                      > > > down, we'd have expected pages to be missing more SiteCatalyst tags than
                      > > > GA tags.
                      > > >
                      > > > What I would like to know is whether there is a clear process for how to
                      > > > troubleshoot this? Has anyone else experienced this, and how did you
                      > > > rectify it?
                      > > >
                      > > > Your help will be very much appreciated.
                      > > >
                      > > > Thanks
                      > > >
                      > > > J
                      > > >
                      > >
                      > >
                      > >
                      >
                      >
                      >
                      >

                      --
                      From my phone - please forgive typos!

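The narrow-then-widen approach Dave describes above lends itself to a quick comparison script. This is a minimal sketch, assuming daily visit counts have been exported from each tool; the dicts and the 15% threshold below are illustrative stand-ins, not real figures or a standard tolerance:

```python
# Sketch: flag the days on which two tools' daily visit counts diverge.
# The figures below are made-up stand-ins for CSV exports from each tool.
ga_visits = {"2012-04-01": 1000, "2012-04-02": 1040, "2012-04-03": 1900}
sc_visits = {"2012-04-01": 980, "2012-04-02": 1015, "2012-04-03": 1250}

def divergent_days(a, b, threshold=0.15):
    """Return (day, relative difference) for days exceeding the threshold."""
    flagged = []
    for day in sorted(set(a) & set(b)):
        diff = abs(a[day] - b[day]) / max(a[day], b[day])
        if diff > threshold:
            flagged.append((day, round(diff, 3)))
    return flagged

# Start with a short date range, then widen it until the discrepancy re-emerges.
print(divergent_days(ga_visits, sc_visits))  # -> [('2012-04-03', 0.342)]
```

If day-level totals agree and only visit or visitor counts differ, the same comparison can be repeated per traffic source or per site section, as suggested in the reply above.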
                    • Wandering Dave Rhee
                      Heh -- I take that as a polite way of saying I don't know when to stop talking. ;-) @All -- Julien and I have worked together on a few projects, including
                      Message 10 of 14 , May 1, 2012
                      • 0 Attachment
                        Heh -- I take that as a polite way of saying I don't know when to stop
                        talking. ;-)

                        @All -- Julien and I have worked together on a few projects, including some
                        pro bono work for the Michael J. Fox Foundation, and I can testify that his
                        answers are always equally generous! <heh heh>

                        WDave

                      • Stephane Hamel
                        Maybe this older post will help: http://blog.immeria.net/2010/02/testing-web-analytics-implementation.html as well as this one, which also includes links to a
                        Message 11 of 14 , May 1, 2012
                        • 0 Attachment
                          Maybe this older post will help: http://blog.immeria.net/2010/02/testing-web-analytics-implementation.html

                          as well as this one, which also includes links to a bunch of other resources: http://blog.immeria.net/2009/01/quality-assurance-of-web-analytics-tags.html

                        • jason6346
                          Thank you all very much for your responses, they are all very helpful. I'll let you know how I get on. Jason.
                          Message 12 of 14 , May 2, 2012
                          • 0 Attachment
                            Thank you all very much for your responses, they are all very helpful.
                            I'll let you know how I get on.

                            Jason.

                          • Béate Vervaecke | e-Zen
                            Google has changed the way it calculates sessions on August 11 of 2011. When a traffic source (on keyword-level) changes during the session, the old session is
                            Message 13 of 14 , May 5, 2012
                            • 0 Attachment
                              Google changed the way it calculates sessions on August 11, 2011.

                              When a traffic source (at the keyword level) changes during the session, the
                              old session is closed and a new session starts, hence some see more visits.

                              https://developers.google.com/analytics/community/gajs_changelog#release-2011-08

                              Sites that receive intensive search traffic can be affected by this change.
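Béate's point can be illustrated with a toy model of the two counting rules. This is a simplified sketch of the behaviour described in the changelog, not Google's actual code; the timestamps are minutes since the visit started and the hit data is invented:

```python
# Toy model: hits are (minutes_since_visit_start, traffic_source) pairs.
# Old rule: a new visit starts only after a 30-minute gap.
# Post-Aug-2011 rule: a mid-session change of source also starts a new visit.
hits = [(0, "google:shoes"), (5, "google:red shoes"), (8, "google:red shoes")]

def count_visits(hits, timeout=30, split_on_source_change=False):
    """Count visits under either session rule."""
    visits = 0
    last_time = last_source = None
    for t, source in hits:
        if (last_time is None
                or t - last_time > timeout
                or (split_on_source_change and source != last_source)):
            visits += 1
        last_time, last_source = t, source
    return visits

print(count_visits(hits))                               # old rule: 1 visit
print(count_visits(hits, split_on_source_change=True))  # new rule: 2 visits
```

The same browsing behaviour yields more visits under the newer rule, which is one way a GA year-on-year trend can drift upward relative to a tool that did not change its session definition.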



                              Béate Vervaecke



                              From: webanalytics@yahoogroups.com [mailto:webanalytics@yahoogroups.com] On
                              Behalf Of jason6346
                              Sent: Thursday 26 April 2012 18:17
                              To: webanalytics@yahoogroups.com
                              Subject: [webanalytics] Tool discrepancy - Troubleshoot process





                            • Damien
                              Hi Jason, Whoa! Lots of great advice from our industry peers. I have always heard it is good practice to expect up to 10% difference in data recorded between
                              Message 14 of 14 , May 5, 2012
                              • 0 Attachment
                                Hi Jason,

                                Whoa! Lots of great advice from our industry peers. I have always heard it
                                is good practice to expect up to 10% difference in data recorded between
                                the different analytic systems. Sage advice to be mindful of the different
                                de-duplication methodologies at play, or changes to those from time to
                                time, when looking at the visit or visitor containers.

                                Certainly, any filtering or other rules-based inclusion or exclusion of data
                                from systems is easily overlooked. I have pulled my hair out in the past
                                trying to understand system differences, only to find the simplest answer
                                held most of the reward.

                                One thing I like to do is look at the webserver logs (if it is web traffic
                                you are looking at) to see which system correlates at the page view level
                                best. It is much trickier to do this if you want to get to visits and
                                visitors, as webserver log file de-duplication is even trickier than in
                                .js-based analytics applications!
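Damien's log-file cross-check can be roughed out as below. A minimal sketch assuming an NCSA combined-format access log; the regex, the asset filter, and the sample lines are illustrative assumptions to adapt (real logs also need bot filtering, redirect handling, and so on):

```python
import re
from collections import Counter

# Minimal NCSA combined-log pattern: host, identd, user, [time], "request", status.
LINE = re.compile(r'\S+ \S+ \S+ \[(\d+/\w+/\d+):[^\]]+\] "GET (\S+) [^"]*" (\d+)')

def daily_page_views(lines):
    """Count successful HTML GET requests per day, a rough proxy for page views."""
    counts = Counter()
    for line in lines:
        m = LINE.match(line)
        if not m:
            continue  # skip POSTs, malformed lines, etc.
        day, path, status = m.groups()
        if status != "200":
            continue
        if re.search(r'\.(css|js|png|gif|jpg|ico)$', path):
            continue  # static assets are not page views
        counts[day] += 1
    return counts

sample = [
    '1.2.3.4 - - [26/Apr/2012:18:17:01 +0100] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [26/Apr/2012:18:17:02 +0100] "GET /style.css HTTP/1.1" 200 800 "-" "Mozilla/5.0"',
]
print(daily_page_views(sample))  # -> Counter({'26/Apr/2012': 1})
```

Comparing these daily counts against each tool's page view report gives the page-view-level correlation Damien describes, without attempting the much harder visit/visitor de-duplication.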

                                Good luck and look forward to seeing how you get on.

                                Damien
                                ---
                                *Damien Anderson*
                                e: damien@..., w: www.echwa.com, m: +44 (0) 773 819 9357



