Re: Reconciling Statistics
- What does Adobe support have to say about the issue?
Before spending any internal resources or time investigating an issue with vendor data that appeared after a point release, I suggest asking the vendor. Put simply, in the context of web analytics this is the "easy" answer. If you have already asked the vendor, do post your synthesis and perception of the vendor's response.
The "complex" answer, in the context of web analytics: if you, your company, or your team did not change anything upstream or downstream of the tool in data collection or tool configuration, the point release is the likely culprit (obviously), and the onus should be on the vendor to explain it and offer you a fix. Vendors sometimes have issues with point releases because they fail to test all use cases adequately and fully. Or perhaps you have implemented/applied the technology in a funky way that no longer works. Good luck with support in most cases if you spend small change or have no budget, as professional services may be required, and will almost certainly be suggested, at a cost. Who replaced Ben Gaines anyway - shouldn't they be responding? :-)
The "hard" answer, in the context of web analytics: Jason has good advice, and the technologists on this list who implement and configure tools always have good ideas for data investigations into the innards of tools.
In the spirit of Occam's razor (the concept), ask Adobe first, but before doing so determine whether any other metrics have also moved in unusual directions... and investigate carefully whether it could be you, not the vendor. :-)
Sent from my iPhone
- Judah -
Ben Gaines replaced Ben Gaines! http://www.bengaines.com/2012/03/faq/
--- In email@example.com, JudahPhillips <judahphillips@...> wrote:
> Who replaced Ben Gaines anyway - shouldn't they be responding? :-)
> Sent from my iPhone
- This was good advice from Jason. The only thing I would suggest is that his last paragraph should have gone first. I have seen so much time wasted on exercises like this. Use one tool, implement and use it correctly, and stop wasting time trying to get data to match Google.
Just my two cents.
--- In firstname.lastname@example.org, Jason Palmer <jasonlpalmer@...> wrote:
> There are so many things that can cause the numbers to be different. You
> will want to take one stat at a time as you try to reconcile. Start
> with pageviews. In theory they should be the same (equivalent to the
> number of tags fired). First check whether everything is tagged the
> same; typically it is not. For example, if you have a form tagged to
> examine abandonment as someone works through it, that may
> increase your pageview count depending on how it is tagged. The same goes
> for tagged videos, rich media content, etc. If you increased these types
> of tags in one measurement tool but not the other, that will show an
> increase in one and not the other.
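To make the "one stat at a time" suggestion above concrete, here is a minimal sketch of a day-by-day pageview comparison. The CSV layout (`date` and `pageviews` columns), the file names, and the 10% tolerance are all illustrative assumptions, not anything either vendor prescribes:

```python
import csv

def load_daily_counts(path):
    """Read a CSV export with 'date' and 'pageviews' columns into a dict."""
    with open(path, newline="") as f:
        return {row["date"]: int(row["pageviews"]) for row in csv.DictReader(f)}

def flag_divergence(a, b, tolerance=0.10):
    """Yield (date, count_a, count_b, pct_diff) for days present in both
    exports where the counts differ by more than the tolerance."""
    for date in sorted(set(a) & set(b)):
        base = max(a[date], b[date])
        diff = abs(a[date] - b[date]) / base if base else 0.0
        if diff > tolerance:
            yield date, a[date], b[date], diff

# Hypothetical usage with two exports:
# sc = load_daily_counts("sitecatalyst_pageviews.csv")
# ga = load_daily_counts("ga_pageviews.csv")
# for date, x, y, pct in flag_divergence(sc, ga):
#     print(f"{date}: tool A={x} tool B={y} ({pct:.0%} apart)")
```

Flagged days then point you at the tagging differences Jason describes (form steps, videos, rich media) rather than at the totals.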
> That being said, the next thing to look at is Visits and Visitors. Make sure
> that your session timeout and other settings are the same; these numbers
> should be closer when comparing the same time periods.
> However, each solution will have different default filters and custom
> filters for removing traffic. For example, you can filter out internal
> traffic from your own company to the website. This can cause discrepancies
> between the two solutions, as can filtering all bot traffic from your
> reports, which most solutions do by default. Each vendor maintains a
> different filter list, so these numbers will never be the same.
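The filtering described above can be sketched like this. The internal IP prefix and the bot substrings are placeholders; real vendor bot lists are far larger and differ from one another, which is exactly why the filtered numbers diverge:

```python
INTERNAL_PREFIX = "10.1."  # hypothetical internal office subnet
BOT_SUBSTRINGS = ("bot", "spider", "crawler")  # tiny stand-in for a vendor bot list

def keep_hit(ip, user_agent):
    """Return True if the hit should be counted; drop internal and bot traffic."""
    if ip.startswith(INTERNAL_PREFIX):
        return False
    ua = user_agent.lower()
    return not any(s in ua for s in BOT_SUBSTRINGS)

hits = [
    ("10.1.2.3", "Mozilla/5.0"),         # internal traffic: filtered
    ("93.184.216.34", "Googlebot/2.1"),  # bot: filtered
    ("93.184.216.34", "Mozilla/5.0"),    # real visitor: kept
]
print(sum(keep_hit(ip, ua) for ip, ua in hits))
```

Two tools applying two different versions of this logic to the same raw traffic will report different totals, permanently.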
> This would explain some of the differences, but my recommendation is to
> focus on using one solution, not two. If the reason you are running both
> is to use one to audit the other, it will be wasted effort. Spend your
> time making sure one is set up properly, everything is tagged, and that
> you understand how it measures. If you have spent the money on
> SiteCatalyst, just focus on making it measure what you want.