
RE: [webanalytics] A question about when large portals report their uniques?

  • Narong, Jon
    Message 1 of 15, May 1, 2007
      Josh,

      I think there are a lot of variables, depending on your particular
      audience segment profiles, which could skew these results. For instance,
      are there a lot of students that could be logging in from a shared
      library computer? At the other end of the spectrum, if you have a more
      mature demographic (60+), odds are that visitors are not logging in from
      multiple computers.

      I don't think there's a general rule to calculate unique eyeballs from
      unique visitors.

      Jon

      ________________________________

      From: webanalytics@yahoogroups.com [mailto:webanalytics@yahoogroups.com]
      On Behalf Of Josh Chasin
      Sent: Monday, April 30, 2007 5:28 PM
      To: webanalytics@yahoogroups.com
      Subject: Re: [webanalytics] A question about when large portals report
      their uniques?



      Since Unique Visitors is a way to get at reach, how do folks here think
      that Unique Visitors translates into unique persons, given the recent
      IAB/comScore/NetRatings hubbub? E.g., cookie deletion, individual
      persons using multiple computers, etc.

      --josh--
      (New here)

      Joshua Chasin
      Principal
      Warp Speed Marketing, Inc.
      345 East 81st Street 14K
      New York, NY 10028
      jchasin@...
      office: 212.517.8917
      cell: 646.623.1201
      Yahoo IM: joshchasin

      ----- Original Message -----
      From: Shorful Islam
      To: webanalytics@yahoogroups.com
      Sent: Monday, April 30, 2007 5:08 PM
      Subject: Re: [webanalytics] A question about when large portals report
      their uniques?

      We refer to unique visitors; it demonstrates our reach.

      regards

      Shorful

      ----- Original Message -----
      From: nicolerawski
      To: webanalytics@yahoogroups.com
      Sent: Monday, April 30, 2007 5:23 PM
      Subject: [webanalytics] A question about when large portals report their
      uniques?

      When large portals report that they have XYZ# of uniques, are they
      referring to unique visits to the site or unique visitors to the site?

      The two numbers are very different and I understand the difference, but
      I would like to know what the industry standard is when someone refers
      to their "uniques."

      Thank you,

      Nicole

    • Jon Whitehead
      Message 2 of 15, May 2, 2007
        Hi all

        Does anyone have any recommendations on content being archived/deleted
        based on a minimum usage, say for example only 100 unique visitors to
        a section over 6 months, where the overall site has received over
        100,000 uniques in that time?

        Obviously when content has not been accessed at all over a period
        then it should be discarded/rewritten/rethought, but I'm not sure
        where to draw a line when the content is visited but at a low level.

        cheers

        Jon Whitehead
      • Steve
        Message 3 of 15, May 6, 2007
          It's been a while with no (apparent?) answer, so I'll have a shot at it for you.

          On 5/3/07, Jon Whitehead <jonnywhitehead@...> wrote:
          > does anyone have any recommendations on content being
          > archived/deleted
          > based on a minimum usage, say for example only 100 unique visitors to
          > a section over 6 months, where the overall site has recieved over
          > 100,000 uniques in that time?
          >
          > Obviously when content has not been accessed at all over a period
          > then it should be discarded/rewritten/rethought, but I'm not sure
          > where to draw a line when the content is visited but at a low level.


          As with everything there is a balance.
          * Cost of maintaining old data - ranges from significant to beyond
          trivial; it's generally higher than most people realise, though.
          * Relevance of old data
          * Archive vs Delete (as you mention)
          * Value of old data. Is it complementary to the rest of the site, or
          jarring, to hit the extremes?
          * Is the info duplicated elsewhere?
          * Transient in nature (eg news)
          * Incoming links?
          * Search Engine Results?
          * You mention a ratio of 0.1%. Are they "High Value" users? If so,
          perhaps the data should be updated and given greater relevance to the
          rest of the site?


          The summary point being that visitor numbers alone, IMHO, shouldn't be
          the sole determinant of what should/shouldn't go.
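
          A purely illustrative Python sketch of weighing several of these
          signals together before retiring a page; every threshold and field
          name below is a hypothetical example, not a recommendation from the
          thread:

          # Purely illustrative sketch: combine several signals before flagging a
          # page for archiving, rather than cutting on unique-visitor counts alone.
          # Every threshold and field name here is a hypothetical example.

          def retirement_candidate(page, site_uniques):
              """Return True if a page looks like a candidate for archiving or merging."""
              traffic_share = page["uniques"] / site_uniques  # e.g. 100 / 100_000 = 0.1%

              signals = [
                  traffic_share <= 0.001,           # at or below ~0.1% of site uniques
                  page["incoming_links"] == 0,      # nothing else links to it
                  page["search_entries"] == 0,      # never an organic landing page
                  not page["high_value_visitors"],  # not serving a small but valuable segment
              ]
              # Only flag the page when most signals line up; anything flagged still
              # gets a human decision on archive vs delete vs rewrite.
              return sum(signals) >= 3

          page = {"uniques": 100, "incoming_links": 0, "search_entries": 0,
                  "high_value_visitors": False}
          print(retirement_candidate(page, 100_000))  # True: all four signals fire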

          To give a live example, the content team at work semi-automatically
          deletes old news items quite regularly. They are effectively archived
          into the database, but are no longer active. No Analytics numbers are
          checked to make this call.

          Old pages are either merged with more relevant ones, or eventually
          archived. This is done without any reference to Analytics in the first
          instance. We prefer to operate reactively on that - if we see largish
          numbers hit the old page we'll put in a redirect as appropriate. I
          don't think I've ever seen more than 100 users over a week hit a
          dumped page, which for us is a beyond-trivial number. Ergo we don't
          expend precious resources fixing problems that don't exist.


          HTH?

          Cheers!

          - Steve
        • Debbie Pascoe
          Message 4 of 15, May 7, 2007
            Tagging onto Steve's comments, you may have pages that are getting NO
            traffic - those pages would be ideal candidates for decommissioning
            and archiving, as they are costing $$ while delivering no value in
            return.

            Debbie Pascoe
            MAXAMINE, Inc.

            > As with everything there is a balance.
            > * Cost of maintaining old data - ranges from significant to beyond
            > trivial. Is generally higher than most people realise tho.
            > * Relevance of old data
            > * Archive vs Delete (as you mention)
            > * Value of old data. Is it complimentary to the rest of the site, or
            > jarring; to hit the extremes.
            > * Is the info duplicated elsewhere?
            > * Transient in nature (eg news)
            > * Incoming links?
            > * Search Engine Results?
            > * You mention a ratio of 0.1%. Are they "High Value" users? If so,
            > perhaps the data should be updated and given greater relevance to the
            > rest of the site?
            >
            >
            > The summary point being that visitor numbers, IMHO, alone shouldn't
            be the sole determinate in what should/shouldn't go....
          • Paul Holstein
            Message 5 of 15, May 8, 2007
              Debbie,

              That's a great point. Do you know of a way to get a report of pages
              with no traffic? As far as I can tell, Omniture doesn't do it.

              The only way I would think to do it would be to export a list of all
              my URLs (from my own crawl) and then compare it to a list of all
              trafficked URLs. The difficult part may be to parse out the URLs to
              de-dupe them (because of tracking codes and system parameters).
              Anyone have any other ideas?

              -- Paul
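
              A rough Python sketch of the comparison described above, assuming
              the crawl and the trafficked URLs are exported to plain-text
              files; the tracking-parameter names and file names are
              placeholders, not anything specific to Omniture:

              # Rough sketch of the crawl-vs-traffic comparison; the tracking
              # parameter names and file names are placeholders.
              from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

              TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "cid"}

              def normalize(url):
                  """Strip tracking/system parameters and fragments; lowercase the host."""
                  parts = urlsplit(url.strip())
                  query = [(k, v) for k, v in parse_qsl(parts.query)
                           if k.lower() not in TRACKING_PARAMS]
                  return urlunsplit((parts.scheme, parts.netloc.lower(),
                                     parts.path.rstrip("/"), urlencode(sorted(query)), ""))

              def load(path):
                  with open(path) as f:
                      return {normalize(line) for line in f if line.strip()}

              crawled = load("crawled_urls.txt")        # every URL found by your own crawl
              trafficked = load("trafficked_urls.txt")  # every URL your analytics tool recorded

              for url in sorted(crawled - trafficked):  # pages with no recorded visits
                  print(url)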


              --- In webanalytics@yahoogroups.com, "Debbie Pascoe" <dpascoe@...> wrote:
              >
              > Tagging onto Steve's comments, you may have pages that are getting NO
              > traffic - those pages would be ideal candidates for decommissioning
              > and archiving, as they are costing $$ while delivering no value in
              > return.
              >
              > Debbie Pascoe
              > MAXAMINE, Inc.
              >
              > > As with everything there is a balance.
              > > * Cost of maintaining old data - ranges from significant to beyond
              > > trivial. Is generally higher than most people realise tho.
              > > * Relevance of old data
              > > * Archive vs Delete (as you mention)
              > > * Value of old data. Is it complimentary to the rest of the site, or
              > > jarring; to hit the extremes.
              > > * Is the info duplicated elsewhere?
              > > * Transient in nature (eg news)
              > > * Incoming links?
              > > * Search Engine Results?
              > > * You mention a ratio of 0.1%. Are they "High Value" users? If so,
              > > perhaps the data should be updated and given greater relevance to the
              > > rest of the site?
              > >
              > >
              > > The summary point being that visitor numbers, IMHO, alone shouldn't
              > be the sole determinate in what should/shouldn't go....
              >
            • Tim Wilson
              Message 6 of 15, May 8, 2007
                I have a sneaking suspicion that MAXAMINE offers something that will
                help with this... ;-)



                Regards,

                Tim



                ________________________________

                From: webanalytics@yahoogroups.com [mailto:webanalytics@yahoogroups.com]
                On Behalf Of Paul Holstein
                Sent: Tuesday, May 08, 2007 10:09 AM
                To: webanalytics@yahoogroups.com
                Subject: [webanalytics] Re: minimum usage



                Debbie,

                That's a great point. Do you know of a way to get a report of pages
                with no traffic? As far as I can tell, Omniture doesn't do it.

                The only way I would think to do it would be to export a list of all
                my URLs (from my own crawl) and then compare it to a list of all
                trafficked URLs. The difficult part may be to parse out the URLs to
                de-dupe them (because of tracking codes and system parameters).
                Anyone have any other ideas?

                -- Paul

                --- In webanalytics@yahoogroups.com, "Debbie Pascoe" <dpascoe@...> wrote:
                >
                > Tagging onto Steve's comments, you may have pages that are getting NO
                > traffic - those pages would be ideal candidates for decommissioning
                > and archiving, as they are costing $$ while delivering no value in
                > return.
                >
                > Debbie Pascoe
                > MAXAMINE, Inc.
                >
                > > As with everything there is a balance.
                > > * Cost of maintaining old data - ranges from significant to beyond
                > > trivial. Is generally higher than most people realise tho.
                > > * Relevance of old data
                > > * Archive vs Delete (as you mention)
                > > * Value of old data. Is it complimentary to the rest of the site, or
                > > jarring; to hit the extremes.
                > > * Is the info duplicated elsewhere?
                > > * Transient in nature (eg news)
                > > * Incoming links?
                > > * Search Engine Results?
                > > * You mention a ratio of 0.1%. Are they "High Value" users? If so,
                > > perhaps the data should be updated and given greater relevance to the
                > > rest of the site?
                > >
                > >
                > > The summary point being that visitor numbers, IMHO, alone shouldn't
                > be the sole determinate in what should/shouldn't go....
                >





              • Steve
                Message 7 of 15, May 8, 2007
                  It's moderately tricky and typically very system-specific to discover
                  *all* pages on a site.
                  E.g. crawling assumes all your pages are even accessible via a crawl,
                  and it's dead easy to make pages that are inaccessible.
                  So in some cases you *may* need to be able to access the raw database
                  and infer.
                  This is why I prefer using search engine crawls as a baseline to infer
                  unseen/unused pages: they "remember" pages that are no longer
                  accessible via crawls, and hence to the average punter.
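
                  One possible baseline along those lines is the site's XML
                  sitemap (or any exported list of URLs the engine has
                  indexed); a minimal Python sketch, with the sitemap URL and
                  file name assumed for illustration:

                  # Sketch only: use the XML sitemap (or any exported list of indexed
                  # URLs) as the baseline and diff it against what analytics recorded.
                  # The sitemap URL and file name are assumptions.
                  import xml.etree.ElementTree as ET
                  from urllib.request import urlopen

                  SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

                  def sitemap_urls(sitemap_url):
                      """Yield every <loc> entry from a standard XML sitemap."""
                      with urlopen(sitemap_url) as resp:
                          tree = ET.parse(resp)
                      for loc in tree.iter(SITEMAP_NS + "loc"):
                          yield loc.text.strip()

                  baseline = set(sitemap_urls("https://www.example.com/sitemap.xml"))
                  with open("trafficked_urls.txt") as f:    # URLs seen by analytics
                      seen = {line.strip() for line in f if line.strip()}

                  print(sorted(baseline - seen))            # known pages with no recorded traffic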


                  There was a discussion in Nov last year related to this.
                  http://tech.groups.yahoo.com/group/webanalytics/message/8576?l=1

                  The bonus is that this is the sort of technically challenging question
                  that, if phrased appropriately, can get your IT folk falling all over
                  themselves to answer correctly. :-)


                  HTH?
                  Cheers!

                  - Steve

                  On 5/9/07, Paul Holstein <paul@...> wrote:
                  > Debbie,
                  >
                  > That's a great point. Do you know of a way to get a report of pages
                  > with no traffic? As far as I can tell, Omniture doesn't do it.
                  >
                  > The only way I would think to do it would be to export a list of all
                  > my URLs (from my own crawl) and then compare it to a list of all
                  > trafficked URLs. The difficult part may be to parse out the URLs to
                  > de-dupe them (because of tracking codes and system parameters).
                  > Anyone have any other ideas?
                • Debbie Pascoe
                  Message 8 of 15, May 8, 2007
                    Hi Paul,
                    The way to do it is to have an inventory of your pages, then compare
                    your inventory with pages that have traffic to them - this
                    comparison/gap analysis will give you the list of pages that had no
                    traffic - your "hit list" of candidates for retirement.

                    The key is to have the page inventory in the first place, so you're on
                    the right track re: doing a crawl; it sounds like the methodology
                    you're using is much harder than it has to be (de-duping, etc). And
                    while you might have the skills to create your own page collection
                    method, other organizations may not have that skill in-house, and
                    their efforts - if they attempt to tackle it in-house - may be
                    complicated by the development techniques used to create the site.

                    Advanced web design techniques such as complex JavaScript used for
                    menu creation, page tagging, client-side functionality, server-side
                    scripting methods for page creation, catalog sites, transactional
                    sites, navigation links buried in Flash, personalization via cookies,
                    and password-protected areas all present complexities WRT creating a
                    scanning engine to collect a page inventory.

                    That commercial where the doctor is advising the patient about how to
                    conduct an abdominal incision with a kitchen knife comes to mind :-O

                    I can help with an accurate, complete, coherent page inventory -
                    contact me off-line if you want to know more.

                    While I'm thinking about it, does anyone here have any current
                    cost-per-page figures WRT page maintenance?

                    Debbie Pascoe
                    MAXAMINE, Inc.



                    --- In webanalytics@yahoogroups.com, "Paul Holstein" <paul@...> wrote:
                    >
                    > Debbie,
                    >
                    > That's a great point. Do you know of a way to get a report of pages
                    > with no traffic? As far as I can tell, Omniture doesn't do it.
                    >
                    > The only way I would think to do it would be to export a list of all
                    > my URLs (from my own crawl) and then compare it to a list of all
                    > trafficked URLs. The difficult part may be to parse out the URLs to
                    > de-dupe them (because of tracking codes and system parameters).
                    > Anyone have any other ideas?
                    >
                    > -- Paul

                    >
                  • metronomelabs
                    Message 9 of 15, May 11, 2007
                      Passive Data Capture sees all traffic in both directions at the
                      HTTP/S, HTML, and TCP/IP levels. It cleans, filters and sessionizes in
                      real time and can emulate any log file format with any data you want,
                      including emulating tagging-server logs.

                      www.metronomelabs.com

                      --- In webanalytics@yahoogroups.com, "Paul Holstein" <paul@...> wrote:
                      >
                      > Debbie,
                      >
                      > That's a great point. Do you know of a way to get a report of pages
                      > with no traffic? As far as I can tell, Omniture doesn't do it.
                      >
                      > The only way I would think to do it would be to export a list of all
                      > my URLs (from my own crawl) and then compare it to a list of all
                      > trafficked URLs. The difficult part may be to parse out the URLs to
                      > de-dupe them (because of tracking codes and system parameters).
                      > Anyone have any other ideas?
                      >
                      > -- Paul
                      >
                      >
                      > --- In webanalytics@yahoogroups.com, "Debbie Pascoe" <dpascoe@> wrote:
                      > >
                      > > Tagging onto Steve's comments, you may have pages that are getting NO
                      > > traffic - those pages would be ideal candidates for decommissioning
                      > > and archiving, as they are costing $$ while delivering no value in
                      > > return.
                      > >
                      > > Debbie Pascoe
                      > > MAXAMINE, Inc.
                      > >
                      > > > As with everything there is a balance.
                      > > > * Cost of maintaining old data - ranges from significant to beyond
                      > > > trivial. Is generally higher than most people realise tho.
                      > > > * Relevance of old data
                      > > > * Archive vs Delete (as you mention)
                      > > > * Value of old data. Is it complimentary to the rest of the site, or
                      > > > jarring; to hit the extremes.
                      > > > * Is the info duplicated elsewhere?
                      > > > * Transient in nature (eg news)
                      > > > * Incoming links?
                      > > > * Search Engine Results?
                      > > > * You mention a ratio of 0.1%. Are they "High Value" users? If so,
                      > > > perhaps the data should be updated and given greater relevance to the
                      > > > rest of the site?
                      > > >
                      > > >
                      > > > The summary point being that visitor numbers, IMHO, alone shouldn't
                      > > be the sole determinate in what should/shouldn't go....
                      > >
                      >