
Re: [CMMi Process Improvement] defining complexity

  • Amit
    Message 1 of 4, Nov 29, 2005
      Agreed with Pat; however, if you still want to define complexity, you may adopt a use-case-based approach.

      Define complexity based on the number of transactions in each use case.
      E.g., more than 10 transactions per use case: complexity weight of 1;
      more than 5 transactions per use case: complexity weight of 0.5.

      List out the use cases for the module,
      define the transactions for each use case, and then:

      <number of use cases with more than 10 trans.> * 1 + <number of use cases with 6-10 trans.> * 0.5 = <total>, which is the complexity of that module.
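      For instance, a minimal sketch in Python (the module's use-case names and transaction counts below are only illustrative):

```python
# Minimal sketch of the module-complexity calculation described above.
# The thresholds follow the 10/5-transaction rule; the example data are illustrative only.

# Hypothetical module: transaction count per use case
transactions_per_use_case = {
    "Create order": 12,
    "Cancel order": 7,
    "View order history": 3,
}

def module_complexity(use_cases):
    """Weight each use case by its transaction count and sum the weights."""
    total = 0.0
    for count in use_cases.values():
        if count > 10:
            total += 1.0   # high-complexity use case
        elif count > 5:
            total += 0.5   # medium-complexity use case
        # use cases with 5 or fewer transactions add nothing in this simple scheme
    return total

print(module_complexity(transactions_per_use_case))  # -> 1.5
```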
       
       

      Patrick OToole <PACT.otoole@...> wrote:

      Will,

      When a builder receives the contract to build a new subdivision of 100
      cookie-cutter starter homes, s/he must carefully estimate the materials for
      the first house.  They carefully calculate the number of boards, nails,
      shingles, pipes, tiles, etc. they expect to use.  As this first house nears
      completion, one can reasonably expect that  the builder might need to take a
      few trips to the local hardware superstore to acquire a few of the
      underestimated materials.  Similarly, one might expect the builder to have
      extra sets of some of the other materials.

      The estimates for house #2 will again be detailed, but they will be much
      more accurate based on the experience gained with house #1.  Assuming that
      the houses are sufficiently similar in design, there should be significantly
      fewer trips to the hardware store, and much less leftover material.

      By the time the builder gets to house #4 or #5, they no longer have to estimate
      at the micro-level of detail.  They simply say, "I'm building another house
      #3 with the following changes..."  The basic unit of measure goes from
      boards, nails, and shingles, to "another house #3."  The deviations may be
      estimated at the micro-level of detail, but the largest portion of the
      estimate will be at the macro-level.

      Staying in the housing domain, but moving from new homes to the sale of
      existing homes, think about the real estate report's use of "comps."  The
      report of your home's projected value shows several "comparable" homes in
      the same neighborhood, and then details the "pluses" and "minuses" between
      your home and each comp (e.g., + swimming pool, +$10,000; - finished
      basement, -$4,000; + upgraded kitchen, +$5,000; + 3-season porch, +$8,000,
      etc.)  Imagine if the real estate broker had to estimate the value of your
      home based on boards, nails, shingles, pipes, etc.!

      It sounds like you're facing a similar situation with your project
      estimates.  Although probably not as "cookie-cutter" as the builder or real
      estate broker example, it is still reasonable to use the "referent project"
      approach if there is sufficient similarity between your projects.  Your
      primary unit of measure is "another Project ABC," while the differences
      between the current project and the referent project are documented and
      estimated in a +/- way.
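      In arithmetic terms, that is just the referent project's actuals plus the signed deltas; a minimal sketch (with purely hypothetical numbers) might look like this:

```python
# Minimal sketch of referent-project ("comps") estimation.
# The referent value and the deltas are purely hypothetical.

referent_hours = 2400  # actual effort recorded for the referent, "Project ABC"

# Signed adjustments for how the new project differs from the referent
deltas = {
    "extra reporting module": +300,
    "no data migration this time": -150,
    "simpler UI theme": -80,
}

estimate = referent_hours + sum(deltas.values())
print(f"Estimated effort: {estimate} hours")  # -> Estimated effort: 2470 hours
```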

      I find the referent project approach to be a perfectly acceptable means of
      estimating size, and it sounds like your organization does too.  If this is
      the way that you're currently getting from project requirements to effort
      and schedule estimates - and it's working for you - then keep doing it!

      Regards,

      Pat




      ----- Original Message -----
      From: "WoolmanGraphics" <wwoolman@...>
      To: <cmmi_process_improvement@yahoogroups.com>
      Sent: Tuesday, November 29, 2005 10:54 AM
      Subject: [CMMi Process Improvement] defining complexity


      > We are in a dilemma when it comes to sizing our projects. Currently
      > we do hour/size estimates on our future projects based on past hour
      > estimates and hours spent on past projects. As our projects
      > are "similar" to one another we generally find projects that have
      > similar requirements and deliverables and base our estimates on these
      > projects. Nowhere do we "define" the complexity of the deliverables.
      > Right now a gauge for complexity does not exist for our group. We
      > size our projects based on past projects of similar requirements.
      >
      > Any advice into what we can do to meet the intent of the
      > model?
      >
      > What I would like to do is to gather the data we have today, take
      > data from future projects and use this information to define a scale
      > of complexity to be used for future projects. However that does not
      > do much to help us out today.
      >
      > Thanks for your help
      >
      > Will




    • Buglione Luigi
      Message 2 of 4, Jan 5, 2006
        Hi Will,
         
        sorry for adding some further thoughts on your questions a bit late ;-(
        Following Amit's suggestion, if you measure the size of your projects with a Functional Size Measurement method (such as FPA or COSMIC-FFP), another possible classification - after considering, of course, the complexity of its basic elements - could be given directly by the final project size. For instance, you could take a look at these two papers, which work in the same direction:
        * a 2005 paper by GUFPI-ISMA SBC (http://www.dpo.it/smef2005/filez/proceedings.pdf, pp. 39-48), which analyzed the ISBSG r9 database for both development and enhancement/maintenance projects
        * another paper by Grant Rule (http://www.gifpa.co.uk/library/Papers/Rule/RulesRelativeSizeScale%20v1b.pdf), also linked on the ISBSG website

        Referring to Pat's message, the overall functional size could be seen as the macro-level, while the complexity of the inner basic elements is the micro-level, giving you the detail about the distribution of the different kinds of data and transactions.
        As usual, the best solution will probably lie somewhere in the middle, taking both levels into account in the analysis. For instance, one way could be to plot the projects on a scatter diagram using the two criteria (or a mix of the two, if useful) and look for the grouping that best matches your own knowledge of those projects. This could be a way to better understand the relationships among the different elements, allowing you to define your own formula for "complexity" to be used for future estimates.
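        To make that concrete, here is a minimal sketch in Python (the project names, sizes, and complexity scores are purely hypothetical, just to show the kind of plot meant here):

```python
# Minimal sketch of the scatter diagram described above.
# All project names and numbers are hypothetical, for illustration only.
import matplotlib.pyplot as plt

# (project, overall functional size [macro-level], avg. basic-element complexity [micro-level])
projects = [
    ("Proj A", 120, 0.4),
    ("Proj B", 480, 0.9),
    ("Proj C", 250, 0.6),
    ("Proj D", 700, 1.1),
]

sizes = [size for _, size, _ in projects]
complexities = [cplx for _, _, cplx in projects]

plt.scatter(sizes, complexities)
for name, size, cplx in projects:
    plt.annotate(name, (size, cplx))  # label each point with its project name

plt.xlabel("Functional size (macro-level)")
plt.ylabel("Basic-element complexity (micro-level)")
plt.show()
```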
         
        Hoping this is of some help,
         
        Best regards,
      • Lisa Michela
        Message 3 of 4, Jan 5, 2006
          Hi,
          To sum it up:
          What are the two best and SIMPLE techniques for estimating Cost and Effort for small to medium projects?
          Liz
           
           

