Re: [scrumdevelopment] Completeness definition

  • Hubert Smits
    Message 1 of 15, Mar 3, 2005
      Hi guys,

      > > > Maybe your production cycle is wrong then.
      > >
      > > You are probably correct! We are just now learning the ropes here...
      > >
      > > > SCRUM quite clearly states
      > > > that you need to be able to release a _complete_ product at the end of
      > > > every Sprint. That does not necessarily mean that you have all the
      > > > Features your Product Owner wanted and it does not mean that
      > > > everything that has been planned in the backlog is available yet, but
      > > > it does mean that I can tell you at the end of _any_ Sprint -->"go
      > > > live".
      > >
      > > These tests often
      > > take 2-3 weeks to complete on their own. We can't accelerate this
      > > time because stress testing requires running the system under load for
      > > a specified duration. Call me a non-purist, but I'm not willing to pay
      >
      > That is a good attitude. SCRUM is indeed no silver bullet and you will
      > always have to compromise at some point. I guess you are doing the
      > right thing there. Such large scale tests could _maybe_ be seen as
      > their own Sprint though?
      > >
      > > If I can't truly "GO LIVE" at the end of each sprint, does that mean
      > > that I'm not doing Scrum?
      >
      > Technically speaking you are not doing "perfect" scrum. The scrum
      > definition clearly states that you _have_ to be able to deliver at
      > the end of every sprint. However I do not think that this will ever be
      > possible in your particular setup.

      I would define 'ready' as the software being in a state where the
      development team has no reasonable doubt about, and no knowledge of,
      the software failing the stress testing. I.e. they can demo it, all
      tests have run (except the stress testing), docs are there, etc. If the
      company decided to drop the stress tests, the product could be
      shrink-wrapped.
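
      To make that concrete, here is a rough sketch of how a team could encode
      such a readiness gate and check it at the end of a Sprint. The gate names
      and the deferred stress test below are invented for illustration, not a
      prescription:

          # Hypothetical end-of-Sprint readiness gate. The gate names and the
          # "deferred" stress-test exemption are illustrative assumptions only.

          DEFERRED = {"stress_test"}  # long-running tests handled outside the Sprint

          def ready_to_ship(gates):
              """True when every gate except the deferred ones has passed."""
              return all(passed for name, passed in gates.items()
                         if name not in DEFERRED)

          sprint_gates = {
              "unit_tests": True,
              "regression_tests": True,
              "acceptance_tests": True,
              "docs_delivered": True,
              "stress_test": False,   # still running in the lab
          }
          print("Potentially shippable:", ready_to_ship(sprint_gates))  # True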


      > >
      > > > That also implies that this portion of the system is
      > > >
      > > > Regression Tested
      > > > Unit tested
      > > > User Acceptance tested
      > > > Code has been refactored
      > > > Test Driven Development has been used
      > > > The code is bug free (as opposed to being "believed bug-free")
      > >
      > > This is the sort of stuff I can live with and makes sense. I *have*
      > > wondered about the "Code has been refactored" requirement. What if
      > > code is well factored to start with? Do we always need to refactor?
      > >
      > Refactoring is a big part of Scrum and test driven development in my

      Scrum defines no engineering practices. That is not to say that
      refactoring or TDD isn't important, it is just not part of Scrum.

      > > For those of you out there that provide technical documentation for
      > > products that you deliver, what requirements do you place on docs
      > > within the sprint (I know Paul responded to this point in another
      > > response, but I'm curious about what others do).

      If it is a requirement for the potentially shippable product then it
      has to be delivered.

      --Hubert
    • Edwin Miller
      Message 2 of 15, Mar 3, 2005
        This is something we struggle with as well.  We are near the end of a release cycle that has taken 5 months to complete.  We changed an underlying data architecture that broke 20+ modules of our application.  We can't ship part of it without all 20 modules being brought into compliance with the new design.  Sure, the work can be chunked into 30 day sprints, and we can build the application with localized testing for the module we just completed, but it can't go to production until everything's done.  It satisfies the test of being able to show the product owner running code, but it does not equal a releasable product in and of itself.
         
        We also have a sprint devoted to what I call "ship-mode" (don't say it out loud or people will look at you funny), which is essentially all the prep work required to move the code to production.  This can include final regression, rehearsing the migration scripts (which involves migration to a staging environment and then doing regression and load testing on bigger hardware), internal communication and all the things that go with "productifying" the work we've done.
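
        A minimal sketch of how such a rehearsal could be scripted so it stays
        repeatable from staging migration through load testing -- every step
        name and command below is invented for illustration, not our actual
        tooling:

            # Hypothetical "ship-mode" rehearsal runner. All step names and
            # commands are made up; a real pipeline would call your own
            # migration and test scripts.
            import subprocess
            import sys

            REHEARSAL_STEPS = [
                ("migrate staging database",  ["./migrate.sh", "--target", "staging"]),
                ("regression suite",          ["./run_regression.sh", "--env", "staging"]),
                ("load test on big hardware", ["./run_load_test.sh", "--env", "staging"]),
            ]

            def rehearse():
                for name, cmd in REHEARSAL_STEPS:
                    print("== " + name + " ==")
                    if subprocess.run(cmd).returncode != 0:
                        print("rehearsal failed at: " + name)
                        return 1
                print("rehearsal complete - migration scripts look release-ready")
                return 0

            if __name__ == "__main__":
                sys.exit(rehearse())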
         
        I'll be the first to admit that after using Scrum for almost 2 years and with two CSMs on board, we have by no means mastered it.  This is just one of many topics that requires careful consideration and thoughtful "inspection and adaptation".  Schwaber talks about the "empirical process", which is kind of anathema to following rules blindly (i.e. a defined process).
         
        The other major controversial topic around here is what level of requirements detail is brought into the Scrum team, but that deserves its own thread.
         
        Edwin Miller
        digiChart, Inc.
         


        From: Chamberlain, Eric [mailto:echamber@...]
        Sent: Wednesday, March 02, 2005 7:43 PM
        To: scrumdevelopment@yahoogroups.com
        Subject: RE: [scrumdevelopment] Completeness definition

        Personally, I think the journey towards creating a potentially shippable product is what you should be thinking about rather than fulfilling the letter-of-the-rule.  My current Scrum team doesn't deliver anything potentially shippable at the end of its sprints--yet.  I am working within the confines of the organization, the technology, and all the other factors to get there but I am not there yet. 

        I think if your acceptance testing is as time-consuming and expensive as you say, then maybe you need to lower the bar a bit for end-of-sprint acceptance; if the product owner gives the thumbs up (we like it!), you cut your potential release and go through the whole testing deal.  But the product owner should be able to see something before the testing starts so as to make a wise assessment.  That "something" is what you deliver at the end of each Sprint--something that the customer (representative) can kick around and evaluate.

        My 2 cents.

        == Eric Chamberlain ==

        -----Original Message-----
        From: Chris Brooks [mailto:brookscl@...]
        Sent: Wednesday, March 02, 2005 9:26 AM
        To: scrumdevelopment@yahoogroups.com
        Subject: Re: [scrumdevelopment] Completeness definition


        On Wed, 2 Mar 2005 12:38:11 +0100, David H. <dmalloc@...> wrote:
        > Maybe your production cycle is wrong then.

        You are probably correct!  We are just now learning the ropes here...

        > SCRUM quite clearly states
        > that you need to be able to release a _complete_ product at the end of
        > every Sprint. That does not necessarily mean that you have all the
        > Features your Product Owner wanted and it does not mean that
        > everything that has been planned in the backlog is available yet, but
        > it does mean that I can tell you at the end of _any_ Sprint -->"go
        > live".

        To give a concrete example: we supply online banking software for over 25% of the US population, including 4 of the top 10 US banks. Before we can safely ship new platform releases to clients, we need to validate in a rather large stress lab (often involving 60+ servers) certain levels of scalability and availability.  These tests often take 2-3 weeks to complete on their own.  We can't accelerate this time because stress testing requires running the system under load for a specified duration. Call me a non-purist, but I'm not willing to pay the cost of this sort of testing within each sprint; in fact, we often rent lab space from IBM, Microsoft, et al. to achieve this work, so travel is required, there are fixed costs, etc.  So, when I say we really can't call a sprint truly releasable until certain other activities are done, that's what I mean.  I don't think Scrum is going to do anything for me to help address issues like this, and I'm not looking for any silver bullets.  Our current application of Scrum involves planning for 1-2 release sprints at the end of a platform release for this very purpose.
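
        Just to illustrate why that duration can't be compressed: even the most
        minimal soak-test harness is defined by wall-clock time under load, so
        adding hardware or parallelism doesn't shorten it. A toy sketch -- the
        target URL, worker count, and duration are made-up illustration values,
        nothing like the real lab setup:

            # Toy load-generation sketch: the run length is fixed wall-clock
            # time, which is why this class of test cannot be squeezed into a
            # shorter slot. Endpoint, workers, and duration are hypothetical.
            import threading
            import time
            import urllib.request

            TARGET = "http://staging.example.com/health"   # hypothetical endpoint
            DURATION_S = 8 * 60 * 60                        # e.g. one 8-hour soak slice
            WORKERS = 20

            def hammer(deadline, results):
                ok = err = 0
                while time.time() < deadline:
                    try:
                        urllib.request.urlopen(TARGET, timeout=5)
                        ok += 1
                    except Exception:
                        err += 1
                results.append((ok, err))

            def run_soak():
                deadline = time.time() + DURATION_S
                results = []
                threads = [threading.Thread(target=hammer, args=(deadline, results))
                           for _ in range(WORKERS)]
                for t in threads:
                    t.start()
                for t in threads:
                    t.join()
                print("requests ok=%d failed=%d" %
                      (sum(r[0] for r in results), sum(r[1] for r in results)))

            if __name__ == "__main__":
                run_soak()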

        If I can't truly "GO LIVE" at the end of each sprint, does that mean that I'm not doing Scrum?

        > That also implies that this portion of the system is
        >
        > Regression Tested
        > Unit tested
        > User Acceptance tested
        > Code has been refactored
        > Test Driven Development has been used
        > The code is bug free (as opposed to being "believed bug-free")

        This is the sort of stuff I can live with and makes sense.  I *have* wondered about the "Code has been refactored" requirement.  What if code is well factored to start with?  Do we always need to refactor?

        For those of you out there that provide technical documentation for products that you deliver, what requirements do you place on docs within the sprint (I know Paul responded to this point in another response, but I'm curious about what others do).

        --
        Chris Brooks
        http://www.chrisbrooks.org


      • David H.
        Message 3 of 15, Mar 3, 2005
          > > >
          > > > This is the sort of stuff I can live with and makes sense. I *have*
          > > > wondered about the "Code has been refactored" requirement. What if
          > > > code is well factored to start with? Do we always need to refactor?
          > > >
          > > Refactoring is a big part of Scrum and test driven development in my
          >
          > Scrum defines no engineering practices. That is not to say that
          > refactoring or TDD isn't important, it is just not part of Scrum.
          >
          I beg to differ. But I guess that is purely philosophical. To me it is
          part of Scrum because Scrum does define a methodology that leads to
          certain processes. Maybe one could argue that it is more part of test
          driven development than Scrum. But then again, who cares :)

          -d
        • Chris Brooks
          Message 4 of 15, Mar 3, 2005
            On Thu, 3 Mar 2005 19:55:23 +0100, David H. <dmalloc@...> wrote:
            >
            > > > >
            > > > > This is the sort of stuff I can live with and makes sense. I *have*
            > > > > wondered about the "Code has been refactored" requirement. What if
            > > > > code is well factored to start with? Do we always need to refactor?
            > > > >
            > > > Refactoring is a big part of Scrum and test driven development in my
            > >
            > > Scrum defines no engineering practices. That is not to say that
            > > refactoring or TDD isn't important, it is just not part of Scrum.
            > >
            > I beg to differ. But I guess that is purely philosophical. To me it is
            > part of Scrum because Scrum does define a methodology that leads to
            > certain processes. Maybe one could argue that it is more part of test
            > driven development than Scrum. But then again, who cares :)

            I should clarify my original comment. Fowler's definition of
            refactoring is "the process of changing a software system in such a
            way that it does not alter the external behavior of the code yet
            improves its internal structure."

            My issue was with this statement as a suggested criterion for
            completeness: "Code has been refactored". Refactoring describes a
            process, not an end state. I would rather make the criterion something
            like "Code is well factored" and describe a desirable state. To
            suggest that all new code written must be refactored is a bit strong,
            IMHO. Certainly there are cases where the initial implementation of
            software is reasonably well factored.
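
            To put a contrived example behind that (not code from any project
            discussed in this thread): the change below leaves external behavior
            untouched while improving internal structure, and the "after"
            version is what I would call well factored as an end state.

                # Before: pricing rule buried inline with a magic number.
                def invoice_total_before(items):
                    total = 0
                    for qty, price, kind in items:
                        if kind == "book":
                            total += qty * price * 0.95   # books get 5% off
                        else:
                            total += qty * price
                    return total

                # After: the discount rule is extracted and named;
                # callers see no difference in behavior.
                BOOK_DISCOUNT = 0.95

                def line_total(qty, price, kind):
                    discount = BOOK_DISCOUNT if kind == "book" else 1.0
                    return qty * price * discount

                def invoice_total_after(items):
                    return sum(line_total(qty, price, kind)
                               for qty, price, kind in items)

                # Same external behavior before and after the refactoring:
                items = [(2, 10.0, "book"), (1, 4.0, "pen")]
                assert invoice_total_before(items) == invoice_total_after(items)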

            --
            Chris Brooks
            http://www.chrisbrooks.org
          • David A Barrett
            Message 5 of 15, Mar 4, 2005
              >Sure, the work can be chunked into 30
              >day sprints, and we can build the application with localized testing for
              >the module we just completed, but it can't go to production until
              >everything's done.

              I think, just like the "pigs and chickens" thing (which is only about who
              gets to speak in a daily scrum), people lose track of what the Scrum "rule"
              about "potentially implementable" features is all about.

              There isn't any rule that your entire product has to be deployable at the
              end of every Sprint. IMHO, the rule about making each Sprint Backlog item
              a "potentially implementable" feature has two purposes:

              1. To keep the team focused on "functionality". Creating an artifact or
              investigating an approach is not a valid SB item. You need something that
              you can demo to the user at the end and show forward movement on
              functionality.

              2. To keep the team focused on finishing. Starting something doesn't
              count. Finishing it does. Size things so that they can be completed, even
              if this means that the incremental gain in functionality is so small as to
              be useless to the end user as a practical matter.


              I don't see any conflict here with scheduling releases to occur some
              quantity of Sprints in the future, nor do I see any conflict with dealing
              with necessary pre-release activities. I don't think you need to be filled
              with angst because some feature that you've developed can't be "released"
              until the whole system has been stress tested for 2 weeks in a lab. The
              rules do make you think about what you are doing, and potentially knock out
              a whole pantload of activities as valid SB items. For instance:

              1. Documentation
              2. UAT
              3. Unit Testing
              4. User Training
              5. Investigation
              6. Bug fixing (as an open-ended, general activity)

              Wouldn't ordinarily qualify on their own. Instead, if a feature needs those
              items completed in order to be considered "potentially implementable", then
              they should be included in the SB item for the development of that feature.
              Even here, though, you may need to make exceptions. For instance, you may
              have a separate documentation department, who work on their own schedule
              and need to see a final version of the product before they will update the
              documentation. The spirit of the thing is the most important: only take on
              as much stuff as you can finish, and be clear about what "finishing" means.

              Dave Barrett,
              Lawyers' Professional Indemnity Company