Re: [XP] The Definition of Done

  • Ron Jeffries
    Message 1 of 31, Oct 29, 2009
      Hello, Markus. On Thursday, October 15, 2009, at 6:42:21 PM, you
      wrote:

      > the connection is rather weak. I realized that later, too. The point I had in
      > mind is that I would leave TDD out of the "tested" definition there. What would
      > be left under "tested" by then, apart from exploring as in Exploratory Testing,
      > are the Acceptance Tests or Customer Tests, or Business-facing tests when
      > considering the Agile Testing quadrants. If you apply ATDD and define the
      > specifications for your stories directly in executable acceptance tests using
      > FIT, Concordion or whatever turns out to be useful for you, have them automated
      > and watch them pass at the iteration demonstration, you have just done this.
      > You have discussed the business-facing tests for the story with your customer,
      > developed them, seen them pass, and therefore delivered the actual story; it's
      > done.

      > Did you get my Model of "Tested" in this sense? What did you have in mind
      > regarding "Tested"?
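
      To make that concrete (an illustrative sketch, not code from the thread), a
      FIT column fixture backing such a customer test might look roughly like this
      in Java. The DiscountFixture name and the 10%-over-100 discount rule are
      invented for the example; only fit.ColumnFixture is FIT's actual base class.

          import fit.ColumnFixture;

          // FIT fills the public field from each input column of the customer's
          // table, then calls discount() and compares the result with the
          // expected value in the "discount()" column.
          public class DiscountFixture extends ColumnFixture {
              public double orderTotal;            // input column

              public double discount() {           // computed column
                  // A real fixture would delegate to the production code;
                  // the rule below exists only to make the sketch runnable.
                  return orderTotal > 100.0 ? orderTotal * 0.10 : 0.0;
              }
          }

      The customer's HTML table names DiscountFixture in its first row and has
      columns for orderTotal and discount(); when the whole suite of such tables
      passes at the iteration demonstration, that is what "done" means for the
      story.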

      I think I agree with what you are saying. At least, I agree with
      this:

      A good model for acceptance in Agile projects is that automated
      Customer Tests, defined and agreed to at the beginning of the
      iteration, should be used to indicate that a story is "done".

      This will not result in perfection. Exploratory testing and users
      will often discover things that need to be fixed. To deal with
      these, two steps are particularly valuable:

      First, treat these changes as new stories, rather than waste time
      arguing over whether they should have been done "correctly". We
      can't edit the past: our purpose is to move forward. Schedule the
      changes and move on.

      Second, examine these occurrences with an eye to improving the
      process. Are there kinds of tests, kinds of discussions that, if
      present, would eliminate this kind of issue? If so, consider
      putting those things in place. This will improve the process,
      making it more and more "true" that the tests are the
      requirements. The team will become better and better.

      Ron Jeffries
      www.XProgramming.com
      www.xprogramming.com/blog
      Show me the features!