Re: [scrumdevelopment] Re: Scrum and Traceability

  • George Dinwiddie
    Message 1 of 137, Mar 2, 2010
      Hi, Scott,

      scott preece wrote:
      > Not directly responding to Ron's note, but I'm curious...
      >
      > So, for those of you who treat automated tests as the documentation
      > of your requirements, do you do anything to trace them back to the
      > stories they come from?

      With cucumber, it's common practice to paste the story into the test. I
      think most acceptance test frameworks make this pretty easy.
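
      For illustration only, a feature file written that way might look roughly like
      the sketch below; the story text, feature name, and steps are invented, not
      taken from a real project:

        # User story pasted verbatim from the backlog, so the test traces back to it
        Feature: ATM cash withdrawal
          As an account holder
          I want to withdraw cash from an ATM
          So that I can get money when the bank is closed

          Scenario: Successful withdrawal within the available balance
            Given my account balance is 100 dollars
            When I withdraw 40 dollars
            Then I receive 40 dollars in cash
            And my remaining balance is 60 dollars

      The step wording stays in the customer's language, so the scenario doubles as
      the record of the story it came from.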

      > Also, when you find, as a result of a new story, that you have to
      > change an existing test (implicitly changing the requirement that led
      > to that test), do you have any way to trace that test back to the
      > story it came from and the stakeholder responsible for that story, so
      > you can verify that the change is acceptable?

      If a new story conflicted with an old one, I'd have an immediate
      conversation with the product owner. One thing I would /NOT/ do is have
      multiple stakeholders giving direction to the team and expecting the
      developers to sort out the conflicts on their own. That is a really
      /bad/ idea.

      - George

      --
      ----------------------------------------------------------------------
      * George Dinwiddie * http://blog.gdinwiddie.com
      Software Development http://www.idiacomputing.com
      Consultant and Coach http://www.agilemaryland.org
      ----------------------------------------------------------------------
    • john_hermann
      Message 137 of 137, Apr 20, 2010
        @Mark
        <quote>
        Couldn't we write the tests such that they don't look like tests, but rather requirements?

        With one, and only one formal specification, which also happens to be executable against the actual system, aren't we better off than having to split time between two possibly out-of-sync artifacts?
        </quote>
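
        To make that idea concrete, a requirement written directly as an executable
        specification might read something like this rough Gherkin sketch (names and
        numbers invented for illustration):

          Feature: Loyalty discount
            Customers who have placed ten or more orders receive a 5% discount
            on every subsequent order.

            Scenario: Discount applied to a loyal customer
              Given a customer who has placed 12 previous orders
              When that customer places an order totalling 200 dollars
              Then the order total charged is 190 dollars

        Read top to bottom it states the rule; run against the system it verifies the
        rule, so there is only one artifact to keep in sync.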

        ThoughtWorks has a testing tool called Twist, which uses something called Business Workflows. And now it has a nestable declarative aggregator called a "Concept" (what a concept!).

        http://www.thoughtworks-studios.com/agile-test-automation
        <snip>
        Twist is... designed to help you deliver applications fully aligned with your business. It eliminates requirements mismatch as business users directly express intent in their domain language.
        </snip>

        I have not used the tool myself. If anyone has, please add some insight.

        -johnny
        P.S. I have no affiliation w/ ThoughtWorks.


        --- In scrumdevelopment@yahoogroups.com, "woynam" <woyna@...> wrote:
        >
        >
        >
        > --- In scrumdevelopment@yahoogroups.com, "pauloldfield1" <PaulOldfield1@> wrote:
        > >
        > > (responding to George)
        > >
        > > > I feel like a broken record with my questions.
        > >
        > > I guess I need to learn to answer you better :-)
        > >
        > > > pauloldfield1 wrote:
        > > > > IMHO Traceability, of itself, has no value. However some of the
        > > > > things that we DO value may be achieved readily if we have
        > > > > Traceability.
        > > >
        > > > What are those things?
        > >
        > > Well, I gave you a list of 15 things that some people value.
        > > I guess we could take a lead from Hillel's sig line and say
        > > they are all various categories of attempting to use process
        > > to cover for us being too stupid to be agile.
        > >
        > > We value knowing that we are testing to see that our system does
        > > what the customer wants (but we're too stupid to write the
        > > requirements directly as tests)... etc. etc.
        >
        > And this continues to irk the sh*t out of me. Why do we create another intermediate artifact that has to be translated by an error-prone human into a set of tests? What does the requirements document provide that the tests don't? Couldn't we write the tests such that they don't look like tests, but rather requirements?
        >
        > With one, and only one formal specification, which also happens to be executable against the actual system, aren't we better off than having to split time between two possibly out-of-sync artifacts?
        >
        > If you continue to have a separate requirements document, and your tests don't reflect the entirety of the requirements, what mechanism do you use to verify the uncovered requirements? How is that working for you?
        >
        > Mark
        >
        > "A man with one watch knows what time it is; A man with two watches is never quite sure."
        >
        >
        > >
        > > Paul Oldfield
        > > Capgemini
        > >
        >