Re: Scrum and Traceability
Hi Ron,
But I understand Matt's comment as "this is the best reason to think about having traceability". I don't see him saying it's cost-effective, nor suggesting a tool or diagram; he's just stating what it can bring to the table. I think it's good when a Scrum team can quickly identify what would need to change in the code for a requirement change, and I'm not sure that can be accomplished just by doing Scrum. I believe some XP practices like collective code ownership and pair programming improve traceability because they improve the whole team's knowledge of the system. Other ways to improve traceability could be agreeing on the format of the javadocs and on the naming of the tests (well, writing cleaner code in general). Wouldn't that improve traceability in an Agile way?
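To make the naming-convention idea concrete, here is a minimal sketch. The requirement ID (REQ-042), the discount rule, and every name in it are invented for illustration; the point is only that when the test name and javadoc carry the requirement, a plain text search recovers the requirement-to-test trace with no extra tool:

```java
// Hypothetical sketch: REQ-042 and the discount rule are invented.
// Grepping the codebase for "REQ-042" finds both this test and any
// javadoc that mentions the same requirement.
public class DiscountPolicyTest {

    /** Traces to REQ-042: orders above 10,000 cents get a 10% discount. */
    static void req042_ordersAbove100DollarsGetTenPercentDiscount() {
        if (priceAfterDiscountCents(12_000) != 10_800)
            throw new AssertionError("REQ-042 violated for a 120.00 order");
        if (priceAfterDiscountCents(8_000) != 8_000)
            throw new AssertionError("REQ-042 must not discount small orders");
    }

    // Production logic stubbed inline to keep the sketch self-contained;
    // integer cents avoid floating-point rounding in the comparison.
    static long priceAfterDiscountCents(long cents) {
        return cents > 10_000 ? cents - cents / 10 : cents;
    }

    public static void main(String[] args) {
        req042_ordersAbove100DollarsGetTenPercentDiscount();
        System.out.println("REQ-042 trace ok");
    }
}
```

Nothing here requires a traceability matrix: the convention itself is the trace.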
--- In email@example.com, Ron Jeffries <ronjeffries@...> wrote:
> Hello, Matt.
> I wrote:
> >> Hello, Matt. On Monday, March 1, 2010, at 10:45:26 AM, you wrote:
> >> > One of the best reasons for traceability is for planning, particularly
> >> > in mature projects or projects close to release. With traceability, my
> >> > product managers can have a discussion with the team and when they say
> >> > "I would like to change this requirement" or "I would like to modify
> >> > that operational behavior", the team can draw the traceability diagram
> >> > and let the PM know all the things that will need to be retested and
> >> > validated and estimates can be made to determine whether there is
> >> > adequate ROI for the change. It becomes part of the "costing" for the
> >> > new or changed features.
> >> Why don't they just tell people how much it'll cost based on knowing
> >> what they are doing?
> On Monday, March 1, 2010, at 7:19:48 PM, you replied:
> > You know, I never thought of that.
> > Clearly I am far too naive and unschooled to have realized that.
> > Thank you so so very much Ron, for pointing this out.
> If in fact the above is sincere thanks, well, you're welcome.
> On the off chance that it was meant sarcastically: It turns out that
> teams who are working with Scrum as intended are a fully
> cross-functional team working together to produce running tested
> software every couple of weeks. To accomplish this, they become
> intimately conversant with how to write software in this particular system.
> They /know/ what has to be changed to make something happen. They
> /know/ how long it takes them to do things. They know this because
> every two weeks they look at a new batch of stories to be
> accomplished, commit to a volume of work, and two weeks later ship
> something that works. When the volume of that something doesn't
> quite match the volume they had in mind, they discuss, in the
> retrospective that they do every time, what happened, why it
> happened, and what they learned.
> This improves their knowledge of what has to be considered, what has
> to be changed, how long it takes.
> They get very good at doing this. They neither need nor rely on
> "traceability" documents or tools, because they are fully conversant
> with the work they are doing. Their planning goes faster, is more
> based in what the team can really do, and involves far less overhead.
> And that's the truth.
> Ron Jeffries
> I have tried in my way to be free. -- Leonard Cohen
Couldn't we write the tests such that they don't look like tests, but rather requirements?
With one, and only one formal specification, which also happens to be executable against the actual system, aren't we better off than having to split time between two possibly out-of-sync artifacts?
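For what it's worth, here is one hedged sketch, in plain Java with no test framework, of what a test written to read as a requirement might look like. The account domain and every name in it are invented for illustration:

```java
// Hypothetical sketch: a test phrased as an executable requirement.
// The method name and the messages passed to check() ARE the requirement
// text, so a failure quotes the requirement that was violated.
public class WithdrawalRequirements {

    /** "A withdrawal may not exceed the account balance." */
    static void aWithdrawalMayNotExceedTheAccountBalance() {
        Account account = new Account(50);
        boolean allowed = account.withdraw(80);
        check(!allowed, "an over-limit withdrawal must be refused");
        check(account.balance() == 50, "the balance must be unchanged");
    }

    static void check(boolean condition, String requirement) {
        if (!condition) throw new AssertionError("Violated: " + requirement);
    }

    // Minimal domain stub so the sketch runs on its own.
    static class Account {
        private int balance;
        Account(int opening) { balance = opening; }
        boolean withdraw(int amount) {
            if (amount > balance) return false;
            balance -= amount;
            return true;
        }
        int balance() { return balance; }
    }

    public static void main(String[] args) {
        aWithdrawalMayNotExceedTheAccountBalance();
        System.out.println("all requirements hold");
    }
}
```

Tools like FIT/FitNesse and the BDD frameworks push the same idea further by letting non-programmers read (or even write) the requirement text, but even bare code like this collapses the two artifacts into one.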
ThoughtWorks has a testing tool called Twist, which uses something called Business Workflows. And now it has a nestable declarative aggregator called a "Concept" (what a concept!).
Twist is... designed to help you deliver applications fully aligned with your business. It eliminates requirements mismatch as business users directly express intent in their domain language.
I have not used the tool myself. If anyone has, please add some insight.
P.S. I have no affiliation w/ ThoughtWorks.
--- In firstname.lastname@example.org, "woynam" <woyna@...> wrote:
> --- In email@example.com, "pauloldfield1" <PaulOldfield1@> wrote:
> > (responding to George)
> > > I feel like a broken record with my questions.
> > I guess I need to learn to answer you better :-)
> > > pauloldfield1 wrote:
> > > > IMHO Traceability, of itself, has no value. However some of the
> > > > things that we DO value may be achieved readily if we have
> > > > Traceability.
> > >
> > > What are those things?
> > Well, I gave you a list of 15 things that some people value.
> > I guess we could take a lead from Hillel's sig line and say
> > they are all various categories of attempting to use process
> > to cover for us being too stupid to be agile.
> > We value knowing that we are testing to see that our system does
> > what the customer wants (but we're too stupid to write the
> > requirements directly as tests)... etc. etc.
> And this continues to irk the sh*t out of me. Why do we create another intermediate artifact that has to be translated by an error-prone human into a set of tests? What does the requirements document provide that the tests don't? Couldn't we write the tests such that they don't look like tests, but rather requirements?
> With one, and only one formal specification, which also happens to be executable against the actual system, aren't we better off than having to split time between two possibly out-of-sync artifacts?
> If you continue to have a separate requirements document, and your tests don't reflect the entirety of the requirements, what mechanism do you use to verify the uncovered requirements? How is that working for you?
> "A man with one watch knows what time it is; A man with two watches is never quite sure."
> > Paul Oldfield
> > Capgemini