Re: [scrumdevelopment] Re: Use Cases?
- Hello, PSM. On Wednesday, November 3, 2010, at 6:49:43 PM, you wrote:
> I'm sorry, but automated tests are not understandable by lay people. Automated
> tests are good at a lot of things, but not at communicating to lay people. They
> are excellent ways of communicating to techies, and they are excellent ways of
> *verifying* expected behavior and *analyzing* actual behavior, and they are even
> sometimes excellent at communicating expected/actual behavior to techies. The
> point of a Use Case is NOT to verify system behavior or even actual behavior.
What is its point, then?
Are you familiar with FitNesse? With Cucumber? How about a DSL for
acceptance testing? For the C3 project we built a testing harness
that displayed two checks (in a column format), showing the input
values, output values, and intermediate values. When they differed,
the differing ones were highlighted. Payroll clerks found those
quite easy to understand.
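The C3 harness described above isn't public, but the idea is simple enough to sketch. The following is a minimal illustration (the row format, values, and function name are invented, not taken from C3): each row pairs an expected value with an actual value, and rows that differ are flagged so a non-programmer can spot them at a glance.

```python
# Sketch of a two-column check report in the spirit of the C3 harness
# described above. Rows that differ between expected and actual are
# marked with "**" so they stand out. All names/values are illustrative.

def compare_columns(rows):
    """rows: list of (label, expected, actual) tuples.
    Returns report lines; differing rows are marked with '**'."""
    report = []
    for label, expected, actual in rows:
        marker = "  " if expected == actual else "**"
        report.append(f"{marker} {label:<15} {expected!s:>10} {actual!s:>10}")
    return report

# Example: a payroll check with one mismatched intermediate value.
checks = [
    ("gross pay",   2400.00, 2400.00),
    ("federal tax",  480.00,  472.50),   # differs -> highlighted
    ("net pay",     1920.00, 1927.50),   # differs -> highlighted
]
for line in compare_columns(checks):
    print(line)
```

The point is that the report reads like the paper documents the clerks already knew, which is what made it understandable to lay people.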
> What I said was that a team or lay person *might feel* more comfortable with use
> cases in a life/mission critical situation.
While I'm all for people feeling comfortable, I have not met many
actual humans who were all that good at use cases. And I do not see
that as professional software developers, we should feel comfortable
with making people comfortable by writing a document which is not
directly verifiable as actually being true. That comfort, I'd
suggest, is not justified.
> Apparently a notion backed by Ken Schwaber. It doesn't mean
> it's the most efficient approach, it doesn't mean it's always
> the best approach, and it doesn't mean that you should focus less
> on testing. I never brought up testing at all, someone else did.
Yes. That's because we know ways to describe things that communicate
with actual humans, and with computers, in the same format, so that
the specification can be understood by people and verified to work.
We call this technique "Acceptance Test Driven Development". When I
took my CSM class with the above-mentioned Ken Schwaber, he asked me
to present the technique to the class. His views may not be what you
think they are. Since most of the people responding with the testing
angle could program him into the ground, I'm not sure just what
standing his position would have even if it really was that use
cases are what we need.
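The "same format for people and computers" claim above can be made concrete with a small sketch. This is not FitNesse or Cucumber, just an illustration of the idea: the specification is a plain table a domain expert can read, and that same table is executed against the code under test (the table layout and the pay_for function here are invented for the example).

```python
# Sketch of Acceptance Test Driven Development: a human-readable
# table IS the executable specification. Format and rules invented
# for illustration; overtime past 40 hours is paid at time and a half.

SPEC = """\
hours | rate  | expected_pay
40    | 10.00 | 400.00
45    | 10.00 | 475.00
"""

def pay_for(hours, rate):
    """Code under test: straight time plus 1.5x overtime past 40 hours."""
    overtime = max(0.0, hours - 40)
    return (hours - overtime) * rate + overtime * rate * 1.5

def run_spec(spec, fn):
    """Parse the table (skipping the header) and check each row.
    Returns a pass/fail result per row."""
    rows = [line.split("|") for line in spec.strip().splitlines()[1:]]
    results = []
    for hours, rate, expected in rows:
        actual = fn(float(hours), float(rate))
        results.append(abs(actual - float(expected)) < 0.005)
    return results

print(run_spec(SPEC, pay_for))
```

A payroll clerk can read and amend the table without reading the harness; the computer verifies that the document is actually true, which is exactly what a use case document cannot do on its own.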
Naturally, a team can write use cases if they want to, the fact that
they are quite often actually an impediment notwithstanding.
> My guess is you fellas remain unconvinced as well, so I'm afraid
> that's all I can do here.
Maybe that's all. I can think of a few more things you might do.
They wouldn't prove your point, but in my opinion that's because
your point isn't the best possible place to stand.
Everything elegant is simple, not everything simple is elegant and
nothing complex is ever elegant. --John Streiff
- "Oh, I'm sorry, when I said I wanted it to 'take off like a helicopter,' I meant 'take off like a helicopter from a warzone, where we can carry troops and equipment.'"
And there's the rub. The devil is in the details, and sometimes those details mean we're off the mark (Harrier vs. Osprey) or things take much longer than anticipated (Osprey).
Someone else made a point here recently about how it's extremely important to a) get to the details, by having the CEO appoint a point person and b) make sure you loop in the CEO iteratively and intelligently (respecting his time) so he can head off any incorrect interpretation of his vision.
I'm not against vision, but a vision is not a "software requirement". Further, a vision is not easily testable, nor is a business requirement; both often lack the key ingredient of specified system behavior that you can test against.