
Re: Test automation in agile

  • Alan Shalloway
    Message 1 of 9, Sep 7, 2007
      --- In scrumdevelopment@yahoogroups.com, "Dave Nicolette"
      <dnicolet@...> wrote:
      >
      > Alan,
      >
      > Understanding the underlying principles is definitely important, and
      > better than just following practices by rote. Excellent point.
      >
      > Let me add another item to the list:
      >
      > Force 6: The enterprise wants to manage the total cost of ownership
      > of its software assets.


      Dave:
      Beautiful.
      :)

      Alan Shalloway
      CEO, Net Objectives
      Come see us at SQE Agile Best Practices, Orlando, December



      >
      > Here are two scenarios to illustrate the point.
      >
      > Scenario 1: No automated tests
      >
      > Situation 1: Development
      >
      > Let's say we're working on a team that doesn't use automated tests.
      > Everyone tests manually. Now, you tell us you've manually tested some
      > chunk of code and it works properly. Then, you go on vacation. Where
      > does that leave the rest of us? For all practical purposes, we have no
      > tests. We have no assurance the system works as is, and we have no
      > safety net for refactoring or adding features. Can we make progress
      > anyway? Sure we can. We're not stupid, after all. But is this the
      > optimal way to work? If we have to spend 60% of our time figuring out
      > what's what, and 40% adding value, is that as useful as spending 10%
      > of our time figuring out what's what, and 90% adding value?
      >
      > Situation 2: Production support
      >
      > A production ticket is opened. The production support team must
      > reproduce the reported behavior in the application. Without an
      > automated test suite as a starting point, they have to hack around to
      > create test data and to try and reproduce the problem. If they can't,
      > then they close the ticket with a comment to the effect that they were
      > unable to reproduce the error. If they can, then they have to hack
      > around with the code to fix the error, and hope against hope that
      > they've thought of everything necessary to test manually. Maybe they
      > did, and maybe they didn't. They can't really tell.
      >
      > Situation 3: Enhancement project
      >
      > A year passes. The system has been in production. Now a new project
      > comes along to make some significant enhancements. What's the new
      > team's starting point with respect to testing? They are at the
      > proverbial Square One. There is some amount of delay before the team
      > can begin to deliver customer-defined value. Changes may take longer
      > to get to "done," since modifications they make will probably
      > introduce bugs, and they don't have a test suite to expose the bugs
      > quickly and painlessly.
      >
      > Scenario 2: Automated tests
      >
      > Situation 1: Development
      >
      > In contrast, let's say our team was using automated testing. You add
      > tests to the test suite that test the same chunk of code as before.
      > Everything's green. You go on vacation. Anyone can run exactly the
      > same tests with exactly the same test data as you did before you went
      > on vacation. We can also add to the tests and the "real" code while
      > you're gone. On your return, you can immediately see what's been going
      > on in your absence, and pick right up productively.
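[The repeatability Dave describes here can be sketched in a few lines of Python. Everything below is hypothetical and invented for illustration, not from the thread: a checked-in test pins the exact inputs and expected outputs, so any teammate can rerun precisely the same check while the author is on vacation.]

```python
# Hypothetical "chunk of code" plus its checked-in automated test.
# Because the test data lives in the test itself, every teammate runs
# exactly the same check -- the suite, not one person's memory, is the
# record of what "works properly" means.

def apply_discount(price: float, rate: float) -> float:
    """Return price reduced by rate (0..1), rounded to cents."""
    if not 0.0 <= rate <= 1.0:
        raise ValueError("rate must be between 0 and 1")
    return round(price * (1.0 - rate), 2)

def test_apply_discount():
    # Fixed inputs, fixed expected outputs: repeatable by anyone.
    assert apply_discount(100.00, 0.15) == 85.00
    assert apply_discount(19.99, 0.0) == 19.99
```

[Running `test_apply_discount()` in the author's absence answers "does it still work?" in seconds, with no hand-over needed.]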
      >
      > Situation 2: Production support
      >
      > A production ticket is opened. The production support team must
      > reproduce the reported behavior in the application. Their first step
      > is to run the automated test suite. If it passes, they know they have
      > a valid starting point to reproduce the error. If it fails, they know
      > someone has modified the production code without going through a TDD
      > process and checking in tests along with application code. In that
      > case, their next step is to remediate that. In any case, when the
      > tests are green they can try to reproduce the problem. When they do,
      > they can add appropriate tests and application code to fix the problem
      > and prove it's fixed. They can then confidently check it all in and
      > close the ticket.
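[A hedged sketch of that ticket workflow, in Python; the bug and all names are invented for illustration. The key move is that the reported behavior is captured as a test that fails first, then the fix turns it green, and both are checked in together.]

```python
# Hypothetical production-support flow: the report ("a quantity field
# with surrounding spaces crashes") is captured as a test before the
# fix is written, so the fix is proved rather than hoped for.

def parse_quantity(text: str) -> int:
    # The fix: the imagined original code called int(text) directly
    # and raised ValueError on input like " 3 ".
    return int(text.strip())

def test_reproduces_reported_behavior():
    # Written first, failing against the old code; green after the fix.
    assert parse_quantity(" 3 ") == 3

def test_existing_behavior_still_holds():
    # The rest of the suite proves the fix broke nothing else.
    assert parse_quantity("42") == 42
```

[The reproducing test then stays in the suite permanently, so the same defect cannot silently return.]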
      >
      > Situation 3: Enhancement project
      >
      > A year passes. The system has been in production. Now a new project
      > comes along to make some significant enhancements. What's the new
      > team's starting point with respect to testing? They have a complete
      > regression test suite that gives them a great deal of useful
      > information about the codebase. They can continue to build on that
      > test suite as they work, adding functionality and refactoring safely
      > and keeping their velocity steady as they deliver customer-defined
      > value.
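[One way to picture that safety net, as a minimal Python sketch with hypothetical names: the enhancement team can restructure code freely because the inherited regression tests pin the observable behavior, not the implementation.]

```python
# Hypothetical refactoring under an inherited regression suite: the
# implementation changes, the pinned behavior does not.

def initials_v1(name: str) -> str:
    # Year-one implementation: manual accumulation loop.
    out = ""
    for word in name.split():
        out += word[0].upper()
    return out

def initials_v2(name: str) -> str:
    # Enhancement-project refactoring: same behavior, tidier code.
    return "".join(word[0].upper() for word in name.split())

def test_initials_regression():
    # The inherited tests run against whichever version is current,
    # exposing any behavior change immediately.
    for impl in (initials_v1, initials_v2):
        assert impl("dave nicolette") == "DN"
        assert impl("alan") == "A"
```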
      >
      > Dave
      >
      > --- In scrumdevelopment@yahoogroups.com, "Alan Shalloway"
      > <alshall@> wrote:
      > >
      > > --- In scrumdevelopment@yahoogroups.com, "kala_krish"
      > > <kalahariharan@> wrote:
      > > >
      > > >
      > > > Major belief is that the productivity is enhanced in an agile
      > > > kind of framework only if the test automation is in place.
      > > >
      > > > Is this a myth or a reality ....
      > > >
      > >
      > > It's neither myth, nor reality. It's an oversimplification. We
      > > tend to look at practices instead of underlying principles. In
      > > Design Patterns Explained we went underneath common design patterns
      > > and looked at the principles involved. This enabled the thinking of
      > > patterns to be useful even when the pattern's typical form didn't
      > > apply. In this case you want to look at the forces involved. This
      > > enables one to apply agile/scrum/lean practices in different
      > > situations.
      > >
      > > Force 1: Iterations may introduce the problem of changing code
      > > frequently which will require repeated testing.
      > > Force 2: Iterations may provide knowledge about how the code must
      > > interact with another development group.
      > > Force 3: Iterations introduce the ability to gain feedback.
      > > Force 4: Feedback may reduce risk of building the wrong thing.
      > > Force 5: Testing code in an automated fashion reduces the cost of
      > > testing.
      > >
      > > Now, if you are in a situation where the cost of testing is high,
      > > automated testing may be necessary (as indicated by Force 5). But if
      > > the cost of testing is low, Force 5 may have little impact. I just
      > > did a webinar on Design Patterns in an Agile Environment where we
      > > didn't do any automated testing, nor did we do formal iterations,
      > > but we did follow the concepts of working code, working software,
      > > customer collaboration and responding to change. It was very
      > > successful. This should be ready to post on our website in another
      > > few days. I'll let you know when it is.
      > >
      > > Alan Shalloway
      > > CEO, Net Objectives
      > >
      >