
[XP] Automated Testing Root Causes

  • Craig Davidson
    Message 1 of 11 , Oct 22, 2008
      Hi Folks,
      I've posted a diagram tracing automated developer testing problems
      encountered in a recent assignment back to four principal causes of
      developer testing strife, specifically:
      - Lack of rhythm
      - Poor production code design
      - Poor test code design
      - Poor team communication

      The project manifested (in my experience) very common automated test
      problems (e.g. tests are always broken, slow-running tests, tests out of
      date, etc.). The post then covers the XP practices taken on board to
      successfully address those root causes on this particular assignment. If you
      are interested, the post is at:

      http://www.agileadvisor.com/2008/10/automated-test-problems-address-root.html

      Cheers,

      Craig


    • Steven Gordon
      Message 2 of 11 , Oct 22, 2008
        On Wed, Oct 22, 2008 at 9:30 AM, Craig Davidson
        <craigmdavidson@...> wrote:
        > I've posted a diagram tracing automated developer testing problems
        > encountered in a recent assignment [...]

        Craig,

        Impressive work, but seems like it would intimidate most audiences.
        Who is the intended audience?

        If the audience are teams that believe they are doing XP, then the
        advice boils down to "do XP right". What an XP team would need is a
        way to determine one or two practices to focus on instead of virtually
        all of XP.

        If the audience are teams that believe they are doing Scrum or some
        other form of Agile, then again the advice boils down to "do XP".
        This is probably good long term advice, but what one or two things
        should they do first to get there (or do you believe that Scrum teams
        should just adopt all the XP practices big bang)?

        Of course, I know of no formulaic way to help a team decide what one
        or two changes to make this week. It would seem to require
        face-to-face facilitation - "Individuals and interactions over
        processes and tools".
      • David Carlton
        Message 3 of 11 , Oct 22, 2008
          On Wed, 22 Oct 2008 11:14:54 -0700, "Steven Gordon" <sgordonphd@...> said:
          > On Wed, Oct 22, 2008 at 9:30 AM, Craig Davidson
          > <craigmdavidson@...> wrote:

          >> I've posted a diagram tracing automated developer testing problems
          >> encountered in a recent assignment back to four principal causes of
          >> developer testing strife, specifically:
          >> - Lack of rhythm
          >> - Poor production code design
          >> - Poor test code design
          >> - Poor team communication

          >> http://www.agileadvisor.com/2008/10/automated-test-problems-address-root.html

          > Impressive work, but seems like it would intimidate most audiences.
          > Who is the intended audience?

          > If the audience are teams that believe they are doing XP, then the
          > advice boils down to "do XP right". What an XP team would need is a
          > way to determine one or two practices to focus on instead of
          > virtually all of XP.

          I think the diagram could be helpful here: maybe some of the blue or
          yellow boxes will have more resonance than others, which will suggest
          which red boxes to focus on, and hence which practices to give a harder
          look at?

          > If the audience are teams that believe they are doing Scrum or some
          > other form of Agile, then again the advice boils down to "do XP".
          > This is probably good long term advice, but what one or two things
          > should they do first to get there (or do you believe that Scrum teams
          > should just adopt all the XP practices big bang)?

          Certainly I've heard people on this list suggest that one should leap
          into all of XP. But either way, I think this diagram could be useful
          in giving people a reason to try the practices. To me, the
          interesting point in the diagram isn't the endpoint ("do XP"), it's
          the journey along the way.

          David Carlton
          carlton@...
        • Craig Davidson
          Message 4 of 11 , Oct 22, 2008
            Hi Steven,

            Thanks for your encouragement. The intended audience is really anyone
            that believes automated testing is the way to cope with changing code
            but is struggling to make automated testing work for them.

            The aim of the diagram is really a rough road map to help that
            audience to move beyond symptoms (The tests are always broken) to
            phrase a problem in a way that may help them to move forward. Facing
            up to bad design in particular is hard on the ego.

            This may be teams that are learning that while Scrum helps them with
            keeping on top of fluid requirements, it doesn't much help with
            keeping on top of fluid code. It could be teams that are realizing
            that effective Simple Design and Refactoring actually require a
            very good understanding of software design principles and ways to
            communicate them. It could even be teams that just need to either
            "loosen up" or "tighten down" to get a nice groove on. In my
            experience teams new to Agile approaches focus on high-level rhythms
            [each week, each day] and miss those low-level rhythms [each minute,
            each hour] that make higher-level rhythms sustainable.
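
            One concrete sketch of the kind of design principle in question
            (purely hypothetical - none of this comes from the actual project;
            every name is invented): passing a collaborator in rather than
            constructing it internally, so production code stays simple and
            tests stay fast.

```python
# Hypothetical sketch of dependency injection, one design principle
# that makes code testable. All names are invented for illustration.

class PriceFeed:
    """Stand-in for a slow external service."""
    def latest(self, symbol):
        raise NotImplementedError  # real version would hit the network

class Portfolio:
    # The collaborator is passed in, not constructed internally,
    # so a test can substitute a cheap fake.
    def __init__(self, feed):
        self.feed = feed
        self.holdings = {}  # symbol -> quantity

    def value(self):
        return sum(qty * self.feed.latest(sym)
                   for sym, qty in self.holdings.items())

class FakeFeed:
    """Test double: returns canned prices instantly."""
    def __init__(self, prices):
        self.prices = prices
    def latest(self, symbol):
        return self.prices[symbol]

portfolio = Portfolio(FakeFeed({"ACME": 10.0}))
portfolio.holdings["ACME"] = 3
print(portfolio.value())  # 30.0
```

            The FakeFeed double is what lets such a test run in milliseconds
            rather than against a real service.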

            Do I believe teams should adopt all the XP practices big bang? I don't know.
            As you say, there is no formula nor generic answer as each team and
            their circumstances are different.

            For software endeavours, I do believe that teams should feel that they
            can cope with both changing requirements and changing code. For me,
            the easiest way of covering these bases is short iterations and
            fast-running automated tests:
            http://www.agileadvisor.com/2007/12/are-you-agileenough.html

            Cheers,

            Craig


          • Tim Ottinger
            Message 5 of 11 , Oct 23, 2008
              > - Lack of rhythm
              > - Poor production code design
              > - Poor test code design
              > - Poor team communication

              The problem I have is that "poor design" is the overall story. Human process, test code, production code.
              I don't know that it helps to know this. I guess my gripe is that you say "root causes" and I think that you landed on four common symptoms instead.

              Why is the team out of rhythm?
              Why is the code designed poorly?
              Why do the tests tend to come out with this poor design?
              Why don't people communicate?

              Call it "common symptoms" and i think you're good on the "truth in advertising" bit.
            • Jeff Grigg
              Message 6 of 11 , Oct 23, 2008
                >> - Lack of rhythm
                >> - Poor production code design
                >> - Poor test code design
                >> - Poor team communication

                --- Tim Ottinger <linux_tim@...> wrote:
                > The problem I have is that "poor design" is the overall story.
                > Human process, test code, production code. I don't know that
                > it helps to know this. I guess my gripe is that you say
                > "root causes" and I think that you landed on four common
                > symptoms instead.
                >
                > Why is the team out of rhythm?
                > Why is the code designed poorly?
                > Why do the tests tend to come out with this poor design?
                > Why don't people communicate?
                >
                > Call it "common symptoms" and i think you're good on the
                > "truth in advertising" bit.

                Yes. I think the chart is a good example of one team working through
                what they see as the causes of the problems they are experiencing. I
                think that other teams doing the same exercise would be likely to get
                somewhat different results.

                And yes, it also occurs to me that this team may benefit from asking
                "Why?" another couple of times. ;->
              • Jeff Grigg
                Message 7 of 11 , Oct 23, 2008
                  --- "Craig Davidson" <craigmdavidson@...> wrote:
                  > Hi Folks,
                  > I've posted a diagram tracing automated developer testing problems
                  > encountered in a recent assignment [...]
                  >
                  http://www.agileadvisor.com/2008/10/automated-test-problems-address-root.html

                  Hmmm...
                  The "No examples upon which to base test cases" box makes me want to
                  stress improvement of communication between the team and the Customer
                  (the business owner / people who know the business well). The
                  approach you're pursuing seems to emphasize practices within the team
                  (except, to some extent, Customer Tests, of course).
                • Craig Davidson
                  Message 8 of 11 , Oct 23, 2008
                    Hi Jeff,

                    I think the problem here is that you are viewing the Customer as
                    external to the team. We consider the Customer role as part of the team.
                    Whole team and all that.

                    We had to use a customer proxy (big company), which was a significant
                    factor in our communication issues.

                    This "analyst as middleman" rather than "analyst as facilitator" is in my
                    experience a very common anti-pattern - especially in large financial
                    companies.

                    Cheers

                    Craig


                  • Craig Davidson
                    Message 9 of 11 , Oct 25, 2008
                      Hi Tim and Jeff,

                      Thanks for your thoughts. Apologies if I have misunderstood you.

                      >> Why is the team out of rhythm?
                      >> Why is the code designed poorly?
                      >> Why do the tests tend to come out with this poor design?
                      >> Why don't people communicate?

                      The team had never experienced a "good" rhythm. The team had extremely
                      limited code design skills. The team didn't really understand the
                      xUnit framework test structures (e.g. fixture -> action -> assertion). The
                      team had never got together to express their needs.

                      In my view none of these questions or answers are as useful as:
                      "what can we do right now to make rhythm, production code, test code
                      and communication a little better by the end of today than it was
                      yesterday?"

                      >> And yes, it also occurs to me that this team may benefit from
                      >> asking "Why?" another couple of times. ;->

                      At what point does further questioning "go meta" and cease to be of
                      pragmatic value to the team?
                      My guess is that for different teams that point is different.
                      But when you address a root cause, you should no longer have the problem.

                      >> Human process, test code, production code. I don't know that it helps to know this.

                      In my view it absolutely does. Once we have recognised that our production
                      code is badly designed and that we have poor rhythm (all very hard things to
                      face up to), we can stop analysing and start acting.

                      We did this (as discussed in the post), and within literally a few
                      weeks automated testing was no longer a problem for our team:
                      tests were being produced with production code and ran every couple of minutes.
                      Code-related defects dropped significantly.
                      The team was starting to communicate with a coherent design language
                      and starting to push back on analysts for better examples and
                      scenarios.
                      All done with minimal budget (just my fees) and no change to delivery
                      timescales.

                      Within 3 months we went from nothing to 1,000-odd developer tests; code
                      developed within those 3 months tended towards 100% coverage, and legacy
                      code was gradually coming into the fold as well. We also had 60-odd
                      customer tests.

                      > Call it "common symptoms" and i think you're good on the "truth in advertising" bit.

                      Most importantly (as we had addressed the root causes) after I left
                      the project the team continued to be successful with automated
                      testing.
                      A further 3 months after my time with the team?
                      They were still storming along - keeping code coverage high and defects down.

                      That's not to say further questioning is not useful for the wider
                      company or for philosophical reasons -
                      but you quickly get to ability and motivation, then recruitment and
                      reward structures.
                      As a delivery-focused project in a large organisation we had neither
                      the power, the resources, nor the time to deal with those issues.
                      But we did have the power, resources and time to solve our specific
                      (pretty common) problem!

                      Does this make sense?

                      Cheers,

                      Craig




                    • Tim Ottinger
                      Message 10 of 11 , Oct 28, 2008
                        > Hi Tim and Jeff,
                        >
                        > Thanks for your thoughts. Apologies if I have misunderstood you.

                        You did, but not in any troublesome way. For instance, the set
                        of questions below are questions you answered, but I intended them
                        as questions that remain in the search for "Root Causes".

                        > >> Why is the team out of rhythm?
                        > >> Why is the code designed poorly?
                        > >> Why do the tests tend to come out with this poor design?
                        > >> Why don't people communicate?

                        I was merely indicating that these things were also (at least
                        potentially) symptoms, and had root causes that were worth
                        understanding.

                        > In my view none of these questions or answers are as useful as:
                        > "what can we do right now to make rhythm, production code, test code
                        > and communication a little better by the end of today than it was
                        > yesterday?"

                        We're agreed on this. But better yet is "what can we do to make next
                        month better than this month", or s/month/year/g, whatever is the
                        longest term that does not require you to guess or plan things that
                        are outside your influence.

                        >
                        > >> And yes, it also occurs to me that this team may benefit from
                        > asking "Why?" another couple of times. ;->
                        >
                        > At what point does further questioning "go meta" and cease to be of
                        > pragmatic value to the team?

                        We're talking about going more concrete, not more meta. The concern
                        was that "poor design" and "poor rhythm" are too abstract.


                        > My guess is that for different teams that point is different.
                        > -- But when you address a root cause - you should no longer have the problem.

                        We agree on this. We only haggle on whether you found a root cause
                        or a common symptom.

                        Ultimately, this comes down to the term "root cause" and whether you've
                        found it or not. We would agree that treating symptoms and ignoring root
                        causes is a sub-optimal choice, and that addressing a root cause will solve
                        a problem, and that it is worthy and honorable work to trace back the
                        surface symptoms to a deeper root cause.

                        So we are closer in opinion than you might think.

                        I think that you can learn an awful lot (and I mean awful in two ways)
                        by watching the coping mechanisms that a team has developed, then peeling
                        them away to get at the root causes. A team may do poor design because
                        they don't have sufficient agreement on what constitutes "good". It could
                        be a training issue. It may be that they've been under the gun to produce
                        quick results rather than good ones. It may be that they don't value
                        clean design. It could be that they've given up hope of being able to
                        do good work because of constant interruption. It may be that the quality
                        of design isn't their biggest problem at all. It may be that two of their
                        strongest personalities have different opinions about design, and are duking
                        it out in the code. Lean or TOC may have a good answer, or maybe study
                        of patterns and principles. Maybe you have the wrong guys in charge.

                        A layer or two deeper might give you the real root cause. I was merely
                        recommending that you look for a cause that is treatable, rather than
                        an abstract symptom.

                        And finally, pointedly, I suggest that your root causes are really just
                        categories of failure types, rather than causes. You are going more meta
                        rather than more specific.

                        Not that there's anything wrong with that.