
Complex business knowledge and specs

  • Otto Behrens
    Message 1 of 6, Mar 1, 2011
      Hello,

      We find it difficult to figure out how parts of the system should be
      working because of the complexity of the business domain. We have
      reasonable test coverage, and the tests do tell us (often) how a
      certain part of the system works. But we end up spending quite some
      time in the debugger on a restored production database to figure out
      why a certain value is what it is, or why a certain rule was applied
      or not. (For example, why is the Capital Gain on a particular
      investment $1234.56). The system often behaves correctly (sometimes
      not) and we help the business people out by explaining how it works.

      Part of the answer is probably that we must just have more tests. And
      well refactored tests so that it's easy to figure out.

      Could it be that we need more documentation (business specifications)
      that contains formulas and rules on how the business domain works?
      We've been reasonably successful at extracting requirements when we
      need it, which reduces lead times. This means that when we deliver 2x
      a month, we can adapt to new projects without waiting for someone to
      write a spec.

      The business traditionally keeps specs as a reference on how the system
      should implement business requirements (in other teams in the
      company). When we reduce the system specs, there is still a need to
      understand things. I think reducing specs is a good thing because specs
      often 1) take too long to write, 2) are out of date, and 3) don't
      reflect what was implemented anyway. How do you retain information /
      knowledge even years after business people and IT people have moved
      on?

      Another (somewhat) related question is that we often don't get good
      detailed analysis done on solving the problem. Writing specs has the
      effect that someone sits down and *thoroughly* thinks through the
      problem and the potential alternative solutions. There are numerous
      reasons why we don't want to do this. But we end up losing that
      thoroughness if we don't. Any ideas where we should focus here?

      Thanks for your help.
      Otto
    • George Dinwiddie
      Message 2 of 6, Mar 1, 2011
        Hi, Otto,

        On 3/1/11 7:52 AM, Otto Behrens wrote:
        > Hello,
        >
        > We find it difficult to figure out how parts of the system should be
        > working because of the complexity of the business domain. We have
        > reasonable test coverage, and the tests do tell us (often) how a
        > certain part of the system works. But we end up spending quite some
        > time in the debugger on a restored production database to figure out
        > why a certain value is what it is, or why a certain rule was applied
        > or not. (For example, why is the Capital Gain on a particular
        > investment $1234.56). The system often behaves correctly (sometimes
        > not) and we help the business people out by explaining how it works.

        Is the problem you describe one of explaining how the business rules
        interact in a particular situation to give a particular answer?

        > Part of the answer is probably that we must just have more tests. And
        > well refactored tests so that it's easy to figure out.
        >
        > Could it be that we need more documentation (business specifications)
        > that contains formulas and rules on how the business domain works?
        > We've been reasonably successful at extracting requirements when we
        > need it, which reduces lead times. This means that when we deliver 2x
        > a month, we can adapt to new projects without waiting for someone to
        > write a spec.

        The downside of more documentation is that someone needs to keep it in
        sync with the code, and there's always the worry that it isn't. Can you
        write this documentation as tests against the business rules?

        > The business traditionally keeps specs as a reference on how the system
        > should implement business requirements (in other teams in the
        > company). When we reduce the system specs, there is still a need to
        > understand things. I think reducing specs is a good thing because specs
        > often 1) take too long to write, 2) are out of date, and 3) don't
        > reflect what was implemented anyway. How do you retain information /
        > knowledge even years after business people and IT people have moved
        > on?

        There are tools to allow you to document examples of the specs in
        language readable by the business people. In particular, I suggest
        taking a look at Cucumber, Robot Framework, and FitNesse to see if any
        of these fit your needs.

        Note that these tools are not limited to testing the system as a whole,
        but can also test smaller parts of the system to check the operation of
        specific business rules. My rule of thumb is to execute acceptance
        tests as low in the system as possible, but to also have some tests that
        run through the full stack.
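        As a sketch of what "tests against the business rules" can look like
        below the full stack, here is a minimal Python example built around the
        capital-gain question from the original message. The formula and names
        are invented for illustration; they are not Otto's actual rule.

```python
from decimal import Decimal

# Hypothetical rule, invented for illustration:
# capital gain = sale price, net of fees, minus cost basis.
def capital_gain(sale_price, cost_basis, fees):
    return sale_price - fees - cost_basis

# Business-readable examples, checked directly against the rule
# rather than through the whole system:
assert capital_gain(Decimal("1500.00"), Decimal("250.44"),
                    Decimal("15.00")) == Decimal("1234.56")
assert capital_gain(Decimal("100.00"), Decimal("150.00"),
                    Decimal("0.00")) == Decimal("-50.00")  # a loss
```

        Each assert is one worked example of the rule; a table of such examples
        is the kind of artifact the tools above let business people read.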

        > Another (somewhat) related question is that we often don't get good
        > detailed analysis done on solving the problem. Writing specs has the
        > effect that someone sits down and *thoroughly* thinks through the
        > problem and the potential alternative solutions. There are numerous
        > reasons why we don't want to do this. But we end up losing that
        > thoroughness if we don't. Any ideas where we should focus here?

        You can get the same thoroughness by discussing the examples between the
        Business, the Programmers, and the Testers. This gives three very
        different points of view. I call this the Three Amigos (though there
        may be more than three points of view required) and Ken Pugh calls it
        the Triad. The point is to distill the business requirements down to
        their essence, and represent that essence in potentially automatable examples.

        Hope that helps.

        - George

        --
        ----------------------------------------------------------------------
        * George Dinwiddie * http://blog.gdinwiddie.com
        Software Development http://www.idiacomputing.com
        Consultant and Coach http://www.agilemaryland.org
        ----------------------------------------------------------------------
      • Otto Behrens
        Message 3 of 6, Mar 1, 2011
          Hi George,

          > Is the problem you describe one of explaining how the business rules
          > interact in a particular situation to give a particular answer?

          Yes. And just to explain the business rule itself.

          > The downside of more documentation is that someone needs to keep it in
          > sync with the code, and there's always the worry that it isn't. Can you
          > write this documentation as tests against the business rules?

          Yes, agree. One problem is that business people can't derive the
          business rule from the code. There are just an enormous number of
          scenarios in tests to cover it properly. And those scenarios don't
          express business rules as well as pictures and formulas do.

          > There are tools to allow you to document examples of the specs in
          > language readable by the business people. In particular, I suggest
          > taking a look at Cucumber, Robot Framework, and FitNesse to see if any
          > of these fit your needs.

          Thanks, I'll have a look.

          > You can get the same thoroughness by discussing the examples between the
          > Business, the Programmers, and the Testers. This gives three very
          > different points of view. I call this the Three Amigos (though there
          > may be more than three points of view required) and Ken Pugh calls it
          > the Triad. The point is to distill the business requirements down to
          > their essence, and represent that essence in potentially automatable examples.

          Do you have this discussion during planning? We find developers lose
          patience when we do this because the planning takes a long time. Do
          you provide a complete list of examples on the table for discussion, or
          do the examples emerge? If the solutions are not thought through even
          with a Three Amigos / Triad kind of discussion, does it simply imply
          that people do not take enough care in such a discussion?

          When you invite the Triad to a discussion, how much preparation do you
          expect from each of the parties? How is the preparation done? (Is
          there stuff written down?)

          Thanks
        • George Dinwiddie
          Message 4 of 6, Mar 1, 2011
            Otto,

            On 3/1/11 10:52 AM, Otto Behrens wrote:
            >> You can get the same thoroughness by discussing the examples between the
            >> Business, the Programmers, and the Testers. This gives three very
            >> different points of view. I call this the Three Amigos (though there
            >> may be more than three points of view required) and Ken Pugh calls it
            >> the Triad. The point is to distill the business requirements down to
            >> their essence, and represent that essence in potentially automatable examples.
            >
            > Do you have this discussion during planning? We find developers lose
            > patience when we do this because the planning takes a long time. Do
            > you provide a complete list of examples on the table for discussion, or
            > do the examples emerge? If the solutions are not thought through even
            > with a Three Amigos / Triad kind of discussion, does it simply imply
            > that people do not take enough care in such a discussion?

            I have this discussion during backlog grooming. The stories need to be
            well understood just in time for the iteration planning. If they are
            not well understood, then they're hard to estimate and the planning
            meetings take a long time.

            The business will likely have some examples in mind to start. The
            programmer and tester will likely think of other examples (especially
            edge cases and error conditions), which will emerge in the discussion.
            As the list of examples grows, you may want to split the story into
            clumps of these examples. If new examples come to mind later, you might
            want to create a new story to cover them in the future rather than
            expanding the scope of a story in flight.

            Rather than thinking "not enough care" I prefer "not the right focus."
            Often people get in a hurry to get done, rather than to cover all the
            angles. This is especially true with a large meeting, such as an
            iteration planning meeting. A Three Amigos meeting doesn't need all the
            programmers and testers--one of each will generally suffice. And it
            doesn't have to be the same one of each for every story. Not all the
            upcoming stories have to be analyzed at the same time. I find that
            meetings longer than 2 hours tend to lose their productive edge.

            > When you invite the Triad to a discussion, how much preparation do you
            > expect from each of the parties? How is the preparation done? (Is
            > there stuff written down?)

            How much preparation do they need? When you're working with the actual
            business person, they've likely got most of the needed information in
            their head. When you're working with a business analyst as a proxy for
            the business person, they need to do a lot of work to learn the details
            of what the business person wants. They should make whatever notes they
            need to help them remember all the important stuff. And, in the
            conversation, it's likely that it will be discovered there are questions
            that they can't answer. If so, the Three Amigos can cover what they
            know and come back together after the answers are known.

            - George

            --
            ----------------------------------------------------------------------
            * George Dinwiddie * http://blog.gdinwiddie.com
            Software Development http://www.idiacomputing.com
            Consultant and Coach http://www.agilemaryland.org
            ----------------------------------------------------------------------
          • Steven Gordon
            Message 5 of 6, Mar 1, 2011
              Otto,

              I have never tried this approach, but maybe it would help.

              There have been times when, as a customer, I have asked the bank or shop or
              whatever how they calculated a particular surcharge or tax. These days, the
              answer is almost always "the computer said so, so it must be right." Not very
              satisfying to anybody who knows the computer just does what somebody told it
              to do.

              Suppose we added the following user story: As a user I want to be able to
              get an explanation of each charge. That would mean the system being able to
              produce upon demand a human-readable audit trail that indicates the business
              rule that was used at each step in a calculation.

              In other words, think of it as a service to the user rather than as a
              specification or a test. Having such a service would not only serve as
              instant documentation of the business knowledge being applied in any
              particular scenario and a useful debugging aid, but it would also add
              business value to your system.
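              One way such an audit trail might be structured is sketched below
              in Python. The calculation and rule names are invented for
              illustration; the point is only that each step records the rule
              it applied.

```python
from decimal import Decimal

class ExplainedCalculation:
    """Runs a calculation step by step, recording the business
    rule applied at each step as a human-readable trail."""
    def __init__(self):
        self.trail = []

    def apply(self, rule, fn, *args):
        result = fn(*args)
        self.trail.append(f"{rule}: {', '.join(map(str, args))} -> {result}")
        return result

calc = ExplainedCalculation()
proceeds = calc.apply("net proceeds = sale price - fees",
                      lambda sale, fees: sale - fees,
                      Decimal("1500.00"), Decimal("15.00"))
gain = calc.apply("capital gain = net proceeds - cost basis",
                  lambda p, cost: p - cost,
                  proceeds, Decimal("250.44"))

# The trail is the on-demand explanation of how the figure was reached:
for step in calc.trail:
    print(step)
```

              The same trail that answers a customer's question also answers
              the developer's "why is this value 1234.56?" question, without a
              debugger.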

              SteveG

            • Rick Mugridge
              Message 6 of 6, Mar 2, 2011
                Hi Otto,

                Gojko Adzic has an excellent book coming out, "Specification by
                Example" (Manning), which covers this very well. It's based on his
                study of the practices of many teams, distilling the important
                "patterns" for success.

                I've used these techniques myself with clients, and they've worked very
                well in complex business domains. For me, the critical points are:

                * Writing (or verifying) the specs needs input from people with
                various skills. This process usually needs to be part of the continuous
                flow of a project. Early in a project, the whole, wider team may be
                involved in workshopping the specs. As a system matures, there need only
                be a few people involved. It depends on how important it is for everyone
                to understand, and have input into, the business goals and potential
                designs.

                * The specs need to be written at a business level, in terms of
                business rules and processes. They serve an important role as examples
                for the wider team, aiding discussions and communication. See "Doubling
                the Value of Automated Tests: FitLibrary Storytests" (2006), with
                links for video and slides at
                http://www.rimuresearch.com/PapersAndTalks.html.

                * You can verify the consistency of the executable specifications and
                the system, so you can tell they're up to date and thus can be trusted.

                * Executable specifications have to evolve along with the thinking
                about the domain and what's needed. Hence some will need to be
                refactored from time to time, as with code.

                * If instead they're written as tests in terms of the concrete
                implementation of the system they will be verbose, hard to understand,
                have little value in answering business-level questions, and will be
                difficult to maintain. Most people who apply test automation at this
                level fall into this trap. Many tools don't support making a distinction
                between the DSL of the business layer and the DSL of the implementation
                layers (e.g., database, XML, UI, messages), and don't support a clean
                mapping between them. For the first couple of articles on such DSLs, see
                http://www.rimuresearch.com/KnowHowTo.html.
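                The separation of the two DSLs might be sketched like this in
                Python (the phrases, fixture, and rule are all invented for
                illustration): a thin mapping layer translates business-level
                wording into implementation calls, so the spec itself never
                mentions databases or UI.

```python
from decimal import Decimal

# Implementation layer: a system call the business spec never mentions.
def system_capital_gain(cost, sale, fees):
    return Decimal(sale) - Decimal(fees) - Decimal(cost)

# Thin mapping layer: translates business-level phrases into
# implementation calls, keeping the spec in domain language.
class CapitalGainFixture:
    def given_holding_bought_at(self, cost):
        self.cost = cost

    def when_sold_for(self, sale, fees="0"):
        self.gain = system_capital_gain(self.cost, sale, fees)

    def then_capital_gain_is(self, expected):
        assert self.gain == Decimal(expected), (self.gain, expected)

# The executable spec itself, readable at the business level:
spec = CapitalGainFixture()
spec.given_holding_bought_at("250.44")
spec.when_sold_for("1500.00", fees="15.00")
spec.then_capital_gain_is("1234.56")
```

                When the implementation changes (say, the database schema),
                only the fixture layer moves; the business-level spec stays
                stable and readable.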

                One circumstance where executable specifications don't add value is
                where the domain experts are also the software experts. Then it's easier
                to write specs in code, getting the value of refactoring support and,
                potentially, the benefit of type-checking. This appears to be the case
                with Industrial Logic; Joshua has talked about how they don't need
                executable specifications for their eLearning system.

                Cheers, Rick

                BTW, I'm soon to release an open-source tool for refactoring executable
                specifications that are based on FitLibrary.
