Cost of Process Changes?

  • Michael Avila
    Message 1 of 8, Jun 6, 2010
      I'm looking for information regarding the cost of changing process in an
      organization. I'm more interested in the costs of relatively small changes
      to the process, but any information would be GREATLY appreciated.

      A little context ... I work at a great organization! (Happy to finally be
      able to say that). Leadership is sharp and willing to have open discussions
      about every aspect of the organization. The company is relatively small at
      about 50 people, where roles range from programmers, visual designers, and
      IT (one guy) to accounting, marketing, new business, and executives. My role
      at the company (aside from programming) is to mentor the programmers and to
      help develop the process by which our projects are developed. I'm relatively
      new to the company, having only been here for about 2 months, but this
      industry is far from new to me. The patterns I'm seeing here I have seen at
      every company I've worked at before.

      - FEEL FREE TO SKIP PARAGRAPH -
      When I first arrived the company was trying to implement Scrum. Their
      implementation was pretty basic. They had Scrums every couple of days (but
      they were going on for hours) and they pretty much just renamed their
      Milestones to Sprints. Early on I got the team to realize that they needed
      to Scrum every day and to timebox them to about 15 minutes so that they
      aren't keeping people who could be adding value to the project from doing
      so. The next thing we decided on was to ditch our digital task
      progress/management/thing in favor of a simple sprint board with the columns
      'not being worked on/being worked on/ready to verify/done' populated with
      index cards (which we choose/define during sprint planning). We have a burn
      up chart for the whole project and a burn down chart for the iteration which
      are generated every morning and posted on the sprint board, which we Scrum
      around. The most recent thing we've been talking about (and working hard at
      doing) is continuously integrating all of our work. My goal has been to
      provide as much visibility into the reality of the project as possible:
      where we actually are, what is actually being worked on, and what is
      actually working. The hope was to empower everyone to contribute to making
      the project successful (management and the development team). Now, this has
      been working great ... up to a point.
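
      (In case it's useful context: the morning chart generation is nothing
      fancy. A minimal sketch of the idea, in Python, using made-up numbers
      rather than our actual data, looks something like this:

      from datetime import date

      # Made-up daily snapshots: (day, total scope in points, points completed).
      # In reality the totals come from counting the index cards on the board.
      snapshots = [
          (date(2010, 6, 1), 40, 0),
          (date(2010, 6, 2), 40, 5),
          (date(2010, 6, 3), 42, 9),    # scope grew by two points
          (date(2010, 6, 4), 42, 15),
      ]

      def burn_up(snaps):
          # Project burn up: completed work plotted against total scope over time.
          return [(day, done, scope) for day, scope, done in snaps]

      def burn_down(snaps):
          # Iteration burn down: work remaining each day.
          return [(day, scope - done) for day, scope, done in snaps]

      for day, remaining in burn_down(snapshots):
          print(day, "remaining:", remaining)

      Posting those two series next to the board every morning is the whole trick.)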

      Essentially, what I'm seeing is a lot of communication, but the generated
      information isn't being fed back into the decision-making process. To be honest,
      this is happening on both sides of the fence (management and development).
      I've seen this issue everywhere I've been, and so I've decided to write
      about it.

      The gist of my write-up is that 'feedback is crucial to project success'. I
      mean feedback in the 'feedback loop' sense and communication in the 'any
      event that generates information relevant to the project' sense. Relevant
      information is going to fall along two dimensions: 1) internal expectations
      and 2) external expectations. It is widely understood that delaying
      communication is detrimental, but not too many people seem concerned about
      delaying feedback. One point I'm touching on is the "cost of delaying
      feedback". Feedback, naturally, is going to result in change. There may be
      more types of change (I'm very interested in hearing your thoughts on
      changes resulting from feedback), but I'm focusing on two: 1) changes in
      what is being produced and 2) changes in how things are being produced. My
      opinion is that changes to what is being produced will fall along Barry
      Boehm's "Cost of Change Curve". But I don't have much information on hand
      about the cost/effects of changes in how things are being done, which is why
      I'm sending this email ...

      I apologize if that was way too much information. I'm just hoping to get
      good relevant information! I really appreciate any help I can get with
      this!


    • D. André Dhondt
      Message 2 of 8, Jun 6, 2010
        On Sun, Jun 6, 2010 at 9:20 PM, Michael Avila <t3hh00d@...> wrote:

        > I'm looking for information regarding the cost of changing process in an
        > organization. I'm more interested in the costs of relatively small
        > changes to the process, but any information would be GREATLY appreciated.
        >

        Disclaimer: I only skimmed your note.... but it seemed related to David
        Anderson's keynote at XP 2010 last week. He said he introduced lean, and
        specifically work-in-progress limits, to monitor the software process
        improvement efforts on his projects. This isn't directly measuring the
        costs of change, but it shows the value of those changes, which ties into the
        feedback loops you were looking for, I think. So that means you'd need to
        do a value-stream map, add WIP limits, and then start experimenting...
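
        Roughly, the arithmetic behind a value-stream map is just this (the
        steps and hours below are invented purely for illustration):

        # Each tuple: (step, hours actually adding value, hours spent waiting).
        steps = [
            ("analysis",     4, 16),
            ("development", 24,  8),
            ("code review",  2, 40),
            ("testing",      8, 24),
            ("deployment",   1, 72),
        ]

        value_add = sum(v for _, v, _ in steps)
        waiting = sum(w for _, _, w in steps)
        lead_time = value_add + waiting

        # Flow efficiency: the fraction of lead time spent actually adding value.
        print(f"lead time: {lead_time}h, flow efficiency: {value_add / lead_time:.0%}")

        # The longest queue is usually the first place worth trying a WIP limit.
        bottleneck = max(steps, key=lambda s: s[2])
        print("longest wait:", bottleneck[0])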

        --
        D. André Dhondt
        mobile: 011 33 671 034 984
        twitter: adhondt http://dhondtsayitsagile.blogspot.com/

        Support low-cost conferences -- http://AgileTour.org/
        If you're in the area, join Agile Philly http://www.AgilePhilly.com
        Mentor/be mentored for free: the Agile Skills Project
        http://www.AgileSkillsProject.org/


      • Tim Ottinger
        Message 3 of 8, Jun 9, 2010
          Cost: your sanity, 5% of the team at least, and management upset.

          Less joking:

          If you actually want to prevent a change, measure it by the day, hour, or week.
          If you want it to succeed, then measure it by the quarter or year.

          After two years, we have measurable improvements (huge decrease in known errors, faster release of small new features, shorter release test cycles, fewer regressions, fewer broken builds per month, decrease in code size, increase in test count and test coverage, etc.).

          If you measured two or three months in, all of the good measures would be down and all the bad ones up.
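
          To make that concrete with invented numbers (not our real data): the
          same metric read month by month can look like a failure while the
          longer view shows the improvement.

          # Invented "broken builds per month" after a change introduced in month 1.
          broken_builds = [9, 14, 12, 8, 6, 5, 4, 3, 3, 2, 2, 1]

          # Read over the first few months, the change looks like a mistake.
          print("first three months:", broken_builds[:3])

          # Read quarter by quarter, the trend is clearly improving.
          quarterly = [round(sum(broken_builds[i:i + 3]) / 3, 1) for i in range(0, 12, 3)]
          print("quarterly averages:", quarterly)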



          Tim Ottinger
          http://agileinaflash.blogspot.com/
          http://agileotter.blogspot.com/
        • George Dinwiddie
          Message 4 of 8, Jun 9, 2010
            Tim, I sense a card lurking in your reply.

            - George


            --
            ----------------------------------------------------------------------
            * George Dinwiddie * http://blog.gdinwiddie.com
            Software Development http://www.idiacomputing.com
            Consultant and Coach http://www.agilemaryland.org
            ----------------------------------------------------------------------
          • Michael Avila
            Message 5 of 8, Jun 9, 2010
              @Dhondt - Thanks! I'm actually currently talking with leadership about
              value-stream mapping. I've never done it before, but I'm very familiar with
              what it is and the ideas behind it. Do you have any resource on hand that
              would be of practical use? I've been looking at the "Learning to See" book
              for the last few months, but haven't gotten to it yet (I spend an inordinate
              amount of time reading).




              --
              -
              http://proas.org/michael


            • Michael Avila
              Message 6 of 8, Jun 9, 2010
                @Ottinger - That makes perfect sense! Thanks for the heads up.

                I realize that what I'm looking for may not actually be practical/feasible,
                but let me rephrase and see if that provokes any more thoughts.

                I'm putting some information together for leadership at my company. My
                assessment, so far, is that everyone takes too much of a *hands off*
                approach. Everyone is working very hard, but we have a lot of problems with
                meeting expectations, and this issue is pervasive. Not meeting our clients'
                expectations will ultimately result in change, and as you can imagine, it's
                quite a bit of effort to reconcile a few weeks' worth of work with our newly
                understood expectations. IMO a lot more effort than necessary, because we are
                also not meeting our own internal expectations. Nearly every change we react
                to involves us changing what we've produced, and not how we produced it. If
                we had better technical practices, more effective interactions with each
                other, and more insightful interactions with our customer (essentially, if we
                drove more of our development with feedback), we could mitigate much of these
                costs.

                I hope leadership realizes that it's important for us to meet our internal
                expectations/standards, which means allocating time/energy to ensure that it
                happens. The development team is interested in things like TDD, CI, and code
                reviews, but they feel they don't have time because they need to keep
                *producing*. I'm referencing material like Conway's Law to reinforce the idea
                that how we produce is a critical factor in what we produce, and Barry
                Boehm's Cost of Change Curve to help them understand why the cost of
                producing rises throughout production.

                Since a big point I'm trying to make is that we need to *change how we're
                working when how we are working isn't working*, I was hoping to find
                information on the costs associated with making that change. I'm suddenly
                realizing that I only vaguely understand what I'm trying to get at. The cost
                of change curve explains how delaying finding and fixing problems causes the
                cost to rise as progress is made. I'm assuming that delaying changing the
                way you work has a similar effect on the cost of developing?
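
                As a rough illustration of the curve itself (the multipliers below are
                only the commonly quoted ballpark figures, not Boehm's exact data, which
                varies a lot by project):

                # Commonly quoted ballpark multipliers for the cost of fixing a problem,
                # depending on how late it is found. Illustrative only.
                cost_multiplier = {
                    "requirements": 1,
                    "design": 5,
                    "coding": 10,
                    "testing": 20,
                    "production": 100,
                }

                base_fix_hours = 2  # hypothetical cost of fixing it where it was introduced

                for phase, factor in cost_multiplier.items():
                    print(f"found during {phase:<12}: ~{base_fix_hours * factor} hours")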

                I, again, really appreciate you guys' patience with me. What I'm thinking
                may already be well established. I'm quite familiar with the literature
                surrounding what we do, but I have no material on hand that discusses
                what I'm thinking about.

                Any thoughts?

                Thanks again, I realize all your time is valuable and I appreciate you
                making room for helping me out.





                --
                -
                http://proas.org/michael


              • Jeff Anderson
                Message 7 of 8, Jun 9, 2010
                  Actually, I think you can incrementally improve by lowering WIP and
                  measuring the impact of change on lead and cycle time. I've tried it,
                  and so far I'm impressed with the results, so +1 on the kanban.

                  I disagree with Tim: measurements are another source of data, just
                  don't confuse them with the truth. They're a tool, but you can very
                  easily invent your own correlations, so you need to be careful with
                  them. That being said, lack of measurability is one of the reasons
                  software development resembles magic and religion more than technology:
                  agilists and waterfallers argue without any real evidence. This is
                  slowly changing.

                  I would track new processes with a couple of key measures, but be ready
                  for some short-term dips (to Tim's point).

                  I like the ones suggested by the kanban approach:
                  - lead time
                  - cycle time
                  - failure intake vs. value intake

                  This ties all process changes to improvements in the system.
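
                  As a rough sketch (the work items and dates are made up), those
                  measures fall out of nothing more than a few timestamps per item:

                  from datetime import date

                  # Made-up work items: when requested, started, and finished, and whether
                  # each was new value or rework caused by an earlier failure.
                  items = [
                      {"requested": date(2010, 6, 1), "started": date(2010, 6, 3),
                       "finished": date(2010, 6, 7), "kind": "value"},
                      {"requested": date(2010, 6, 2), "started": date(2010, 6, 8),
                       "finished": date(2010, 6, 9), "kind": "failure"},
                  ]

                  for item in items:
                      lead = (item["finished"] - item["requested"]).days   # customer's view
                      cycle = (item["finished"] - item["started"]).days    # time in progress
                      print(item["kind"], "lead:", lead, "days, cycle:", cycle, "days")

                  # Failure vs. value intake: how much of the demand is rework, not new value.
                  failure_share = sum(i["kind"] == "failure" for i in items) / len(items)
                  print(f"failure intake: {failure_share:.0%}")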



                  --
                  Sent from my mobile device

                  Jeff Anderson

                  http://agileconsulting.blogspot.com/
                • Tim Ottinger
                  Message 8 of 8, Jun 10, 2010
                    I don't know how to give you this information. Sorry.
                    But I do know that you can start changing without it.
                    There may have to be some talks about what it means
                    and how to back off a bit for a short while. You might
                    have to bring in a consultant or coach to give it a
                    try.

                    Most decisions are not really made on cost. Even
                    no-brainer cost/benefit decisions are often overturned
                    on gut feel and emotions. See "Management Rewired"
                    for some discussion of brain science and how things
                    work. It's not a rational case of cost/benefit, but
                    whether people feel good about changing.

                    So you're not measuring, you're selling.

                    Nothing will sell better than a little success.
                    The trick is getting a little success or two
                    started.

                    Strike a match, start a fire.

                    Tim Ottinger
                    http://agileinaflash.blogspot.com/
                    http://agileotter.blogspot.com/