
Re: [agile-usability] What's your definition of done?

  • Justin Tauber
    Message 1 of 20, Aug 17, 8:59 PM
      Theoretically, there shouldn't be any such thing as technical debt either, though, right? But then again, the repayment of technical debts might be a good way of distinguishing "done" for a release, from "done" for an iteration.

      Interestingly, the timing for repayment of UX debt has to be different. Refactoring the design in a prerelease iteration, say, will only get in the way of refactoring code, right? So you can really only repay UX debt in the next release.

      Justin Tauber
      @brtrx

      On 18/08/2011, at 2:54 AM, Anders Ramsay <andersr@...> wrote:

       

      On Wed, Aug 17, 2011 at 11:59 AM, Justin Tauber <anotherjustin@...> wrote:


      Which makes me wonder: is anyone working with a concept of "user debt" as a counterpart to "technical debt" ? 
       
      I like the concept of "user debt," as in we're indebted to the users :) 

      UX debt is often a reference to a feature which technically functions as requested but where the quality of the experience is low, e.g., some elements are not properly aligned, or the feel of the experience is clunky or whatever.

      Defining something that has UX debt as Done means you'll be delivering a half-baked UX, which goes to the core of the UX field's complaint about Agile.

      IMO, there should be no such thing as UX debt. Either the quality of the experience is appropriate for the product, context, and domain, or it is not. (E.g., for an entertainment product, experience quality should likely be very high; for an enterprise product, it may be less critical.)

      If a feature is seen as Done in every other way except for experience quality, then a decision can be made to call it Done and create a new card that addresses the particular experience quality issues and add it to the backlog.

    • Jon Innes
      Message 2 of 20, Aug 18, 9:22 AM

        Both technical debt and UX debt are real. Most teams can judge the amount of technical debt fairly well because they have engineers on the team who know how to assess it. That's not true for UX debt, because UX can only be judged directly by prospective end-users. Since few teams actually capture user feedback DURING development in a systematic and objective way and use it to guide their day-to-day work, most are unconsciously incompetent when it comes to UX.

        As I said in my recent talk at Agile2011, "done" can only be defined by the end-users. Teams can try to judge it, but "good enough" to a bunch of engineers or the product owner may not be good enough for the market. This is why the vast majority of IT projects (and many startups) fail: they jump off the waterfall when it comes to user adoption. That's the core insight from UX that Agile doesn't typically consider. Marty Cagan's book "Inspired" talks about this in some detail.

        I'd disagree with Justin's point about the timing of repaying UX debt, though. I've been on projects where we had considerable technical debt and UX debt. It was only when the development team was given the opportunity to redo existing functionality for technical reasons that we got a chance to address the UX debt.

        The key thing about UX debt is that it is relative to the maturity of the market. Geoffrey Moore's book "Dealing with Darwin" and Don Norman's "Invisible Computer" cover this. The reason some products can survive with significant UX debt is that the market they sell to is immature, or has such poor user feedback loops that decisions aren't based on UX factors. As someone who has led UX teams at major enterprise software firms, I can tell you that's the real reason most enterprise software firms get away with bad UX. However, the position of a product with significant UX debt is tenuous and largely sustained by the cost of switching products and other market forces. Some examples of how fast this can change in other markets: the tablet PC (MSFT was first to market) and the smartphone (Nokia).

        Jon Innes
        UX Innovation LLC
        @innes_jon


      • Adrian Howard
        Message 3 of 20, Aug 18, 9:29 AM
          Hi Gene,

          I think I'm probably making my point badly. Let me try and clarify :-)

          On 17 Aug 2011, at 14:33, Gene Arch wrote:

          > I must respectfully disagree with you, Adrian. The definition you
          > espouse seems to take the work outside of the iteration, which to me
          > seems to make it harder to really nail down what done is. Your
          > definition given below seems to lend itself to stories hanging out there without a definite resolution, which makes it difficult to show
          > progress.

          Some questions:

          What progress are we trying to measure?

          What's the utility of having a metric that shows the team progressing when the rest of the business disagrees?

          If we have stories hanging out there without a definite resolution, is that a sign that:
          a) we should change our definition of "done"?
          b) we should figure out what's blocking those stories and fix the problem?

          > For instance, in your example of the MDs agreeing on a
          > feature being ok, what did you do in the event they disagreed?

          Simple - the story wasn't done :-)

          > Did a story sit out there until they could reconcile their differences?

          Yup.

          > Wouldn't it be better to agree upon what is acceptable relative to what you understand were the requirements from the MDs,

          We had built a feature that management decided wasn't correct. How would counting this as done help the team or the business?

          > and if they can't agree on whether or not the feature is acceptable, then that's an issue outside the development iteration. Something that, as you stated, is a separate problem.

          It was a separate problem - but one that the team needed to address. Having the measured productivity drop gives us a powerful tool to help do that.

          Let me tell you about the 2xMD problem in a bit more detail. This is a slight simplification, but I think it gets the point across.

          The working context was a web app that was continually deployed, so we were measuring stories-done-throughput not stories-done-per-iteration.
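
          (For concreteness, here's a minimal Python sketch of the difference between the two measures, using made-up completion dates rather than anything from the real project:)

          from datetime import date

          # Hypothetical completion dates for finished stories (invented data).
          done_dates = [
              date(2011, 8, 1), date(2011, 8, 2), date(2011, 8, 4),
              date(2011, 8, 9), date(2011, 8, 10), date(2011, 8, 16),
          ]

          # Per-iteration view: count stories that landed inside one fixed two-week window.
          iteration_start, iteration_end = date(2011, 8, 1), date(2011, 8, 14)
          per_iteration = sum(iteration_start <= d <= iteration_end for d in done_dates)

          # Throughput view: stories done per week over the whole measured period,
          # independent of iteration boundaries.
          span_weeks = (max(done_dates) - min(done_dates)).days / 7
          throughput = len(done_dates) / span_weeks

          print(per_iteration)         # 5 stories in that iteration
          print(round(throughput, 2))  # ~2.8 stories per week overall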

          The two MDs (let's call them Bob and Alice) were very busy and often offsite. Bob was more available, but Alice was senior and knew more about the domain and customers.

          Most of the time progress was fine. Stories were done, passed dev/customer tests, and were deployed. Everybody happy.

          However, there were a couple of occasions where stuff was missed or misunderstood by the dev team. One time it caused a pretty serious issue with a client demo and nearly lost an important sale. We started implementing a bunch of changes that should help stop the problem happening again - but Alice asked to approve all stories before they were released. Alice's approval became part of the team's definition of "done".

          Alice was often unavailable. Throughput dropped dramatically. That was quickly noticed. We discussed the issue and Bob agreed to step in when Alice wasn't available. Throughput rose. Having "Alice or Bob sign off" as part of "done" helped us solve a problem.

          Well - almost solved it... Alice sometimes disagreed with Bob. Bob sometimes disagreed with Alice. So "done" started to include "Bob and Alice have both signed off". Unsurprisingly, throughput slowed again. It wasn't as bad as before, since Bob helped us spot some problems earlier and fix 'em. He also helped spread the domain knowledge around the rest of the team more - so we started spotting some of the problems ourselves. But Alice was still a bottleneck - and we could show it *because* we had a very restrictive and tight definition of done.
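
          (If it helps to spell out how the gate tightened, here's a toy Python sketch - the flags and field names are invented for illustration, not from any tool we actually used:)

          # Toy illustration of the sign-off gate (invented fields, not a real tool).
          def done_either(story):
              # Earlier definition: dev/customer tests pass and either MD has signed off.
              return story["tests_pass"] and (story["alice_ok"] or story["bob_ok"])

          def done_both(story):
              # Tightened definition: both MDs must sign off, so Alice's availability
              # directly caps throughput.
              return story["tests_pass"] and story["alice_ok"] and story["bob_ok"]

          story = {"tests_pass": True, "alice_ok": False, "bob_ok": True}
          print(done_either(story))  # True  - counted as done under "Alice or Bob"
          print(done_both(story))    # False - blocked waiting on Alice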

          That meant we could have a conversation with Bob and Alice about the relative value of the off-site work they were doing, compared to the value of them being available as domain experts and helping verify that stories were doing what they wanted them to do.

          That conversation, over time, resulted in various tweaks to the product development process that removed the bottleneck (a combination of arranging more time with Alice, more education of the team as a whole about the domain, a tweaked release process that pushed features out to a smaller audience first, some FIT-ish tests to help make some domain knowledge explicit, etc.).
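
          (By "FIT-ish" I just mean table-driven acceptance checks along these lines - a sketch with an invented pricing rule, not the actual tests we wrote:)

          # Sketch of a table-driven ("FIT-ish") acceptance check. The discount rule
          # and the numbers are invented purely to show the shape of such a test.
          def quoted_price(list_price, customer_tier):
              discounts = {"standard": 0.0, "preferred": 0.10, "strategic": 0.20}
              return round(list_price * (1 - discounts[customer_tier]), 2)

          # Each row is a fact a domain expert can read and sign off on.
          examples = [
              # list_price, customer_tier, expected_price
              (100.00, "standard",  100.00),
              (100.00, "preferred",  90.00),
              (250.00, "strategic", 200.00),
          ]

          for list_price, tier, expected in examples:
              assert quoted_price(list_price, tier) == expected, (list_price, tier)
          print("all domain examples pass")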

          Does that make it clearer why having the MD sign-off form part of our definition of done was useful? It helped us figure out problems that needed fixing.

          > I have difficulty with anything living "inside the team's heads,"
          > because when it does, it is often subject to misinterpretation and
          > hidden assumptions. If the team documents their definition of done in a tangible manner and keeps it close, then in the event of disagreements during an iteration, they can bring it out and point to it as a reference point for the discussion.
          >
          > I'm not saying the definition is the end-all be-all (it necessarily should be reviewed on a regular basis and updated based on new information), but to leave it to the team's individual assumptions seems dangerous to me.

          I'm not necessarily against there being documentation of the definition of done - but the value is in the agreement. Not the document. Especially since in many contexts what counts as "done" for different stories at different times can be very different.

          > I may be misunderstanding you, and if so, I apologize. I am also a
          > relatively new practitioner of agile, so I could be off my rocker as
          > well, but this is how I understand it.


          You're definitely not off your rocker :)

          Cheers,

          Adrian
          --
          http://quietstars.com adrianh@... twitter.com/adrianh
          t. +44 (0)7752 419080 skype adrianjohnhoward del.icio.us/adrianh
        • Gene Arch
          Message 4 of 20, Aug 18, 10:11 AM
            Thanks for your precise response to my message, Adrian.  I better understand the situation, and it makes more sense.  I think in my organization, we put a lot of emphasis on stories-done-per-iteration, and that's where I was getting the confusion.  I'm now wondering if we should look at your way of doing it, as we have a similar issue with a client team that is offsite, sometimes disagrees, and likes control. 
             
            It's always educational to read your posts, Adrian, so thank you very much for humoring me :)
             
            ~Gene



          • Adrian Howard
            Hi Justin, ... I think unpacking done like that can be harmful. It s making the feedback loop longer, so it makes it much harder to fix problems sooner. For
            Message 5 of 20 , Aug 19 1:42 AM
            • 0 Attachment
              Hi Justin,

              On 17 Aug 2011, at 16:59, Justin Tauber wrote:

              > Thanks for all the responses, they've been really useful.
              >
              > I get the impression that there are a few different concepts that need unpacking here:
              > * Done for an iteration vs done for a release
              > * Done as an internal standard prior to exposure to customers vs done from the perspective of the customer

              I think unpacking "done" like that can be harmful. It's making the feedback loop longer, so it makes it much harder to fix problems sooner.

              For example if a story is "done" for an iteration, but not "done" for a release - when do we figure out it's not really "done" for a release? How is putting off that level of "done" until later helping us?

              The relentless focus of good agile teams on done _really_ meaning done is a huge advantage for me and one I'd be reluctant to give up. It's the driver behind getting all necessary people involved with the project.

              > Though I can see how from both an agile and a UX perspective there shouldn't be a standard that isn't end-user focused, from a practical perspective it's hard to learn from usability tests run on buggy software, so some "internal" standards need to be adopted. From that perspective, Jeff's definition of done looks like it means "ready to expose to usability testing".

              Actually I think there's quite a lot the team can learn from usability testing buggy software ;-) Things I know I've learned include:

              * Participants don't notice a feature is buggy
              * Participants never use the buggy feature
              * Participants notice, but work around, the buggy feature
              * Participants see the bug as a feature

              all of which have interesting effects on how we might prioritise features and future work.

              Cheers,

              Adrian
              --
              http://quietstars.com adrianh@... twitter.com/adrianh
              t. +44 (0)7752 419080 skype adrianjohnhoward del.icio.us/adrianh
            • Adrian Howard
              Message 6 of 20, Aug 19, 2:00 AM
                Hi Anders,

                On 17 Aug 2011, at 17:54, Anders Ramsay wrote:
                [snip]
                > Defining something that has UX debt as Done means you'll be delivering a
                > half-baked UX, which goes to the core of the UX field's complaint about
                > Agile.
                >
                > IMO, there should be no such thing as UX debt. Either the quality of the
                > experience is appropriate for the product, context, and domain, or it is
                > not. (E.g. for an entertainment product, experience quality should likely
                > be very high, for an enterprise product, this may be less critical.)
                [snip]

                I think it depends on the particular way the debt metaphor is being used.

                If you're using it in the more general "debt is bad" sense, then I agree completely. If you produce a poor UX, just like if you produce bad code, and don't care, then your product is f*cked in anything but the short term.

                However the metaphor was originally a little bit more nuanced than that. It's more about how taking on a technical debt, and then paying it off later, can be the right thing to do in some circumstances - just like taking on a financial debt. There's a nice video from Ward on the topic here http://www.youtube.com/watch?v=pqeJFYwnkjE (It's only 5mins long).

                Having debt is fine - if it's done in a mindful manner with the intent of paying the debt off.

                Cheers,

                Adrian
                --
                http://quietstars.com adrianh@... twitter.com/adrianh
                t. +44 (0)7752 419080 skype adrianjohnhoward del.icio.us/adrianh