
Re: Agile Estimation - User stories for non-user driven functionality?

  • dnicolet99
    Message 1 of 7, Jan 30, 2007
      --- In scrumdevelopment@yahoogroups.com, "Casey Manus"
      <caseymanus@...> wrote:
      > * Do you create user stories for "infrastructure like" pre-
      > requisites to developing the user facing functionality in our user
      > stories? User stories for things like building new developer
      > workstations, upgrading the database to SQL 2005, building base
      > classes, etc. all seem wrong, save maybe the SQL 2005 upgrade because
      > you can tie that to an operational problem.

      What I do (and many others, too, I think) on projects like this is to
      have a so-called "Iteration Zero" that is dedicated to building
      workstations, getting the tools set up (database upgrade, etc.), and
      so on, before feature development begins.

      Regarding how to handle technical tasks going forward, there's not a
      cookbook approach, but there has been a discussion of that very
      subject on the Extreme Programming discussion group at
      http://tech.groups.yahoo.com/group/extremeprogramming/. You might be
      able to cull some useful ideas from that thread.

      > * The confidence level of our user story points seems to be low.
      > We did an initial pass at them, and then we immediately started
      > second-guessing them. The team didn't seem to all have the same idea
      > second guessing them. The team didn't seem to all have the same idea
      > of what "done" was going to mean for us and not everyone had been
      > deeply involved in the spike (although everyone was supposed to be).

      IMO that's normal at the outset of a project, and even more so for a
      team of people who haven't worked together before. Iteration by
      iteration, the team's estimates should become more and more consistent.

      You're right to think that the definition of "done" is important. The
      team needs a consensus about that before estimates or velocity
      measurements can become useful.

      > * We have "completed" 3 sprints prior to starting this project,
      > and our reported velocity has been about 11 points, but in actuality
      > our velocity was probably only around 5, because our defect rate has
      > been high. We haven't been doing enough unit testing, code review,
      > or acceptance testing; basically, we aren't living up to the
      > definition of "done" we all agreed to.

      To me, the red flag here is that you're not reporting the truth. There
      shouldn't be a "reported velocity" for public consumption and a
      different, semi-secret "actual velocity". Transparency is a key value
      of agile development.
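
      The 11-point vs. ~5-point gap falls out directly if velocity only
      counts stories that satisfy the agreed definition of "done". A
      minimal sketch of that bookkeeping (the story data, point values,
      and criterion names here are invented for illustration):

      ```python
      def velocity(stories, done_criteria):
          """Sum points only for stories meeting every 'done' criterion."""
          return sum(
              s["points"]
              for s in stories
              if all(s.get(c, False) for c in done_criteria)
          )

      # Hypothetical definition of "done" for a sprint.
      done_criteria = ["unit_tested", "code_reviewed", "acceptance_tested"]

      # Hypothetical sprint: 11 points of work touched, but only one
      # 5-point story actually satisfies all the "done" criteria.
      stories = [
          {"points": 5, "unit_tested": True, "code_reviewed": True,
           "acceptance_tested": True},
          {"points": 3, "unit_tested": True, "code_reviewed": False,
           "acceptance_tested": False},
          {"points": 3, "unit_tested": False, "code_reviewed": False,
           "acceptance_tested": False},
      ]

      print(velocity(stories, done_criteria))   # honest velocity: 5
      print(sum(s["points"] for s in stories))  # "done"-ignored total: 11
      ```

      The point is that there is only one number worth reporting: the
      one that counts finished work against the team's own criteria.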