
## RE: [scrumdevelopment] Metrics to report up

Message 1 of 57, Nov 2, 2010
Thank you Adam, I have been wanting such a clear explanation of Function Point Analysis for a long time :)

I see one probably totally irrelevant little flaw in the arithmetic ... I hesitate to mention it ... How do you calculate the lines of code in your system before any code has been written?

Regards,
Roy Morien

To: scrumdevelopment@yahoogroups.com
From: adam.sroka@...
Date: Mon, 1 Nov 2010 20:45:14 -0700
Subject: Re: [scrumdevelopment] Metrics to report up

What you need to do is calculate the lines of code in your system and the number of function points. From this you can derive the ANQ according to the following formula:

ANQ = ( LOC / FP ^ 2 ) * Story Points

The way you use this is you normalize all of your teams' working hours by comparing each team to the most productive team according to this formula:

Team Hours to Work = 40 * Highest ANQ / Team's ANQ

In this way you can make sure that all teams produce the same number of points. This is the power of the Arbitrary Number Quotient (ANQ).
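For fun, the joke formula above can be sketched in a few lines of Python. Every number below is invented, which is rather the point of the ANQ:

```python
# Tongue-in-cheek sketch of the Arbitrary Number Quotient (ANQ).
# All inputs here are made up for illustration.

def anq(loc, function_points, story_points):
    """ANQ = (LOC / FP^2) * Story Points -- an arbitrary number."""
    return (loc / function_points ** 2) * story_points

def hours_to_work(highest_anq, team_anq, base_hours=40):
    """Normalize a team's week against the 'most productive' team."""
    return base_hours * highest_anq / team_anq

teams = {"A": anq(50000, 120, 80), "B": anq(80000, 200, 60)}
best = max(teams.values())
for name, q in teams.items():
    print(name, round(hours_to_work(best, q), 1))
```

Of course, the first function call already begs the question Roy raises: you need lines of code before any code exists.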

...

On a more serious note, you could:

1) Give the Sr. Manager a login to the user test environment.

2) Explain that every new feature completed in the Sprint will be deployed to this environment by the end-of-Sprint demo, and she is welcome to play with them however she wants. In fact, any feedback she can give the PO would be welcome.

3) Offer to spend time going over the current features in the backlog as well as those that have been released, and invite the Sr. Manager to your end of Sprint demo.

4) Offer to provide any other information that she may need, but point out that it is your (the ScrumMaster's) job to limit the time the team spends collecting data, in order to maximize the time they spend adding value to the product.

On Sun, Oct 31, 2010 at 8:57 PM, Charles Bradley - Scrum Coach CSM, PSM I wrote:

What metrics should we report up?  Which ones shouldn't we report up?

An example:
A department has 3 dev teams that all report to one Senior Manager.  That Senior Manager reports along with a few others to a VP.

What metrics, if any, would you suggest to report from the team to the Sr. Mgr?
What metrics, if any, would you suggest to report from the Sr. Mgr up to the VP?

This is something I don't have a lot of experience in, so I'd appreciate any feedback.

Charles

Message 57 of 57, Dec 13, 2010
I have come across this scenario very often (in almost 95% of products/projects): defects escape, and I have seen this irrespective of the methodologies (or practices) used. Humans can make mistakes (at every phase/activity) for various reasons; however, we want to keep improving continuously. We measure 'defects escaped' and take it seriously, meaning we get to the root cause and invest in the required training and expectation-setting.

We focus on design and code quality metrics such as cyclomatic complexity, fan-in, fan-out, and depth of inheritance, and we run code quality tools to check coding-standard compliance and other parameters (e.g., memory leaks). We report this to management as well, to keep them in the loop. We also measure test-related metrics, such as the number of test cases (manual vs. automated), the first-time pass ratio, and the number of defects (open, fixed, closed, postponed).
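As a rough illustration of one of these metrics, here is a minimal Python sketch that approximates McCabe cyclomatic complexity by counting branch nodes in a function's AST (complexity = branches + 1). Real tools such as radon or lizard do this far more carefully; this is only a sketch:

```python
# Minimal sketch: cyclomatic complexity as 1 + number of branching
# constructs found in the parsed source. Illustrative only.
import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES)
                   for node in ast.walk(tree))

sample = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(sample))  # if + elif -> complexity 3
```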

We do not focus much on velocity, thanks to this forum. We track release burndown, stories committed vs. stories achieved (to keep improving the team's commitment level), demos accepted vs. rejected by the PO, the number of times the team got changes in the middle of a sprint (it is minimal but not zero yet; this helps in deciding sprint length and puts back-pressure on the PO), and a few more (customer satisfaction and employee satisfaction).
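A minimal Python sketch of how a couple of the ratios above might be tracked; the sprint numbers are invented for the example:

```python
# Illustrative sketch of two of the sprint metrics mentioned above.
# The sprint data below is invented.

sprints = [
    {"committed": 10, "achieved": 8, "demos_accepted": 7, "demos_rejected": 1},
    {"committed": 9,  "achieved": 9, "demos_accepted": 9, "demos_rejected": 0},
]

def commitment_ratio(sprint):
    """Stories achieved / stories committed."""
    return sprint["achieved"] / sprint["committed"]

def demo_acceptance_ratio(sprint):
    """Demos accepted by the PO / demos given."""
    total = sprint["demos_accepted"] + sprint["demos_rejected"]
    return sprint["demos_accepted"] / total

for i, s in enumerate(sprints, 1):
    print(f"Sprint {i}: commitment {commitment_ratio(s):.0%}, "
          f"demo acceptance {demo_acceptance_ratio(s):.0%}")
```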

For us, agile is a mix of Scrum and XP practices, hence we focus on both.

--
Regards,
Hariprakash Agrawal (Hari),
Managing Director | Agile Coach | http://opcord.com | http://www.linkedin.com/in/hariprakash
Software Testing (QTP, Selenium, JMeter, AutoIT, Sahi, NUnit, VB/Java Script, Manual) || Consulting / Trainings (Agile, CMMi, Six Sigma, Project Management, Software Testing)

On Mon, Dec 13, 2010 at 9:11 PM, Ron Jeffries wrote:

Hello, woynam. On Monday, December 13, 2010, at 10:12:25 AM, you
wrote:

> Sorry, but I don't see how "defects" can escape. If you're
> writing automated tests for every story, an "escaped" defect means
> that you ignored the failing test. Is that really that common?

It is possible, and not all that unlikely, to miss a test or write
one incorrectly. It would be possible, I suppose, to define Done as
"passes whatever tests we wrote" but that strikes me as a bit too
lax.

So an escaped defect would be something we didn't like, that we
agree we understood and somehow failed to get implemented and
tested.

Ron Jeffries
www.XProgramming.com
Sorry about your cow ... I didn't know she was sacred.
