
Re: [XP] Re: Results are in on organizational culture survey

  • Keith Ray
    Message 1 of 55, Jul 8, 2010
      Very well worded!

      C. Keith Ray

      Amplify Your Agility
      Coaching | Training | Assessment | eLearning
      http://industriallogic.com

      On Jul 8, 2010, at 6:09 PM, Steven Gordon <sgordonphd@...> wrote:

      > BTW, the intrinsic human mechanism to convert an experience/input
      > into mental models consistent with our personal world view, and then
      > substitute that mental model for that experience, is also why agile
      > works better than the alternatives.
      >
      > Agile calls for collaborating in real time when things are happening,
      > instead of documenting those things and expecting everybody who reads
      > that document to somehow come away with the same mental model. Agile
      > calls for reducing the amount of WIP so that the team can all focus
      > collaboratively on that work as it is happening, instead of
      > periodically inventorying that work and expecting whoever removes
      > that work from the shelf to somehow have the same mental model as the
      > people who were working on it earlier.
      >
      > Short collaborative time boxes avoid everybody forming their
      > individual mental models of the work and then working at cross
      > purposes later. Many call this a "shared mental model", but I think a
      > lot of what makes it work is that we are doing the work in real time
      > instead of substituting a mental model for latent work.
      > On Thu, Jul 8, 2010 at 5:39 PM, Steven Gordon <sgordonphd@...> wrote:
      >> On Thu, Jul 8, 2010 at 4:27 PM, PAUL <beckfordp@...> wrote:
      >>> Hi Steven,
      >>>
      >>> I've got my own views on why we always end up down the same rabbit
      >>> hole. So I agree with the sentiment, but that is different from
      >>> "there is nothing to learn". Please see my comments:
      >>>
      >>> --- In extremeprogramming@yahoogroups.com, Steven Gordon
      >>> <sgordonphd@...> wrote:
      >>>> We can learn a lot about human perceptions, opinions, attitudes and
      >>>> biases. In some worlds, such as marketing, perception is reality.
      >>> In any world where we only have subjective means of measurement,
      >>> perception is reality. If my customers are extremely happy with the
      >>> results of my labours, who am I to say they are wrong?
      >> Yes, customers can accurately say how happy they were with the
      >> results. Once we start getting into their perceptions as to why they
      >> were dissatisfied, what led to failures, and what led to successes,
      >> then I do not believe the data collected by a superficial survey
      >> months later reflects the reality of the project. A frequent
      >> complaint might be that the team did not plan in enough detail, but
      >> that response generally reflects FUD, not the reality of the project.
      >> A post mortem with all the project stakeholders and participants,
      >> facilitated by a professional facilitator, would get much more valid
      >> responses.
      >>>> In most other worlds, perceptions affect reality and can also
      >>>> provide clues about reality, but they are not reality.
      >>> I can get all philosophical here, but how do you define reality?
      >>> I'm curious.
      >> Social scientists are well aware of the differences between
      >> perception and reality. They know that if you ask a person to
      >> reflect on why they did something, you will almost always get
      >> rationalizations instead of the real reason.
      >>
      >> They ask questions in very careful ways. They ask the same question
      >> in different ways (with other questions in between). They ask the
      >> same question at different times. They have ways to ameliorate
      >> biases. Engineering researchers are not trained in those techniques.
      >>>> People's perceptions about agile projects can help identify
      >>>> misconceptions that need to be addressed, expectations that need to
      >>>> be set coming in, etc. Inferring that perceptions as to what is
      >>>> good or bad about agile reflect what really does and does not work
      >>>> in agile seems flawed to me.
      >>> Good/bad. That's the problem - two-valued thinking, which doesn't
      >>> allow for grey or differences in opinion. If you say it works for
      >>> you, then it works for you. Now if 95% of people in your same
      >>> situation say it doesn't work for them, then to me that is valuable
      >>> data.
      >> The perception of grey along the success/failure axis is useful.
      >> Untrained researchers digging any deeper than that will get
      >> rationalizations based on biases, especially if much time has passed
      >> between the event and the survey. That is the way the human mind
      >> works: over the intervening time we cannot help but make mental
      >> models based on our own world view, and then the qualitative data
      >> reflects those mental models rather than what actually happened.
      >>>> If your purpose is to market agile, RUP, CMMI, etc., then
      >>>> qualitative surveys could indeed provide useful information. If
      >>>> your purpose is to learn how to make agile, RUP, CMMI, etc.
      >>>> actually work more effectively (as opposed to being marketed more
      >>>> effectively), I think it is dangerous to confuse perception and
      >>>> reality.
      >>> My goal isn't to market; my goal is to understand. People's
      >>> perceptions are extremely important if I want to understand why
      >>> they do what they do. I don't believe there is this objective
      >>> truth. There are things that people think work and things that they
      >>> think don't. If someone tells me that something is working for
      >>> them, then to me that is a data point.
      >>>
      >>> Paul.
    • Laurent Bossavit
      Message 55 of 55, Jul 14, 2010
        Hi Paul,
        > Well done! Keep us all posted. In fact, can you tell us more?

        I'm just getting things off the ground at the moment, so I may have
        more to tell in a few months. The initiative is called Institut Agile
        and aims at two things: growing the agile business ecosystem, and
        getting agile on the map as a research topic on an equivalent footing
        with "software engineering". The scope is (for now) local to France.
        One of the first items on the roadmap is to start establishing a
        database of projects for the purposes of those longitudinal studies I
        mentioned in the article I posted earlier. Another is to get in touch
        with everyone I can find doing research on agile practices (typically
        in software engineering departments) and put them in touch with each
        other.

        Laurent Bossavit