REVIEW: "Minding the Machines", William M. Evan/Mark Manion

      BKMNDMCH.RVW 20040527

      "Minding the Machines", William M. Evan/Mark Manion, 2002,
      0-13-065646-1, U$29.99/C$46.99
      %A William M. Evan mindingthemachines@...
      %A Mark Manion mindingthemachines@...
      %C One Lake St., Upper Saddle River, NJ 07458
      %D 2002
      %G 0-13-065646-1
      %I Prentice Hall
      %O U$29.99/C$46.99 +1-201-236-7139 fax: +1-201-236-7131
      %O http://www.amazon.com/exec/obidos/ASIN/0130656461/robsladesinterne
      %O http://www.amazon.ca/exec/obidos/ASIN/0130656461/robsladesin03-20
      %P 485 p.
      %T "Minding the Machines: Preventing Technological Disasters"

      Part one is an introduction. It is ironic, both in terms of the
      title of the chapter ("Technological Disasters: an Overview") and,
      particularly, the title of the book, that although the authors list
      four categories of disaster causes, the examples given overwhelmingly
      indicate human error, if not outright malfeasance. The
      classifications provided are also confusing: what difference is there
      between human, organizational, and socio-cultural factors? The
      comparison of natural and man-made disasters, and the supporting
      tables, in chapter two raise more questions than they answer: why are
      both types increasing at almost identical rates (in glaring contrast
      to the stated conclusion)?

      Part two looks at the prevalence of technological disasters. (I
      thought we just did that?) Chapter three says nothing new about Y2K.
      The theories of technological disasters, in chapter four, are flawed
      by an overly simplistic view of systems, one which completely ignores
      the inherent tendency of complex systems in general, and digital
      systems in particular, to catastrophic failure modes. As noted, the
      book is heavily larded with tables and figures, most of which have
      little apparent relevance to the text, and some of which actually seem
      to contradict the written material. One example in this chapter
      demonstrates how the figures are themselves unexplained and poorly
      captioned: a diagram with six numbered interrelationships is followed
      by a numbered list--for a completely different set of factors. In
      chapter five the authors set up an odd, and poorly explained, matrix
      of "systemic dimensions" underlying disasters. "Human Factors
      Factors" (sic) are technological (as opposed to social) systems and
      external (as opposed to internal) systemic factors. The reporting of
      details in the examples in this and other chapters is suspect: despite
      specific and itemized accounts of the Therac-25 tragedy in at least
      two of the references listed for this chapter, the authors insist that
      somehow the type of radiation was at fault, rather than the flawed
      user interface that allowed incorrect dosage settings to be retained
      by the device, even after the operator believed the error had been
      corrected.

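      That failure mode deserves spelling out, since it recurs throughout
      computing. What follows is a minimal Python sketch of the general
      bug class - not the Therac-25's actual code, and every name in it
      is invented for illustration: the console appears to accept the
      operator's correction, but the machine has already latched the
      stale settings.

          import threading

          class TreatmentConsole:
              def __init__(self):
                  self.entered_mode = "X-RAY-HIGH"  # first, erroneous entry
                  self.latched_mode = None          # what the beam will use

              def latch_settings(self):
                  # The machine copies whatever is on screen at this moment.
                  self.latched_mode = self.entered_mode

              def operator_corrects(self, new_mode):
                  # The display updates, so the operator believes the fix took.
                  self.entered_mode = new_mode

          console = TreatmentConsole()
          machine = threading.Thread(target=console.latch_settings)
          machine.start()   # the machine latches the settings...
          machine.join()
          console.operator_corrects("ELECTRON-LOW")  # ...before the fix lands
          print("screen shows: ", console.entered_mode)  # ELECTRON-LOW
          print("beam will use:", console.latched_mode)  # X-RAY-HIGH, stale

      The hazard, in other words, is an ordering problem in the control
      software and interface, not a property of the radiation itself.
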
      Part three supposedly looks at technological disasters since the
      industrial revolution. Chapter six meanders through a wide variety of
      industrial "revolutions," and then delves briefly into future biotech,
      nanotech, and robotics/artificial intelligence. Chapter seven
      presents a terse and bemusing expansion of the earlier four-part
      matrix into twelve.

      Part four provides an "Analysis of Case Studies of Technological
      Disasters." Chapter eight insists on fitting a number of tragedies
      into the matrix from chapter seven. The reasons for the choices are
      not obvious: the authors insist throughout the book that the Bhopal
      poison gas release was due to "socio-cultural factors" when it is
      clearly, as far as the book recounts, due to greed and a lack of
      provision for safety equipment and procedures. (Another table
      maintains that Bhopal was an "accident" while the sinking of the
      Titanic, with far less impact in deaths and injuries, was a disaster
      and a tragedy.) Chapter nine lists one "lesson learned" from each of
      the "case studies": actually, what all of them have in common is the
      fact that technological disasters have *numerous* causes, not just a
      single one. The Tenerife airliner crash, as only one example, was
      caused by the overloading of a backup airport, fear of regulations
      that made no provision for emergencies, miscommunication, failure
      to verify communications, the pressure of overloaded facilities,
      and other failures.

      Part five talks about strategic responses. Chapter ten states that
      scientists need to stress professional education and safety. Now, I
      can sympathize with that attitude in large measure: as a virus
      researcher I've been crying in the wilderness about malware for many
      years, and have recently been exhorting corporations to support free
      public security awareness training as a benefit to the enterprise by
      reducing overall levels of risk. I think it a bit unfair, though, to
      put all the weight for safety on the shoulders of the professionals,
      when the rest of society is completely obsessed with time-to-market
      and dancing pigs. Chapter eleven tacitly admits this fact, with case
      studies that demonstrate that in many instances of corporate
      wrongdoing the executives were warned of the dangers in advance. No
      recommendations for specific responses are made. The four legal
      branches of the United States government, and their relationships to
      technology, are listed in chapter twelve: again, no suggestions are
      forthcoming. A fairly standard overview of risk analysis is given in
      chapter thirteen, which, I suppose, might be some kind of endorsement
      of and recommendation for risk analysis itself. Chapter fourteen
      assumes that "democratic" decision making is better than "technical,"
      without ever examining the dangers of social and political influences
      forcing the bad public policy rulings that the case studies in the
      work truly demonstrate.
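
      For reference, the arithmetic at the core of such a standard risk
      analysis overview is brief enough to show. This is a generic
      sketch in Python with invented figures, not an example drawn from
      the book:

          def single_loss_expectancy(asset_value, exposure_factor):
              # SLE: expected loss from one occurrence of the threat.
              return asset_value * exposure_factor

          def annualized_loss_expectancy(sle, annual_rate_of_occurrence):
              # ALE: the SLE scaled by how often the threat strikes per year.
              return sle * annual_rate_of_occurrence

          # Hypothetical: a $500,000 plant component, 40% damaged per
          # incident, one incident expected every five years.
          sle = single_loss_expectancy(500000, 0.40)  # $200,000 per incident
          ale = annualized_loss_expectancy(sle, 0.2)  # $40,000 per year
          print("SLE: $%.0f  ALE: $%.0f" % (sle, ale))

      A safeguard is worth deploying only if its annual cost is below the
      reduction in ALE it delivers - which is presumably the cost-benefit
      discipline the authors would like applied before a technology ships.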

      This book actually says very little about either technology or
      technological disasters: most of the evidence points to fraud,
      avarice, and other social factors that create almost any kind of
      disaster. For those who really do want to know how to make
      technology safer, it would be best to look elsewhere.

      copyright Robert M. Slade, 2004 BKMNDMCH.RVW 20040527

      ====================== (quote inserted randomly by Pegasus Mailer)
      rslade@... slade@... rslade@...
      Post hoc, ergo propter hoc
      After it, therefore because of it
      http://victoria.tc.ca/techrev or http://sun.soci.niu.edu/~rslade