RE: [agile-usability] security, agility, and usability
The book is on my buy list, Larry. When I do read it – and enjoy it – I will write you.
It isn’t just the redundancy that having someone walking around provides. There are 10,000 to 100,000 years of genetic refinement behind a system that has adapted and inspected over that period of time and continues to do so today. A UX should not – cannot – be something that replaces the heuristics that happen between the ears. The UX must augment them.
Take the helmet of the Warfighter: every feature and function acts as a force multiplier, and each is designed to be accessible while the person does their job. Or the thermal ‘goggles’ most firefighters have access to. They ‘see through walls,’ showing people, hotspots, and even warm pizza! But it is the firefighter who acts, because their eyes are behind the goggles and their brain is processing the information.
In short, a really sophisticated UX can lull the brain into a false sense of predicted events. That, sir, is what to exploit if you want a successful infiltration. In this perspective, the more stuff you put in to track what is happening, the easier it is to create the illusion, because what is expected is already on the other side of the wire. All we need to do is strip the timestamp and run a loop or a delay.
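The strip-the-timestamp-and-loop attack described above can be sketched in a few lines. This is a minimal illustration, not any real exploit; the telemetry values and names are hypothetical:

```python
import itertools

# Hypothetical recorded telemetry: (timestamp, rpm) pairs captured
# while the equipment was running normally.
recorded = [(1000.0, 1064), (1001.0, 1065), (1002.0, 1064), (1003.0, 1066)]

# The attack: strip the timestamps and loop the old values, so the
# monitoring display always sees "expected" readings regardless of
# what the equipment is actually doing.
looped_readings = itertools.cycle(rpm for _, rpm in recorded)

def spoofed_hmi_feed(n):
    """Return n readings that look live but are just the old loop."""
    return [next(looped_readings) for _ in range(n)]

print(spoofed_hmi_feed(6))  # → [1064, 1065, 1064, 1066, 1064, 1065]
```

The point of the sketch is how little machinery the illusion needs once the expected data is already "on the other side of the wire."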
Do I think Agile has a place in solving this? Most definitely, but it isn’t necessarily in elegantly delivered code built by the best people who can code. I firmly believe Agile works best when it builds enough support – at any one time – to leverage the skills and capabilities of the people doing the job. To do that you need to put the ‘doer’ at the very front and build to meet their needs. Lean manufacturing also has an impact, if we keep in mind to limit what we do to what has been agreed to.
From a systems perspective, Agile needs to change gears, IMO. The software we are so proud of becomes a part of the solution; going lean would mean taking our seat in the bus of specialized skills that make the ‘doer’ a winner. What we need to understand is that the systems view looks at what happens before, during, and after the ‘tooling’ is done. Once we do that and invite the other ‘specialized skills’ to the table – and listen to them – we can begin to improve the security, safety, and sanity of what people do to get the job done.
"Planning constantly peers into the future for indications as to where a solution may emerge."
"A Plan is a complex situation, adapting to an emerging solution."
Mike Dwyer said:
> I was looking forward to the book much more than the news. <
You can still enjoy the read, particularly as it is about a still largely unacknowledged threat in which we in the U.S. are particularly vulnerable. Do buy the book. (Am I allowed to say that?)
And Mike said:
> What would have happened to the Iranian systems if they had someone walking around looking at real gauges and listening with real ears and had a human nose picking up the scent of things grinding away. <
This is a concrete architectural suggestion similar to ones I have thought about. It’s based on the principle of redundant communication. If there is a hardwired analog gauge visible to someone walking around and it doesn’t match the HMI display, then somebody might notice and might take some action before the 984th centrifuge spins out of control.
One of the problems is that in many cases (this being one of them), there is no meaning to the concept of “hardwired” or “analog.” It occurs to me that something like that might be accomplished by cheating, violating the layered architecture that separates the presentation layer from the model and the controlled equipment. If the HMI layer could directly read a signal from the equipment over a different wire, it might be harder to intercept both channels with a simple software exploit. Another approach might be to build a small PLC system into the equipment itself with its program burned into a PROM and its small display welded to the case. Makes updating harder, but that is precisely the idea—it could not be digitally compromised, only physically.
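The redundant-communication principle above amounts to a cross-check between two independently wired channels. A minimal sketch, with hypothetical names and a hypothetical 5% tolerance, of what the comparison logic might look like:

```python
def cross_check(hmi_value, independent_value, tolerance=0.05):
    """Flag a discrepancy when the HMI display and an independently
    wired sensor disagree by more than `tolerance` (as a fraction).
    An attacker compromising only the presentation layer would have
    to spoof both channels consistently to stay hidden."""
    if independent_value == 0:
        return hmi_value != 0
    return abs(hmi_value - independent_value) / abs(independent_value) > tolerance

# Normal operation: both channels agree within tolerance.
print(cross_check(1064, 1070))   # → False (about 0.6% apart)

# Compromised HMI shows "nominal" while the hardwired gauge reads overspeed.
print(cross_check(1064, 1410))   # → True (discrepancy flagged)
```

The interesting design choice is not the arithmetic but the wiring: the check is only as good as the independence of the second channel, which is why the PROM-and-welded-display variant is appealing.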
Adam Sroka and William Petri suggested that getting the various specialties in the same room is a step in the right direction. I agree, but this has little or nothing to do with agile development. I’ve been part of (dare I say it?) waterfall projects that used this kind of teamwork to good effect. The “agile means quality” argument is also appealing, but quality in the sense of security will be improved only to the extent it is explicitly attended to. It’s Weinberg’s Law: Whatever you measure or pay attention to is improved. Again, it applies to all development approaches.
I do wonder if pair programming paired with multi-disciplinary design might make a good lever, particularly if each pair is charged with watching for security flaws as well as coding defects. So-called pen-testing (penetration testing to identify security flaws) might be made part of test-first development. Or maybe it would have to be built into system regression tests?
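To make the pen-testing-in-test-first idea concrete, here is a small sketch of what a security-flavored test written before the code might look like. The `set_rpm` handler, its command format, and the 1200-rpm envelope are all hypothetical:

```python
# Hypothetical set-point handler — the kind of unit a pair charged with
# watching for security flaws might write tests against before coding it.
def set_rpm(command):
    """Accept a set-point only if authenticated and within a safe envelope."""
    if not command.get("authenticated"):
        raise PermissionError("unauthenticated command rejected")
    rpm = command.get("rpm", 0)
    if not 0 <= rpm <= 1200:
        raise ValueError("rpm outside safe envelope")
    return rpm

# Security tests written first, then kept as regression tests.
def test_rejects_unauthenticated():
    try:
        set_rpm({"rpm": 1000})
    except PermissionError:
        return True
    return False

def test_rejects_overspeed():
    try:
        set_rpm({"authenticated": True, "rpm": 9999})
    except ValueError:
        return True
    return False

print(test_rejects_unauthenticated(), test_rejects_overspeed())  # → True True
```

Folding such tests into the system regression suite would keep the security properties under continuous check, which is the "whatever you pay attention to is improved" point applied mechanically.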
Of course, the vulnerabilities in industrial control systems are as much in hardware and fundamental PLC architecture as in software, so radically more secure systems will require integral hardware-software design and development teams—lean engineering with agile software development. Now there’s an idea!
--Larry Constantine, IDSA, ACM Fellow
Professor | University of Madeira | Funchal, Portugal
Institute Fellow | Madeira Interactive Technologies Institute | www.M-ITI.org
Fiction “to feed the inner nerd” – Bashert , Web Games, and The Dome, political thrillers from Lior Samson | www.LiorSamson.com