The Swiss Cheese Model of System Accidents

James Reason’s Swiss Cheese Model of System Accidents is a useful way to think about how failures can happen even when you have multiple layers of “defence” in place. It’s been applied to fields like aviation and medical safety, but it applies just as well to your own design work. We’ve all worked on projects where a number of small, unforeseen issues lined up and created serious problems. James Reason elaborates:

“[T]wo important features of human error tend to be overlooked. First, it is often the best people who make the worst mistakes—error is not the monopoly of an unfortunate few. Second, far from being random, mishaps tend to fall into recurrent patterns. The same set of circumstances can provoke similar errors, regardless of the people involved. The pursuit of greater safety is seriously impeded by an approach that does not seek out and remove the error-provoking properties within the system at large.” – James Reason (2000)

[Figure: the Swiss Cheese Model applied to air safety (Astra Project)]
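To make the metaphor a little more concrete, here’s a minimal sketch of the idea in Python. It isn’t anything from Reason’s work — the layer names and failure probabilities are invented for illustration — but it shows the core intuition: an accident happens only when a hazard passes through a hole in every defensive layer at once, which is rare but never impossible.

```python
import random

# Illustrative sketch only: the Swiss Cheese Model is a metaphor, not an
# algorithm. These layer names and per-layer "hole" probabilities are
# invented for demonstration.
LAYERS = {
    "design review": 0.10,    # chance a hazard slips through this layer
    "code review": 0.15,
    "automated tests": 0.05,
    "staging QA": 0.10,
}

def hazard_reaches_user() -> bool:
    """An accident occurs only if the holes in every layer line up."""
    return all(random.random() < p for p in LAYERS.values())

trials = 1_000_000
accidents = sum(hazard_reaches_user() for _ in range(trials))
print(f"{accidents} accidents in {trials} hazards "
      f"(~{accidents / trials:.5%} lined up with every hole)")
```

Multiplying those probabilities out gives roughly a 0.0075% chance per hazard — small, but with enough hazards flowing through the system, some will eventually find the alignment.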

What’s nice about this model is that it encourages you to look at the pattern of issues that occurred, rather than simply asking an individual to “pay more attention next time”. It’s really more of a metaphor than a model, but it’s still a useful one to keep in mind during your project post-mortem meetings. You are doing project post-mortems, aren’t you?