On Risk

Proximity overrides all other factors in risk analysis.

Geographical, emotional, personal, and economic proximity (especially immediate personal gain) often outweighs rational risk assessments.

Prepare for the consequences and disruptions, not for the disasters.

Focus on preparing for the ripple effects and disruptions that follow disasters. This broader approach builds resilience against a wider range of potential issues. The disasters that may happen are unknown, but the disruptions to your systems are finite and well known.

Know your biases.

Humans are terrible at estimating risk (though probably better than most other creatures), and we carry many biases that distort our judgement. Familiarize yourself with the key biases, remain aware of them, and constantly question your assessments.

Accidents do not exist. There is always causality.

Everything is interconnected. Nothing exists in isolation. For everything that happens there is a cause. Recognizing this interconnectedness is vital, especially when analyzing failures or crashes.

There are unknown unknowns.

Donald Rumsfeld

                  Aware            Not aware
Understand        Known knowns     Unknown knowns
Don't understand  Known unknowns   Unknown unknowns

Known knowns: things we are aware of and understand.
Unknown knowns: things we are not aware of but do understand or know implicitly.
Known unknowns: things we are aware of but don't understand.
Unknown unknowns: things we are neither aware of nor understand.

Know how complex systems fail.

Complex systems are intrinsically hazardous systems.
Complex systems are heavily and successfully defended against failure.
Catastrophe requires multiple failures – single point failures are not enough.
Complex systems contain changing mixtures of failures latent within them.
Change introduces new forms of failure.
Complex systems run in degraded mode.
Safety is a characteristic of systems and not of their components.
Richard I. Cook

Software systems are very complex systems.

Widespread adoption turns technology into a threat vector.

“For just about any technology, be it an operating system, application or network, when a sufficient level of adoption is reached, that technology then becomes a threat vector.” — George Spafford

A man with a watch knows what time it is. A man with two watches is never sure.

Segal’s law

Anything that can go wrong, will — at the worst possible time.

Finagle’s Law

“If there is a possibility of several things going wrong, the one that will cause the most damage will be the one to go wrong.” — Murphy’s Fourth Law
Corollary: if there is a worse time for something to go wrong, it will happen then.


Last updated: 20 February 2025