Finished: January 12, 2025
Why I read this
This is one of those books that is niche but at the same time somewhat well known. In my strange circle of project manager and engineer friends it probably comes up about 1,000 times more frequently than it would in any other setting. Yet I feel like it is still not unheard of in popular culture, similar to semi-niche books like The Intelligent Investor (a title by Benjamin Graham that I have yet to fully finish) or The Power of Geography, which I read last year. It’s the type of book that, although it covers a subject that normally interests a very small percentage of the world (whether it be geography, investing, or systems thinking), was acclaimed enough on release that it managed to jump outside of its typical community and become more universally known, like a B-list actor. All of this is a long way of saying that this is a book I’ve been hearing about for a while and one that I’m glad I was finally able to tackle.
What I learned
First of all, this book was much more scholarly than I thought it was going to be. Often these types of books make a large effort to make the content as approachable as possible so that the maximum number of readers can get value out of it. Although the examples were events we all know about, such as famous plane crashes, oil spills, and space shuttle explosions, Dekker’s insistence on diving into the technical details, while at the same time relating everything back to the somewhat counterintuitive problems of systems thinking, made the long, rambling paragraphs a challenge to connect with.
Despite the fact that I probably did not capture everything I was meant to understand from reading this, I can confirm that this is a book that makes you smarter and encourages you to think and to learn. The more I have reflected on it since finishing it a little over a week ago, the more I have returned to its themes in the quiet moments of the day to think on them further. Of these ideas I wanted to share a few key ones here that expanded my view of complex systems and how they function.
The first and most basic item was that complex systems cannot be completely understood from any single reference point. It sounds silly to say it like that, but as Dekker explained, this fact has rather broad implications. A system that has become complex cannot be fully understood, and therefore must be treated differently from traditional systems. For example, the stock market might have begun as people purchasing small, profitable portions of a company (I’m not exactly sure how the stock market began; I will have to look it up later), but now the market is such a massive interconnection of parts and pieces that when it performs, or does not, it cannot be treated like a broken car where you can open the hood, look for the broken part, replace it, and be on your way. Instead, problems with a complex system like this must be addressed abstractly and holistically, and even then we are not sure of the efficacy of our solutions. This is why we see economists with PhDs arguing with each other over whether or not a president’s economic agenda is good, and why we don’t see mechanics arguing over whether an engine will make a car drive.
But going beyond the fact that complex systems cannot be fully understood, Dekker pointed out that we as a society hold onto the belief that any issue can be dug into until you find a root cause, and that after resolving that cause the problem will disappear. We even do this with things that are extremely abstract. Notice that if someone appears unhappy, the first question is often “what’s wrong?” Even though the complex web of experiences and interactions that has led someone to be sad, or anxious, or worried is, for most people, completely impossible to understand, we still search for the simple answer to this question as if it can be pinpointed and resolved.
On a different point, Dekker explained how the fundamental objectives of many organisations will, without fail, eventually lead them to failure. He explains this with the idea that each company is ultimately trying to produce the maximum of unit X while consuming the least of unit Y. As people search for efficiencies to reduce the time, resources, or energy required to produce something, they continuously erode the margins for error, and when, not if, the process is subjected to an error or mistake, it no longer has any room to adapt and it shatters. This is even more evident in organisations with large safety considerations, mine being one of them. In dangerous fields, such as infrastructure construction, the goal of the company is usually to construct as much infrastructure as possible while spending the least amount of time doing so. Yet, at the same time, operational managers are put under enormous pressure to do it safely, without incidents or accidents. Taken to the extremes, a manager searching for a guaranteed safety rating of 100% (having absolutely zero accidents) could reasonably conclude that safety is his most important objective (over, say, generation of revenue) and keep his team isolated in a padded room instead of producing. The reverse would also be true, where a manager looking to absolutely maximise profit and production would be forced to ignore the additional costs and time required to perform work safely, resulting in an environment ripe for disaster.
This duality of conflicting goals is a very real phenomenon for which there is no very real solution. It is the challenge of safety-focused organisations, and it is one that only a systemic approach will be able to address in the future. Yet, in spite of this contradiction in objectives, Dekker observed that good managers and performing organisations combat this challenge with staff who are sensitive to both objectives: staff who are willing to go the extra mile to work together for the ultimately desired result instead of focusing on singular metrics or objectives. He even gave the advice that the easiest way to find those who are not helping the organisation is to find the people who silo themselves off, the employees who say things like “my job is to do X, safety isn’t my department.” I’ll be sure to look out for this type of person in the future.
Finally, I wanted to touch on an idea that I will address much more in my next review, on a famous Malcolm Gladwell book. This is the idea of tipping points. Made famous by the concept of white flight (an observed social phenomenon where predominantly white communities would slowly integrate black families until, after a certain percentage of the community was black, the community would rapidly become composed almost entirely of black families, all of the white families having left), the idea actually applies much more broadly than the social contexts in which it is usually discussed. We like to think that many systems work linearly, that the system changes slowly and in a manner that is predictable based on the increase or decrease of its inputs. However, that is actually scientifically rare. In most systems the changes happen rapidly and violently based on a very small change in the inputs. To explain this, Dekker referenced the states of matter, noting that a liquid is only a liquid up to a certain exact temperature. You can apply heat to water anywhere between 0.1 and 99.9 degrees Celsius and it will behave in essentially the same way. However, if you subtract that last 0.1 to reach 0, or add it to reach 100, the water changes dramatically and almost instantly into a solid or a gas. This is how accidents happen in organisations: small, systematic changes and deviations accumulate over a very long time, with no one even taking notice of them, until finally something breaks and there is your exploded Space Shuttle or broken offshore oil drilling platform.
What I didn’t like
As far as the content goes, it was so technical that I don’t have (nor do I feel qualified to have) much of a critique. However, I would note that there are some books that are good for audio listening, and this is not one of them. The rambling sentences filled with technical and analytical jargon were extremely difficult to follow without the ability to flip back regularly and think through each sentence. So, unlike my usual complaints about a book, I think this negative will simply motivate me to pick up a hard copy in the future and read through it again.
Questions I asked
How can goals be balanced in an organisation where safety and performance are contradictory objectives? For example, in dangerous occupations the safest way to perform the work is to not perform it at all.
How can we hold people responsible for accidents when it was not possible for them to have had all the pertinent knowledge beforehand?
If an organisation simply decides to stay static (instead of growing), can it avoid the drift into failure?
My Favorite Quote
“Arriving at the edge of chaos is a logical endpoint for drift. At the edge of chaos, systems have tuned themselves to the point of maximum capability.”
Sidney Dekker
Books I liked like this one
Debt: The First 5,000 Years : David Graeber (for a deeply technical look at an aspect of human society)
The Power of Geography : Tim Marshall (for an interesting dive into an extremely niche subject)

