Book Review: Fooled by Randomness by Nassim Nicholas Taleb

January 07, 2026 | Hassan Anees

One of the most profound statements I read in Fooled by Randomness concerns how we judge performance in any field.

I start with the platitude that one cannot judge a performance in any given field (war, politics, medicine, investments) by the results, but by the costs of the alternative (i.e., if history played out in a different way). Such substitute courses of events are called alternative histories.

The quote above was the sole inspiration for this post. It applies to the information security field, where practitioners constantly take measures to safeguard sensitive information like your personal, health, and financial data. The goal is always to avoid costly scenarios like data breaches and to keep critical operations uninterrupted.

I write to express what a fantastic book Fooled by Randomness is and to reflect on areas of the book that influenced my thinking. This post is not meant to summarize Nassim Taleb’s writing, as his prose is clear on its own. Instead, I am here as a student of psychology and a practitioner in security, attempting to relate the principles laid out in the book to my own field.

We’ll talk about two areas:

  1. Our brains suck at nonlinear thinking

  2. Why we choose to make good impressions over meaningful impact

As a brief overview, Taleb’s book highlights the role randomness plays in financial markets. You learn about common misunderstandings of statistics and probability and how they apply in any field. Many arguments made in the book are rooted in principles of human behavior, which I’ll highlight throughout the post.

Our Brains Suck at Nonlinear Thinking

Your brain has a difficult time understanding nonlinear outcomes because your emotional circuitry is built to expect linear outputs instead. Taleb highlights that when we believe two variables are causally related, we also believe that a steady increase in one will produce a steady, proportional change in the other. This matters because it causes people to misjudge the likelihood of rare events, which leads to poor decision making.

We can illustrate this notion with the example of learning to play piano. I would expect that the more I practice, the better my skills will become. However, the reality is that I could spend months practicing and still not be able to play a single song in that time. We often expect a linear outcome, meaning that if you practice consistently, you will consistently get better. Taleb states that we rarely see such satisfying linear outcomes in reality.

However, let’s say I have the mental toughness to continue despite no visible results. In the sixth month I have an “aha” moment and am suddenly able to play like Mozart. This is known as a tipping point, or a nonlinear outcome. We see this throughout nature as well: an ice cube will not melt at 29°F, 30°F, or 31°F, but at 32°F (0°C).
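To make the contrast concrete, here is a minimal Python sketch comparing the linear progress we expect from practice with the flat-then-jump progress we often observe. The numbers (skill points, the tipping point at month six) are entirely made up on my part, chosen only to mirror the piano example, not anything from Taleb:

```python
def linear_model(months_practiced: int) -> float:
    """The progress we *expect*: skill grows in proportion to practice."""
    return 10.0 * months_practiced

def threshold_model(months_practiced: int) -> float:
    """The progress we often *get*: nothing visible until a tipping point."""
    tipping_point = 6  # the hypothetical "aha" month
    return 0.0 if months_practiced < tipping_point else 60.0

for month in range(1, 8):
    print(f"month {month}: expected={linear_model(month):5.1f}, "
          f"observed={threshold_model(month):5.1f}")
```

The point is not the numbers but the shape: the expected curve climbs smoothly, while the observed one sits at zero until it suddenly doesn’t.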

Nonlinear outcomes are difficult to comprehend because small changes can lead to disproportionate results. We like linear outcomes because of their predictable nature and ease of understanding. Preferring whatever is easier to grasp is a form of availability bias: a natural human tendency to overweight information that comes to us more easily.

In similar fashion, Taleb says many people both undervalue and misunderstand nonlinear outcomes. This leads us to either blindly ignore rare events or be ill-prepared for them.

Making an Impression

This section leans into the human behavior concepts seen throughout the book.

Most people are hell-bent on making a good impression and will do everything to avoid making a bad one. This behavior is visible in all relationships, whether personal, business, or political.

Within the realm of business, Taleb hammers throughout the book on the role of the modern-day risk manager within large financial institutions.

From the standpoint of an institution, the existence of a risk manager has less to do with actual risk reduction than it has to do with the impression of risk reduction.

Unfortunately, this type of behavior carries through to many industries.

In the information security field, we typically call this “security theater”: implementing measures that evoke a feeling of security without actually improving it. You can see this carried out by governments on a large scale, as they usually prioritize visible measures over effective ones.

Ross Anderson illustrates this well in his book Security Engineering when discussing the measures taken after 9/11. A notable example is the distribution of security funding by the TSA. The TSA could have reduced most of the hijacking risk by spending $100m on reinforcing cockpit doors. Instead, to demonstrate a visible measure, they spent the majority of their budget (around $14.7 billion) on passenger screening.
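For a sense of scale, a quick back-of-the-envelope comparison of the two figures cited above:

```python
# Back-of-the-envelope comparison of the two spending figures cited above.
cockpit_doors = 100e6         # ~$100m to reinforce cockpit doors
passenger_screening = 14.7e9  # ~$14.7b spent on passenger screening

print(f"Screening cost ~{passenger_screening / cockpit_doors:.0f}x the door fix")  # ~147x
print(f"Doors would have been ~{cockpit_doors / passenger_screening:.1%} of it")   # ~0.7%
```

The cheap, effective measure would have cost well under one percent of what was spent on the visible one.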

The Bottom Line

Fooled by Randomness by Nassim Nicholas Taleb is a book worth reading for those interested in understanding hidden risk, misinterpretations of probability, and how randomness is nested in everything we do.