Antifragile: Things That Gain From Disorder by Nassim Taleb

If I were forced to compile a list of “top ten books I have read that help a person understand the world,” this book would certainly be on it. I have to admit this book review was a long time coming. I first read this book in 2017, and I have procrastinated on writing a review since then, mostly because I wanted to do justice to such an important compilation of insights.


While Taleb is perhaps most widely known as a financial industry commentator (and better known as the author of The Black Swan than of this book), Antifragile could most accurately be described as a modern work of practical philosophy. As such, it covers a wide breadth of topics, with application not only to finance, but also to biology, engineering, politics, culture, business, and entrepreneurship. Throughout the book Taleb approaches these subjects from fresh perspectives, while ruthlessly evaluating the empirical data to form his conclusions.

Antifragile is the fourth book in the five-volume philosophical treatise on uncertainty and risk management titled Incerto. One could certainly read only a single one of these books (or read them out of order) and still gain a great deal of insight. However, to gain the full perspective that Taleb offers, it probably helps to read all of them, ideally in the order they were published, as the ideas tend to build on previous volumes. I have read two of his three prior books, and I’m curious to find out what he has to say in his latest book Skin in the Game, which came out last year, but I chose to focus on Antifragile because the concepts unique to this volume really spoke to me and allowed me to view the world with new insight.

If you’ve read any of the negative reviews of Taleb’s writings, you should probably just ignore them. Such reviews have generally been written by people who don’t like what Taleb has to say, since Taleb often puts the media, press, bureaucrats, politicians, bankers, and academics very effectively in his crosshairs. Yet these are exactly the sorts of people who commonly review books like Taleb’s, and they likely do not want cogent critiques of their standard operating procedures to be widely disseminated and understood. Thus, they attempt to discredit Taleb.

With the book at 544 pages, Antifragile is no weekend read. I do recommend you read the book (and his others too). But if you have already read the negative reviews and aren’t sure whether you want to buy the book just yet, or if you prefer to get many of the concepts without Taleb’s controversial writing style (he tends to be pompous, but if you can tolerate and overlook that you will learn a lot), I offer a synopsis below, with some added explanations and insights. If any condescension comes across, that is a result of Taleb’s writing style and not my intended tone.

Synopsis:

One of the central but most surface-level takeaways from this book is the very concept of something being “antifragile” – a word Taleb had to coin because English has no word that accurately captures the idea. Briefly summarized, anything – a living organism, a society, a company, a technology, a financial system, etc. – could be classified in one of three ways:

1. Fragile – breaks easily under conditions of Stress, Volatility or Randomness (SVR – my term, not Taleb’s).
2. Resilient – is resistant to breaking under SVR, or able to recover quickly from SVR, but does not benefit from SVR either.
3. Antifragile – actually grows stronger and more durable as a result of SVR. This is more than mere resilience and resistance to SVR: SVR is actually beneficial and constructive.

If you think of this in mathematical terms, a fragile response to stress would result in a -1, a resilient response to stress would result in a 0, and an antifragile response to stress might result in a +1 or +2. Or, in Taleb’s words: “the opposite of positive is negative, not neutral”. A real-world example is working out in a gym, which demonstrates that human biology has a certain degree of antifragility: the stress of lifting weights makes us stronger.

However, this book expands far beyond this very simple concept, and provides numerous real world examples of both fragile systems and antifragile systems.

A key concept that ties in with many other topics in the book is “asymmetry”. A simple way to think about asymmetry is to consider the cost (or risk) to benefit ratio of something. If something has a low cost relative to a high benefit, it can be described as having positive asymmetry. Conversely, if something has a high cost relative to a low benefit, it can be described as having negative asymmetry. The concept of asymmetry contrasts with that of symmetry, where there is a correlated relationship between benefit and cost: if you increase one, you increase the other as well. The take-home lesson for life, finance/business, and society is to structure things to maximize positive asymmetry: more benefit with less cost.

One of Taleb’s favorite examples of negative asymmetry is airline travel. In airline travel, many “stressors” or errors can occur that result in a flight being late, and these stressors often compound one another. On the other hand, the best case scenario is usually that the flight is on time, or perhaps slightly early. Flights are often four hours late, but they are never four hours early, and only occasionally four minutes early. So the potential downside is large and the potential upside is small – a classic negative asymmetry. Taleb also provides examples of positive asymmetry. One of his favorites is the use of financial options, where potential losses are limited, but if you buy the right option, gains are potentially unlimited. This is a classic example of positive asymmetry: small downside risk, and a lot of potential upside opportunity.
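To make the options example concrete, here is a minimal sketch (my own illustration with made-up numbers, not something from the book) of the payoff of buying a call option: the most you can lose is the premium you paid, while the potential gain is open-ended.

```python
# Payoff at expiry for a long call option: the loss is capped at the
# premium paid, while the gain grows without bound with the underlying price.
def long_call_profit(spot, strike, premium):
    return max(spot - strike, 0.0) - premium

# Hypothetical numbers for illustration: strike of 100, premium of 5.
for spot in (60, 90, 100, 110, 150, 300):
    print(spot, long_call_profit(spot, strike=100, premium=5))
# Worst case is always -5 (the premium); the best case keeps growing.
```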

Taleb uses a simple test to determine whether something is fragile or antifragile. If unexpected random or variable events have a negative impact, it is fragile. If those same unexpected random or variable events have a potentially positive impact, it is antifragile. In the context of a business owner or manager, this could apply as follows: if your gross sales increase 10%, how much do your profits increase? Conversely, if your gross sales decrease 10%, how much do your profits decrease? If the change is the same in both directions, you have a typical linear system. If the change in profits is greater in the first case, you have (antifragile) positive asymmetry; if it is greater in the latter case, you have (fragile) negative asymmetry. Developing positive asymmetries is, in Taleb’s view, the key to antifragility and long-term success.
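Here is a rough sketch of that test in code (my own toy example with invented numbers, not Taleb’s): shock sales up and down by the same 10% and see which direction moves profits more.

```python
# A rough sketch of the fragility test above, with made-up numbers:
# shock sales up and down by 10% and compare how profits respond.
def profit(sales):
    # Hypothetical business with fixed costs and costs that rise steeply
    # as sales grow (overtime, expediting, etc.), making profit concave.
    return sales - 50_000 - 0.000002 * sales ** 2

base = 200_000
gain_from_upside = profit(base * 1.10) - profit(base)
loss_from_downside = profit(base) - profit(base * 0.90)

if gain_from_upside > loss_from_downside:
    print("convex response: (antifragile) positive asymmetry")
elif gain_from_upside < loss_from_downside:
    print("concave response: (fragile) negative asymmetry")
else:
    print("linear response")
```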

The concept of asymmetry then ties in with the concepts of nonlinearity and nonlinear systems. A linear system is one in which the output is directly proportional to the input: one unit of input leads to one unit of output. Linear systems tend to be relatively simple. Nonlinear systems, in contrast, tend to be more complex, and the relationship between input and output is often not proportional. As a result, understanding cause and effect is difficult in nonlinear systems (and humans have a poor track record of managing them). In finance, such nonlinear payoffs are said to have convexity, a concept Taleb returns to frequently, because convexity can lead to positive asymmetry.

Furthermore, in nonlinear systems one unit of input might produce ten units of output (positive asymmetry), or ten units of input might produce only one unit of output (negative asymmetry). A tricky aspect of dealing with nonlinear systems is that interventions often have unintended consequences, so we need to be very careful with such interventions. It is very difficult to set a tax or monetary policy that has only one effect. Similarly, it is very difficult to develop (or use) a pharmaceutical drug that affects only one gene without causing side effects. Examples of nonlinear systems include biological ecosystems, biological organisms, and free-market financial systems.
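As a tiny worked illustration of why convexity benefits from volatility (my own example, not from the book): with a convex response such as f(x) = x², a swinging input produces a higher average output than a steady input with the same average, which is Jensen’s inequality at work.

```python
# A convex response (f(x) = x**2) gains from volatility: the average output
# under a swinging input beats the output of the steady average input.
f = lambda x: x ** 2

steady = f(10)                    # steady input of 10 -> 100
swinging = (f(5) + f(15)) / 2     # input alternating 5 and 15 -> 125
print(steady, swinging)           # volatility raised the average output
```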

A common trait of fragile systems is that the response to stressors is nonlinear: the cumulative effect of many small shocks is less than the effect of an equivalent single large shock. Having 100 one-pound rocks dropped on you over the course of an hour won’t kill you, but having a single one-hundred-pound rock dropped on you all at once probably will.
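To put the rock example in rough numbers (purely for illustration, the harm function here is invented), suppose damage grows like the square of the shock:

```python
# A made-up harm function where damage grows like the square of the shock.
harm = lambda pounds: pounds ** 2

many_small_shocks = 100 * harm(1)   # 100 one-pound rocks -> 100 units of damage
one_big_shock = harm(100)           # one hundred-pound rock -> 10,000 units
print(many_small_shocks, one_big_shock)
```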

Another of Taleb’s central ideas is “optionality”. Options have existed as financial products for a long time, but Taleb applies the concept much more broadly. In the broadest sense, optionality means structuring your circumstances so that you can choose to act on good opportunities when they present themselves, but are not required to act when doing so would be disadvantageous or neutral. In other words, act on the good and ignore the bad. You could say that positive asymmetry is a type of optionality. A key benefit of optionality is that it does not require you to be a predictive genius about future events; it only requires enough intelligence not to make self-detrimental choices and to recognize beneficial outcomes when they occur. However, optionality does require you to tinker on a small scale, because tinkering opens up possibilities to take advantage of.

A real-world example of this could be keeping your resume posted on job sites even if you already have a good job. If you are offered a better job, you can take it; if you are offered lesser jobs, you ignore the offers. Taleb makes the interesting point that biological evolution itself makes use of the principles of optionality and positive asymmetry. When a harmful genetic mutation occurs, the organism tends to die or fails to reproduce, so the mutation is discarded. But when a mutation occurs that benefits the organism, it tends to spread through the population because the organism is more likely to survive and breed. Thus positive optionality occurs.

One of Taleb’s principal suggestions for developing antifragility is to combine the idea of positive asymmetry with building “robust” (essentially resilient) systems. This combination allows you to take advantage of opportunities when they present themselves (whether intentionally created or randomly occurring), while insulating you from disruptive events when they inevitably occur. The idea is essentially to structure optionality in your favor. He uses the analogy of a “barbell strategy” for investing, in which one invests in extremely low-risk, stable assets on one end and in risky assets with high potential upside on the other. To Taleb, investing in the middle ground takes on more risk than people realize while offering less opportunity than people believe – a fragile combination. In contrast, his barbell strategy theoretically provides antifragility. The same could be applied to your career: take a safe job during the day, and work on a risky business with lots of upside at night.
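As a minimal sketch of what a barbell allocation might look like (my own toy numbers, not Taleb’s portfolio advice and certainly not investment advice): most of the capital sits somewhere very safe, a small slice goes into high-upside bets, and the worst case is bounded by the size of that slice.

```python
# A toy barbell allocation with invented numbers (not investment advice):
# roughly 90% in very safe assets, 10% spread across high-upside bets.
portfolio = 100_000
safe = portfolio * 0.90
risky = portfolio * 0.10

# Bad year: the safe sleeve roughly holds its value, the risky sleeve goes to zero.
worst_case = safe * 1.00 + risky * 0.0    # lose at most ~10% of the portfolio
# Lucky year: the safe sleeve earns a little, one risky bet pays off 10x.
lucky_case = safe * 1.02 + risky * 10.0

print(worst_case, lucky_case)  # losses are capped, upside stays open-ended
```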

According to Taleb, the same is true of entrepreneurial risk takers. He argues that risk-taking entrepreneurs should be publicly financially supported, or at least lauded for their attempts, even if they fail. The reason is simple: while some of them may not have great ideas, some of them will, and the positive optionality they create will benefit all of society. If entrepreneurs were not willing to take those risks, those innovations or inventions would likely never come into existence. So just as in evolutionary genetics, some percentage of a population has to be sacrificed to benefit the whole.

On the opposite end of the spectrum, Taleb argues at several points in the book that the way things are commonly done in finance and government has the unintended consequence of making those systems fragile. For example, bailouts propping up failed companies may be well-intentioned attempts to ease pain in the short term, but they make the overall system more vulnerable to black swans and other catastrophic failures in the future. Taleb makes the point that depriving systems of necessary stressors is not actually a good thing and can be downright harmful. If parents remove all of the stress and uncertainty from their child’s life, the child will never develop the psychological skills to handle such circumstances.

Just as depriving a person of weight-bearing exercise (for example, astronauts in space) makes their bones and muscles frail and weak, depriving such systems of necessary self-correction means that dysfunction (or, in the case of biology, pathology) develops. Applied to financial systems: randomness, uncertainty, and volatility damage modern systems that are fundamentally fragile, because those systems have not been allowed to properly (and naturally) self-correct. We witnessed exactly that during the 2007-2009 financial crisis.

To me, Taleb’s message seems to be that for real free markets to function in a healthy way, Schumpeter’s creative destruction must be allowed to work on an individual basis, or eventually the entire system will fall into dysfunction. Taleb ridicules those who promote such nanny-state coddling as “fragilistas” – people who, in the end, create and promote fragile systems. Another trait of fragilistas is that they tend to fall into the “I/we have all the answers” trap, instead of admitting that they may not know something (which is often the better answer). Fragilistas believe they know the cause and effect of everything, even when the evidence demonstrates they have been incorrect in the past. The problem with this, according to Taleb, is that “they make you engage in policies and actions, all artificial, in which the benefits are small and visible, and the side effects are potentially severe and invisible.” Policy makers are experts at creating unintended (usually bad) consequences. And perhaps worst of all, such policy makers often do not have to face the consequences of their policies, while the rest of us do. (For example, politicians and corporate executives still get plush retirement pensions or packages regardless of whether the economy crashes or a business fails.)

What are some key traits that make a system either fragile or antifragile?

Centralization tends to make things fragile, because there are key points of failure that affect large numbers of people and other systems. Decentralization produces more robustness (if not necessarily antifragility), because localized points of failure impact fewer people and systems, and because smaller groups are more agile and flexible in implementing solutions. Finally, decentralized systems can better take advantage of optionality and positive asymmetry: with a greater number of “experiments” (decentralized activities) taking place, people can learn from one another and adopt the best practices that others have developed (positive optionality).

Similarly, our thinking can be either fragile or antifragile. According to Taleb, believing in hindsight that past phenomena could have been predicted tends to lead to fragility, because it makes us overconfident in our ability to predict the future. The central cause of this fallacy is that we tend to believe the future will be similar to the past. Taleb argues that the 20/20 benefit of hindsight is cherry-picking – it is not predictive of future foresight. He also touches on the idea that humans have a bias, or thinking error, that tends to smooth out the irregularities of life. We impose our own interpretations on events, trying to make sense of things that inherently don’t make sense. Like “seeing” a constellation in the stars that is not really there, we perceive patterns in human events that may or may not actually exist. These biases leave us vulnerable to fragility; we only become antifragile when we accept that we cannot predict the future with any accuracy. According to Taleb, future events are in reality far more random and volatile than we typically believe, plan for, or prepare for, and the antidote is to accept this and plan accordingly.

To traditional risk management professionals, future events follow a “normal distribution” in statistical probability. To Taleb, this belief is a naive fallacy: the future is full of “fat tail” events that the mainstream would traditionally consider extremely low-probability. An example of this is how most mainstream financial analysts, including the Federal Reserve Board, did not predict the housing crash of 2007. To mainstream analysts, those are one-off, unusual events. To Taleb, the unusual is the usual; it’s just that the extreme events are different each time. Since people tend to focus on the things that have hurt them in the past, they tend not to perceive other, different risks, and thus fail to see them coming.
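A small simulation (my own sketch, using a Student-t distribution as an arbitrary stand-in for “fat tails”, not anything Taleb prescribes) shows how differently the two worldviews treat extreme events:

```python
# Compare how often "five-sigma" moves show up under a thin-tailed (normal)
# model versus a fat-tailed one. The Student-t distribution with 3 degrees of
# freedom is an arbitrary illustrative stand-in for a fat-tailed process.
import math
import random

random.seed(0)
N = 200_000

def student_t(df):
    # One Student-t draw: a normal divided by sqrt(chi-squared / df).
    z = random.gauss(0, 1)
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(df))
    return z / math.sqrt(chi2 / df)

normal_extremes = sum(abs(random.gauss(0, 1)) > 5 for _ in range(N))
fat_tail_extremes = sum(abs(student_t(3)) > 5 for _ in range(N))

print(normal_extremes, fat_tail_extremes)
# Under the normal model, five-sigma days essentially never happen;
# under the fat-tailed model they show up again and again.
```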

In Taleb’s view, it is better to be prepared for unpredictable and extreme events, since the downside of doing so is relatively small (an opportunity cost), while the upside of insulating oneself from such events is large. In other words, if the worst case comes to pass we can weather it, and if things turn out better than that, great, we can benefit from that too, using optionality. For example, a person living in Houston could make thorough preparations for an offshore hurricane and also buy discounted tickets for an Astros game. If the hurricane hits Houston, she is ready (and may be able to get a ticket refund). If the hurricane does not hit Houston, she can enjoy the ball game and still has the preparations in place for the next hurricane. Such strategies maximize opportunities while minimizing risks.

One of Taleb’s lesser lessons in the book is that antifragility consists more of avoiding negative pitfalls than of frequently being correct. You only have to get things right occasionally to be successful – if you can avoid destructive mistakes most or all of the rest of the time.

Taleb puts considerable weight on things that have been around for a long time (particularly literature and ancient wisdom) because they have a proven track record. To Taleb, time serves the same function as stress or volatility: time weeds out the fragile, and what is left must be antifragile. Therefore, if something has been around for a long time, it will likely continue to be around for a long time. On the flip side, if something is new, it probably will not last, because it is probably fragile. In other words, his default assumption is fragility until proven otherwise. He also makes the point that things built in the past were “overbuilt” – they have lasted a long time, but they were not efficient at the time. In modern times we tend to build things that are just barely strong enough to meet their designed purpose. This makes them more efficient now, while ensuring that they will likely break in the longer term, and are thus fragile.

Taleb brings this up to emphasize that overbuilding and redundancy make things resilient to stress, while building only for efficiency makes them fragile. For example, decades ago retail stores had large back rooms that held excess inventory. That was not efficient, but it did provide some resilience against supply chain shocks. Today’s retail stores use just-in-time delivery systems, which are much more efficient but also much more fragile to disruption. Redundancy may seem like an inefficient waste if nothing unusual happens. Except that unusual things happen with usual regularity. That is why insurance was invented. In nature, redundancy is a central risk management strategy.

Taleb spends some time (not nearly enough IMHO, hence his follow-up book Skin in the Game) commenting on the fact that in our modern society, particularly in finance and politics, actions have become divorced from consequences, and incentives have become distorted and perverted. This has created many moral hazards and imposed negative consequences on people who do not deserve them. For example, Wall Street executives took on tremendous risks that benefited them and their firms financially when times were good, but when the 2008 financial crisis hit, they were let off the hook for the consequences of those risks and were bailed out by the US federal government and the Federal Reserve. Thus the profits were privatized to the banks, and the costs were passed on to those who did not deserve them: US taxpayers. Taleb goes back to the “old fashioned” idea of holding people accountable for the consequences of their actions. Or, to put it in Taleb’s vernacular, participants need to have “skin in the game”. If they do not, they can set others up for failures from which they themselves benefit. Along these lines, Taleb points out a very common fallacy among fragilista elitists: that just because something is legal, it is also moral. This is unfortunately not true, since laws are often written by special interest lobbyist criminals for their own benefit.

One of my favorite passages from the book is quoted below, because it directly applies to what Nature’s Complement is doing. While the quote is directed at the pharmaceutical industry, it could just as well be about the personal care product industry: “[S]mall companies and artisans tend to sell us healthy products, ones that seem naturally and spontaneously needed; larger companies — including pharmaceutical giants — are likely to be in the business of producing wholesale iatrogenics, [things that harm health] taking our money, and then, to add insult to injury, hijacking the state thanks to their army of lobbyists. Further, anything that requires marketing appears to carry such side effects. You certainly need an advertising apparatus to convince people that Coke brings them ‘happiness’— and it works.” The same is true of personal care products and cosmetics that harm people’s health – they too require large marketing budgets, full of “Jedi mind tricks”, to convince people to use them.

Here is a roundup of some secondary useful or amusing ideas to look for in the book, which I don’t have room to cover in this synopsis:

  • The Streisand Effect – Trying to make an idea go away will make it more prevalent
  • The Procrustean Bed – Making the data fit your model instead of making your model fit the data
  • The Turkey Problem – Why black swans are so difficult to spot by those only crunching statistics (without broader perspective)
  • Buridan’s Donkey – Randomness can help us make decisions
  • Stoicism – Makes us more antifragile
  • Domain Dependence – Lessons learned in one area may, or may not, transfer to other areas
  • Limitations of language – Language can limit our ability to capture and express complex ideas
  • The Agency Problem – Incentives need to be aligned, or problems inevitably crop up

Finally (for the purpose of this synopsis, since this book contains a lot more), four of Taleb’s central conclusions could be summed up this way:

1. It is incorrect to think that randomness is inherently risky and therefore bad.
2. It is incorrect to think that one can eliminate randomness by trying to eliminate randomness. Attempting to do so only leads to fragility and even greater levels of randomness.
3. It is more important to think in terms of the ratio of upside benefit to potential downside risk/cost than it is to think in terms of true vs. false. The former is how successful business people and investors think; the latter is how scientists and philosophers traditionally think.
4. Many problems can be solved by removing things (“via negativa”) rather than by adding things. For example, if you are sick, don’t add pharmaceuticals; instead, stop eating unhealthy “food”. Or, if the economy is weak, stop interfering with it, instead of interfering more while trying to “fix” it.

There is so much more to this book and the valuable ideas it brings to the table that I cannot possibly do it justice in a brief book review. One could quibble that the book could be better organized, and perhaps so. But in my view the main disadvantage of reading this book is that many people will probably find it very dense, slow reading. The concepts Taleb covers tend to be both abstract and complex, and therefore require a high level of concentration to fully comprehend and integrate into one’s thinking and actions. Yet the intellectual payoff is well worth the work required.

For Health,
Rob

Nature's Complement is a participant in the Amazon Services LLC Associates Program, an affiliate advertising program. If you purchase products on Amazon through any of our affiliate links, we get a small percentage of the transaction, at no extra cost to you. We spend a lot of time writing the articles on this site, and all this information is provided free of charge. When you use our affiliate links, you support the writing you enjoy without necessarily buying our products. (However we would appreciate if you would do that too!) Thank you for helping to support our work, however you choose to do so.

These statements have not been evaluated by the Food and Drug Administration. This information and/or products are not intended to diagnose, treat, cure or prevent any disease.