
The Black Swan, by Taleb, and Electronics

The Black Swan is a best-selling book by Nassim Taleb. Its greatest virtue is that it makes the reader see the world in a different, much more insightful way, which makes it a must-read.

Although the book focuses on philosophy and economics, it can be read from the perspective of any field or profession. This article summarizes its most important concepts and reflects on them from the point of view of electronic engineering (in a broad sense).

1. The Black Swan

In line with the title, the black swan is the fundamental concept around which the book is built.

Reducing it to its minimum expression, so that the potential reader can still enjoy the book to the fullest: a black swan is an unexpected event of great impact, which so-called analysts and intellectuals weave into history, in a Platonic way, only after the fact.

For example, the attack on the Twin Towers is a negative black swan, because of its terrible consequences. On the other hand, the explosion of artificial intelligence (AI) technologies is a positive black swan (except perhaps for those who had enough knowledge to know it was going to happen sooner or later).

Its name comes from the analogy with the color of swans. Typically, any swan found on a lake is white. However, there are also black swans, although they are rarer and much more difficult to find.

Around the idea of the black swan, there are other concepts that are thoroughly developed in the book, which are simplified below.

1.1 Confirmation Bias

The human tendency to fit the data we observe to beliefs we have already established, even when a more detailed analysis of that data could refute a large part of our original assumptions.

This bias also commonly works in the opposite direction. Besides attributing data to our original beliefs, we humans tend to seek out only the information that confirms our hypotheses; for example, when we read the press closest to our political and economic ideology and never analyze the media of the opposite persuasion.

1.2 Narrative Fallacy

The Platonization of known events: the artificial construction of a sequence of steps that seems to lead to the event in a logical way.

The cause-and-effect relationships thus established, often by prestigious academics and analysts in suits and ties, seem to indicate that the event could have been foreseen. In practice, however, the very essence of the black swan makes it unpredictable.

1.3 Silent Evidence

Silent evidence is evidence that remains hidden, or simply goes unnoticed, allowing other evidence of the opposite sign to impose a pseudo-truth that may be absolutely false.

For example, those who prayed and were saved from a storm at sea can tell their story: praying to the gods lets you survive a shipwreck. But who can hear the opinion of those who prayed and now lie at the bottom of the sea?

1.4 Inability to Predict

Basic education, as well as more advanced academic training, tends to make us believe that everything that happens can be predicted. We consider ourselves capable of establishing cause-and-effect relationships for everything, which implicitly leads us to believe that we can model everything around us.

In practice, however, the randomness of the events around us is far more complex. Many events, for example, are influenced by human behavior, which is far from modelable. Without going into depth, as an anecdote: there are humans who simultaneously prefer apples to pears, pears to oranges, and oranges to apples!

2. The Black Swan in Electronics

In general, as a first approximation, one might think that in the day-to-day design and development of electronics a black swan is the result of a lack of knowledge, or the consequence of not following an adequate strategy, for example by introducing requirements late. In that sense, a black swan in an electronic design would seem to be avoidable through good practices.

Below, we will look at examples that fall somewhere between those two extremes, pure chance and avoidable error, and that all of us who work in this industry have experienced to a greater or lesser extent.

2.1 Confirmation Bias in Electronics

When debugging a problem, whether in hardware or software, it is common to suspect a particular cause. Confirmation bias occurs when we associate any experimental manifestation with that a priori cause, even when the same manifestation may be consistent with other explanations.

On many occasions this attitude is related to how easy it is to test the potential solution. If we are debugging software, any result that seems to support the initial hypothesis can be checked relatively quickly with a change in the code. In hardware, however, manipulating a board for a measurement may mean many hours of work on a prototype. In any case, it is not uncommon for us to get carried away by first impressions, instead of thinking in an absolutely skeptical way, and to waste time on tests that could have been avoided.

2.2 Narrative Fallacy in Electronics

Although it may seem related to the previous point, there are cases in which the narrative fallacy applies individually and in isolation.

These are usually cases in which a problem seems to have been debugged and solved definitively. Based on the symptoms and the actions taken, a whole story is built, with logical steps and reasons confirming that the problem has been identified and fixed… only to discover some time later that it continues to manifest itself.

An example from this author's own experience occurred in a VHDL design that exhibited metastability problems. After several modifications that seemed to solve the problem logically, subsequent syntheses produced the same event again. In other words, the whole logical argument I had constructed fell apart. I finally discovered that there was a flip-flop whose reset had been coded asynchronously, so it was not being released in sync with the rest of the flip-flops.
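The fix in a case like that corresponds to the classic reset-synchronizer pattern: assert the reset asynchronously, but release it through two cascaded flip-flops so that every register leaves reset on the same clock edge. A minimal sketch, assuming an active-low reset (the entity and signal names are illustrative, not from the original design):

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity reset_sync is
  port (
    clk       : in  std_logic;
    arst_n    : in  std_logic;  -- asynchronous reset input, active low
    rst_n_out : out std_logic   -- reset with synchronized release
  );
end entity;

architecture rtl of reset_sync is
  signal ff : std_logic_vector(1 downto 0) := (others => '0');
begin
  process (clk, arst_n)
  begin
    if arst_n = '0' then
      ff <= (others => '0');    -- assert immediately, asynchronously
    elsif rising_edge(clk) then
      ff <= ff(0) & '1';        -- two-stage shift: release synchronously
    end if;
  end process;
  rst_n_out <= ff(1);
end architecture;
```

Distributing rst_n_out to the resets of the remaining flip-flops avoids the situation described above, where one register left reset out of step with the others.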

2.3 Silent Evidence in Electronics

Silent evidence is constantly being produced in electronic design. There are so many variables associated with a product that it is impossible to think from the perspective of each and every one of them. Here are a couple of examples that I have experienced during my career.

On one occasion we used an integrated circuit (IC) in an internal configuration well documented by the manufacturer, but in an application that was not exactly the one described in the datasheet. In practice, the circuit did not work properly: it locked up, and the resets it required only recovered it intermittently. It was one of those situations where a first-rate chip seems not to work. We saw forums filled with questions about the IC, many of them unresolved, which invited several team members to think that the part had a design problem. No one seemed to appreciate that the device was being used in hundreds of thousands of designs and was selling in the millions. None of those designers were going to bother posting on a forum, since the chip worked properly in their application. We were the ones trying to push the device beyond its intended scope of application.

On another occasion I had a transceiver running at a given speed. At some point I had to halve the data rate. Dividing by two as the manufacturer explained did not work. After many headaches and doubts about the functionality of the chip, I discovered two reserved registers, not documented by the manufacturer, that also had to be modified. Their values could only be seen by generating a single, non-modifiable data-rate configuration with a tool provided by the manufacturer and well hidden on their website.

2.4 Inability to Predict in Electronics

This section covers all those bizarre phenomena that sometimes make you wonder why you work in this sector, in R&D. Wouldn't it be better to become a salesperson? Perhaps a manager? Or maybe a sheep herder in the beautiful lands of Ireland?

Here is a short list of such events from my own career. Events with a clear a posteriori explanation, but which are never contemplated in the idealized contexts narrated in technical books. Reality always outdoes fiction:

  • We had a prototype based on a device with a somewhat unusual package. Nothing that made us suspicious a priori. However, there was no way to make it work; all the measurements were bizarre and illogical. Finally, we discovered that the assembler had managed to lodge a 0402 capacitor in a cavity under the package.
  • In a fully refined and functional design, long-duration tests failed with completely random, isolated events. The most delusional explanations had been put on the table; the more elaborate, the higher the status of the engineer proposing them. Finally an intern discovered, by chance, that the failure occurred whenever a nearby soldering iron switched itself on or off. A faulty, poorly isolated power supply on the workbench did the rest.
  • In a not particularly complex RF design, an internal regulator with no shutdown pin was shutting down at random. Some of the frequencies being demodulated on the board did not seem to agree with the regulator's internal control circuitry.
  • One of those FPGA manufacturers worth a fortune on the stock market gave us advance information about one of their devices. We started designing the board, getting so far ahead of ourselves that our bonus seemed more assured than ever. One day their representative showed up at the office with bad news: in the end, not all the transceivers were going to run as fast as we had been told. Much of our design (and our bonus) went to waste. The guy said heads had rolled. We replied that we were sorry for their families. No hard feelings.

3. Good Praxis against the Black Swan

It is clear, then, that in electronics we are not protected from black swans either. At least we can list some good practices to make ourselves as little vulnerable as possible:

  • Faced with an unexpected problem, when we set out to experiment and take data, it is better to set aside our initial, hastily formed hypotheses. Collect data and measurements, the more the better, as systematically as possible. Even in seemingly easy situations it is best to act this way, and then process the data with a cool, unbiased mind. Let us protect ourselves from confirmation bias and the narrative fallacy.
  • Purchase evaluation boards, as many as necessary to reproduce your future design as faithfully as possible.
  • Stock multiple units of each item, whether discrete components, electronic boards, or development environments.
  • Humility, humility, and more humility. Always be cautious and prudent. Be skeptical of your own abilities, and do not take anything for granted unless you have thought it through thoroughly and without bias.


