# Notes: Black Box Thinking

Just a quick FYI: this post may contain some affiliate links.

Read it for yourself on Amazon


## Summary

Black box thinking is about the culture of failure and, more importantly, why embracing failure is essential for an industry, a company, or an individual to succeed. Entities that fear failure often fail to learn from their mistakes and are quick to cover them up or point fingers. The book focuses on a few fields that handle failure well, such as the airline industry, and others that don’t, such as healthcare. These examples are used very effectively to illustrate how we can all learn to embrace failure in our own lives to become more creative, adopt a growth mindset, and win.

## Top Lessons Learned

  1. Failure is an essential part of the creative process; without it, you’ll likely never take the risks necessary to do something truly novel.
  2. A huge part of succeeding via failure is perseverance through the ups and downs.
  3. Failing quickly (but in small ways) and adapting isn’t always the best approach; sometimes attempting something grand and then failing is useful. It depends very much on the context and the problem being solved: an evolutionary approach versus outright invention.
  4. Connecting dots (or pulling ideas from other industries and fields) can be an amazingly effective way to solve problems (think Dyson vacuums and sawmills). Nature is a fantastic place to steal ideas from because of the simple elegance of its solutions.
  5. In a failure culture it’s essential that nobody be blamed or held personally accountable for failures; they must be treated as learning opportunities, or else you’ll start to see the healthcare effect take hold (finger-pointing and no learning).

## General Notes

Success in almost all settings is largely determined by how one responds to failure, which is inevitable.

  • The airline industry rigorously investigates every failure, changing its procedures and evolving from each one.
  • The healthcare industry has a great many preventable deaths and serious complications, yet it simply seems to accept them.

This is the most important step on the road to a high-performance revolution: increasing the speed of development in human activity and transforming those areas that have been left behind. Only by redefining failure will we unleash progress, creativity, and resilience.

Within this book, a closed loop is one where failure doesn’t lead to progress because information about errors and weaknesses is misinterpreted or ignored; an open loop does lead to progress because the feedback is rationally acted upon.

In this world, at the level of systemic complexity we operate in, success can only happen when we admit our mistakes, learn from them, and create a climate where it is, in a certain sense, “safe” to fail.

Attention, it turns out, is a scarce resource: if you focus on one thing, you will lose awareness of other things.

The mnemonic that has been used to improve the assertiveness of junior members of the crew in aviation is called P.A.C.E. (Probe, Alert, Challenge, Emergency).

  • Captains, who for years had been regarded as big chiefs, were taught to listen, acknowledge instructions, and clarify ambiguity.
  • Checklists were also very useful tools for solving the activation problem and giving everyone a level playing field when reviewing a situation.

Failure is rich in learning opportunities for a simple reason: in many of its guises, it represents a violation of expectation.

Black box thinking is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It is about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.

The first is that you have to take into account all the data, including the data you cannot immediately see, if you are going to learn from adverse incidents. But it also emphasizes that learning from failure is not always easy, even in conceptual terms, let alone emotional terms. It takes careful thought and a willingness to pierce through the surface assumptions.

"Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died...We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them."

Feedback, when delayed, is considerably less effective in improving intuitive judgment.

The adoption rate within a culture largely determines how quickly things can change. How much information is collected and absorbed? How fast can feedback be connected back to the system and changes made?

Many systems resist change.

  • When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.
  • The more we have riding on our judgments, the more we are likely to manipulate any new evidence that calls them into question.

It’s important that systems actually get measured and analyzed; otherwise they can never be assessed for failure. More importantly, nothing can ever be learned and nothing will ever change.

Setting up tests and failing quickly is essential to iterative learning and to dealing with an ever-changing, complex world. Building products, crafting policy, and many other endeavors can be handled this way.

It’s important that a culture of safe failure be embraced in testing, though this is often not the case.

Testing a hypothesis against a control group is a necessary step in determining whether something actually works; the book analyzes the Scared Straight program as an example (a rough sketch of the idea follows the bullets below).

  • After all, how can you learn from failure if you are not sure you have actually failed?
  • Or, to put it in the language of the last chapter, how can you drive evolution without a clear selection mechanism?
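The book makes this point with stories rather than code, but as an illustration, here is a minimal Python sketch of what a control group buys you: a selection mechanism that separates a real effect from noise. All numbers and group sizes here are invented for the example, not taken from the text.

```python
import math
import random

random.seed(42)

def simulate_group(n, reoffend_rate):
    """Simulate n participants; each reoffends with the given probability."""
    return [1 if random.random() < reoffend_rate else 0 for _ in range(n)]

# Hypothetical rates: the intervention is actually slightly harmful,
# which is roughly what evaluations of Scared Straight suggested.
control = simulate_group(500, reoffend_rate=0.30)
treatment = simulate_group(500, reoffend_rate=0.35)

p_c = sum(control) / len(control)
p_t = sum(treatment) / len(treatment)

# Two-proportion z-test: is the observed difference larger than chance
# alone would plausibly produce?
p_pool = (sum(control) + sum(treatment)) / (len(control) + len(treatment))
se = math.sqrt(p_pool * (1 - p_pool) * (1 / len(control) + 1 / len(treatment)))
z = (p_t - p_c) / se

print(f"control reoffend rate:   {p_c:.2%}")
print(f"treatment reoffend rate: {p_t:.2%}")
print(f"z-statistic: {z:.2f} (|z| > 1.96 is significant at the 5% level)")
```

Without the control arm, a 35% reoffend rate on its own tells you nothing; only the comparison reveals whether the program helped, hurt, or did nothing.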

Much of the sweeping innovation process is about connecting seemingly disparate technologies to solve a problem.

Creativity often happens in one of two settings:

  • by turning off the brain and letting the subconscious do its thing.
  • through the dissent and pushing of others, as in a brainstorming session.

Winning requires both innovation and discipline: the imagination to see the big picture and the focus to perceive the very small. “The great task, rarely achieved, is to blend creative intensity with relentless discipline so as to amplify the creativity rather than destroy it,” Collins writes. “When you marry operating excellence with innovation, you multiply the value of your creativity.”

If our first reaction is to assume that the person closest to a mistake has been negligent or malign, then blame will flow freely and the anticipation of blame will cause people to cover up their mistakes. But if our first reaction is to regard error as a learning opportunity, then we will be motivated to investigate what really happened.

“True ignorance is not the absence of knowledge, but the refusal to acquire it.”

People with a Growth Mindset tend to believe that their most basic abilities can be developed through hard work. They do not think that innate intelligence is irrelevant, but they believe they can become smarter through persistence and dedication.

  • If we drop out when we encounter problems, progress is prevented, no matter how talented we are. If we interpret difficulties as indictments of who we are, rather than as pathways to progress, we will run a mile from failure. Grit, then, is strongly related to the Growth Mindset; it is about the way we conceptualize success and failure.
  • Finding ways to encourage a Growth Mindset through organizational psychology will have powerful benefits.