Humans are wired by nature to search for patterns. If your bike broke on the way to the office, you would get off the bike and look for a pattern. You would say, “What is the one thing that is different from yesterday?”
The challenge with this instinct is that when two things in a row go wrong, we get flummoxed. What happens when four unrelated things go wrong? Enter cognitive bias and the need for a generalist perspective to discover causation through critical thinking. Cognitive bias affects all human beings by definition: it refers to subconscious inclinations that skew judgment and decision-making, undermining our objectivity.
I’ve seen this scenario play out many times in AV systems integration—specifically on enterprise networks—with errors made troubleshooting simple problems and more time required to find a solution. The result is always the same: very frustrated end users.
Considering all of the technical knowledge required for systems integration today, it’s important to recognize how human cognitive biases impact testing and verification. As AV systems have become more complex on enterprise networks, with so many different technical people sharing ownership of the quality of the system, subconscious prejudices are even more in play.
In addition to audiovisual experts, the process involves IT directors, network security, construction managers, owners’ representatives, and other end user stakeholders. Each trade is challenged to provide an integrated system that satisfies the vision of the end user. When bringing these systems online or troubleshooting with these necessary cross-functional teams, it is crucial to engage specialists who can also be generalists—these are people who can back up to 50,000 feet and think about what challenges mean holistically. The more micro-focused people are, the more cognitive biases are likely to be triggered.
Critical thinking is the key to triumphing over cognitive bias. While this has always been true, these days, troubleshooters must also be able to communicate well with others, explain their findings clearly and concisely, and remain open to facing unknowns. The process pushes us past our comfort zones as experts.
While critical thinking is a learned experience that must be practiced over time, most people are natural critical thinkers in their own fiefdoms when left to their own devices. That all changes when we get in a room with others, where exploration and discovery tend to get lost in the tussle of competing business interests.
AV integration on an enterprise network illustrates this difficulty well. For example, consider the troubleshooting process that took place when the sound system simply stopped working for a sports team on opening day. The network passes no traffic. The IT manager tells us our “stuff” is broken—the ostrich effect, or optimism bias—wishing the problem away. We sit down with IT and start watching the IP addresses flash by, until one of my guys sees a number that doesn’t match the VLAN that the sound system is on. There’s a rogue device plugged into our VLAN. The IT manager doesn’t see why it would be a problem, but lo and behold, someone unplugs that device and boom, the sound system comes back on.
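The check that caught the rogue device—spotting an address that doesn’t belong to the sound system’s VLAN subnet—can be sketched in a few lines of Python. This is a minimal illustration, not a description of the actual tooling used; the subnet and addresses below are hypothetical:

```python
import ipaddress

# Hypothetical subnet assigned to the AV VLAN (illustrative value).
AV_SUBNET = ipaddress.ip_network("10.20.30.0/24")

# Source addresses observed in traffic on that VLAN (illustrative values).
observed_sources = [
    "10.20.30.11",   # DSP
    "10.20.30.12",   # amplifier
    "192.168.1.54",  # device that does not belong on this VLAN
    "10.20.30.13",   # control processor
]

def find_rogue_devices(addresses, subnet):
    """Return any addresses that fall outside the expected VLAN subnet."""
    return [a for a in addresses if ipaddress.ip_address(a) not in subnet]

for addr in find_rogue_devices(observed_sources, AV_SUBNET):
    print(f"Rogue device on AV VLAN: {addr}")
```

The point of automating even a simple membership check like this is that it replaces a human’s pattern-matching—which is exactly where cognitive bias creeps in—with a measurable observation everyone in the room can agree on.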
I recall one troubleshooting meeting where the construction manager wondered why the IT manager was in the room. I had to counter his blind spot bias by explaining that we needed all the stakeholders to present the facts and the observations from everyone’s perspective.
Biases and preconceptions make it difficult to work in a team environment. You need to help others be good critical thinkers, and the way to do that is to be a good critical thinker yourself. The first step is self-awareness, and leadership is key. Then comes the question: do you have data to back up your position?
We want as much data as possible when we’re solving a problem. At the same time, not all data can be weighted the same. You have to discriminate noise from meaningful facts. Your opinion is extraordinarily important, as long as it’s accompanied by data. Without a clear, measurable observation, your opinion is just an opinion.
It’s interesting to consider how artificial intelligence could be the future of troubleshooting extraordinarily complex AV systems. But until that’s a reality, we must face the fact that we don’t know what the result is going to be. All we can do is try our best and apply our processes. Not only do you need to be an expert, you also need to be able to clearly and concisely communicate your part of the puzzle to others. You must be relentless in your application of critical thinking and provide careful, patient explanation.
Define: Critical Thinking
We might think we “know” what critical thinking means, but the idea is often interpreted differently by different people. While it might seem obvious, the fact that we’re human beings makes it much more complicated. When all is said and done, critical thinking is a learned experience that involves the following steps:
1. Analyzing: Break down a problem or challenge into pieces to identify the various reasoning, purposes, and correlations.
2. Applying Standards: Evaluate the challenge based on established benchmarks or processes.
3. Discriminating: Compare and contrast the various elements while categorizing according to priority.
4. Information Gathering: Research pertinent sources of information and collect data.
5. Logical Reasoning: Draw conclusions from all of the above steps.
6. Action Planning: Establish solutions and any varying scenarios that may result.
Common Types of Cognitive Bias
Finding or creating meaningless patterns is a major obstacle to accurate and consistent problem solving. Cognitive biases are unconscious by definition and require insight to even recognize their influence on everyday events. When troubleshooting AV systems with a diverse group of stakeholders, beware of these common cognitive biases.