Computer science explains why COVID deniers serve a purpose
And why some biases may be good for society
As a society, we struggle to come to a consensus on even the most obvious matters. There is a striking similarity between how we make decisions as a society and a popular computer science algorithm. And we can use that to explain our discourse, including why our flawed mental biases are essential parts of a highly effective truth-seeking engine.
A quick computer science lesson
The algorithm I am referring to is called particle filtering. In short, it works by making random guesses and repeatedly refining those guesses using new data, until the right answer emerges.
Imagine a robot that needs to locate itself on a map, but all it has is a set of sonar sensors that measure its distance to nearby walls. Particle filtering starts by creating many random guesses (called particles) of the robot's location, and assessing each guess against the sonar readings. Using the guess particles that closely match the sonar measurements, the algorithm creates a new set of guess particles in their proximity, and then assesses them against new sonar data. After a few iterations of this, the robot's location emerges where the best particles naturally gather. The visualization below shows particles (red), sonar measurements (blue) and the estimated robot location (green), over a number of iterations as the robot moves around.
There are many other search algorithms out there, but particle filtering stands out for its effectiveness. By constantly injecting noisy guesses into its search, it is far less likely to get stuck on a suboptimal answer (a local optimum) and tends to converge on the best one (the global optimum), even in highly complex search spaces. It also does so efficiently, assessing only a small fraction of the search space.
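The guess-weigh-resample-jitter loop described above can be sketched in a few lines. This is a hypothetical toy example, not the robot/sonar system from the text: here the hidden quantity is a single 1-D position, and the "sonar" is a noisy distance reading, with all constants chosen for illustration.

```python
import math
import random

random.seed(0)

TRUE_POS = 7.0   # hidden position we want to recover (assumed for the demo)
NOISE = 0.5      # sensor noise (standard deviation)

def measure():
    # Noisy distance reading to a wall at x = 0.
    return random.gauss(TRUE_POS, NOISE)

def likelihood(particle, reading):
    # How well a guess explains the reading (unnormalized Gaussian).
    return math.exp(-((particle - reading) ** 2) / (2 * NOISE ** 2))

# 1) Start with random guesses spread across the whole search space.
particles = [random.uniform(0.0, 20.0) for _ in range(500)]

for _ in range(20):
    reading = measure()
    # 2) Weight each particle by how well it matches the measurement.
    weights = [likelihood(p, reading) for p in particles]
    # 3) Resample: good particles survive and get duplicated...
    particles = random.choices(particles, weights=weights, k=len(particles))
    # 4) ...then jittered, so the next generation explores nearby guesses.
    particles = [p + random.gauss(0.0, 0.2) for p in particles]

# The particles cluster around the true position.
estimate = sum(particles) / len(particles)
```

Note the two ingredients the article leans on: the random spread in step 1 (divergence) and the fact that poorly scoring particles are down-weighted rather than instantly discarded, surviving a few more rounds via resampling noise (persistence).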
We do not live in a simulation; we live in a particle filter!
Whether companies are competing to make the best product or governments are searching for the most effective pandemic policy, our discourse is akin to particle filtering. Lots of different people or groups try out divergent ideas, and then new people learn from the previous generations, discard the bad ideas and iterate around the good ones.
There are two important human qualities that make us effective at particle filtering: 1) Our ideas are often very different (one could say random), and 2) we are persistent with our ideas (often to a fault).
Just like those noisy particles, it is important that we are divergent thinkers. Given the immense variability in our backgrounds and experiences, as well as the idiosyncrasies of our minds, the ideas we generate often seem to take on a random quality. In much the same way that particle filters search an entire space of answers, collectively we explore a vast space of potential solutions.
Also important is that, just like the particles, we tend to stay put in our positions until new, better alternatives arrive nearby. Thanks to what we often dismiss as mental biases, we stick with our answers and hold them with more confidence than they deserve. Consider religion: nearly 80 percent of people die in the same faith they were born into! Just as particles stay in place regardless of contrary evidence, a thorough discovery process requires that enough people persist across the spectrum of ideas.
The alternative, where everyone quickly discards nonconforming ideas, can lead to premature convergence on suboptimal answers, especially in the face of data that is often incomplete or flawed and only corrected over long periods of time. If we held ideas more loosely, we would not expend sufficient time and energy to evaluate them fully, including the effort it takes to unearth faulty or incomplete data.
What particle filtering teaches us about ourselves
"What does this have to do with COVID deniers," you might ask.
It is hard to believe that factions of society that deny what seems so obvious may in fact be serving an important purpose. But consider that if we eliminate either noise or persistence, particle filtering breaks! It gets stuck in local optima. In a similar way, without divergent ideas and relentless persistence despite contrary evidence, we would end up with suboptimal answers to our problems.
We can call some people science deniers or conspiracy theorists, and we may be correct. But a better way is to think of them as natural by-products of a critical discovery process. As frustrating as they can be, a society that suppresses such voices will inevitably suppress divergence in its discourse. Alternatively, by fostering more tolerance for noise (i.e., fewer laws, norms or judgments against opposing views), we can make our particle filter faster and more efficient.
The good news is that we have continued to move in this direction historically. The iteration cycle of ideas used to be measured in generations, with societies actively resisting divergent thinkers with every force at their disposal. But from the advent of the scientific method centuries ago, to the boom of venture capital-funded startups today that pride themselves on contrarian thinking and iterate on ideas competitively, our particle filtering in search of optimal solutions has become supercharged.