The world is a noisy place, though more in the sense of statistical and logical noise than the audible variety. The book "Noise: A Flaw in Human Judgment" is dedicated to this subject, but in brief, "noise" means the variability in judgments made by different experts presented with the exact same information, or by the same expert presented with that information at different points in time.
One of the first cases they pointed to was a 43% average variability between judgments, costing a particular insurance company something on the order of hundreds of millions of dollars. In other industries they found the average variability to be even higher, at roughly 55%. Keep in mind that the market applies some selection pressure to remove the worst performers in many industries, but the same isn't necessarily true of governments, which face no true "competition" and where the cost of "switching brands" is revolution.
The variability of decisions, noise, can be attributed to different combinations and potencies of cognitive bias being expressed, both across individuals and within the same individual over time. This inconsistency causes levels of unfairness that have proven quite dramatic when measured over the past 50 years. It also causes serious financial damage and major errors in the judgment of executives and policymakers. All of this is the norm.
In this norm, humans are easily influenced at a subconscious level, with circumstantial factors nudging their decisions in one direction or another: that day's weather, whether a local sports team recently won or lost, or what someone last ate and how long ago. Numerous studies have demonstrated these factors, but a few also emerged to showcase two ways of reducing noise.
In one, a village participated in an estimation experiment, and while no single person guessed the correct answer, the average (mean) of all their answers was accurate to within a fraction of a percent: a true value of 1198 against a 1200 average. This was termed the "Wisdom of Crowds", which later became known as Collective Intelligence. Two other teams took this even further, demonstrating three different methods that allowed individuals to improve their own accuracy without even seeking a second opinion, showing 10%, 33%, and 50% improvements respectively.
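The statistical effect behind the village experiment can be illustrated with a minimal simulation. Everything here is an assumption for the sketch: individual guesses are modeled as unbiased but noisy (normally distributed error with an invented standard deviation of 80, across 800 guessers), which is the textbook condition under which averaging cancels independent errors.

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 1198  # the quantity the crowd is estimating

# Assumption: each of 800 independent guesses is unbiased but noisy
# (normal error, sd = 80 -- both figures invented for illustration).
guesses = [random.gauss(TRUE_VALUE, 80) for _ in range(800)]

crowd_estimate = statistics.mean(guesses)

# Compare a typical individual's error against the crowd's error.
mean_individual_error = statistics.mean(abs(g - TRUE_VALUE) for g in guesses)
crowd_error = abs(crowd_estimate - TRUE_VALUE)

print(f"crowd estimate:        {crowd_estimate:.1f}")
print(f"mean individual error: {mean_individual_error:.1f}")
print(f"crowd error:           {crowd_error:.1f}")
```

Under these assumptions the crowd's error shrinks roughly with the square root of the number of guessers, which is why no individual needs to be accurate for the aggregate to be.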
This dire norm of high noise is, of course, an opportunity for dramatic improvement. In particular, the noise itself is a resource which, when properly processed through collective systems, may provide humanity with greater levels of debiasing than were previously possible. Much as gasoline serves a combustion engine far better than it would as an ingredient in your coffee, how that resource is utilized matters a great deal.
Collective intelligence systems already offer a means of taking a few weak and noisy performers and producing greater accuracy than any one individual demonstrates alone. Even systems with no cognitive architecture or memory of their own, which merely persuaded humans to function like a beehive, proved adept at this.
The order of processing also matters in the best solution architectures: in this case, generating collective intelligence to reach that greatly improved accuracy may be followed by the best of those three self-reflection methods for further improvement. The best performer, at a 50% improvement, was a four-question process in which an individual attempts to disprove or qualify their own conclusion.
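As a toy illustration of this ordering, the two stages can be composed in a simulation. Every figure here is invented or borrowed loosely from the text: individual estimates get an arbitrary error spread, and the four-question self-critique is modeled naively as halving whatever error remains after aggregation (the 50% figure above).

```python
import random
import statistics

random.seed(3)

TRUE = 1000.0

# Stage 1: collective intelligence -- average 400 noisy estimates
# (individual error sd = 60 is an invented figure for the sketch).
raw = [random.gauss(TRUE, 60) for _ in range(400)]
crowd = statistics.mean(raw)

# Stage 2: self-reflection -- modeled here as halving the residual
# error, per the 50% improvement reported for the 4-question method.
refined = TRUE + (crowd - TRUE) * 0.5

print(f"crowd error:   {abs(crowd - TRUE):.2f}")
print(f"refined error: {abs(refined - TRUE):.2f}")
```

The point of the sketch is only that the improvements compose: aggregation removes most of the independent noise, and a reflection step then shrinks whatever residual remains.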
Through the accumulated experience of collective intelligence systems, these capacities and lessons learned could be integrated, transferred, and delivered when and where they are needed globally, removing the gross inefficiencies of redundancy. A group of 100 doctors contributing their knowledge to such a system could potentially serve 1,000 times the number of patients while offering a quality of care no individual doctor or non-collective facility on the planet could even approach. Further, this care could persist for decades or centuries beyond the lifespan of any of those doctors, for whatever duration the data they contributed remained applicable. Iterating this process could also continue to reduce noise further over time.
Since noise is created by biases expressed to different degrees and in different combinations, each instance of noise is the output of many unknown variables coming together. Another future research focus and opportunity is structuring interactions in such a way that the number and potency of potential influences are limited in different ways. Over time, through such testing and measurement, the individual unknowns, specific biases, and their branching influences may be isolated, quantified, and effectively untangled. This offers yet another method of reducing cognitive bias and "noise".
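The untangling described above can be sketched as a controlled-factor simulation. The bias sources, their effect sizes, and the judgment model are all invented for illustration: judgments are modeled as a true value plus separate "weather" and "hunger" effects plus idiosyncratic noise, and holding one factor fixed isolates the other's contribution.

```python
import random
import statistics

random.seed(1)

TRUE = 50.0

def judgment(weather, hunger):
    # Toy model: a judgment is the true value plus separate bias
    # contributions (all effect sizes here are invented).
    weather_effect = 4.0 if weather == "gloomy" else 0.0
    hunger_effect = 3.0 if hunger == "hungry" else 0.0
    return TRUE + weather_effect + hunger_effect + random.gauss(0, 1.0)

# Unstructured setting: both factors vary freely across judgments.
free = [judgment(random.choice(["sunny", "gloomy"]),
                 random.choice(["fed", "hungry"])) for _ in range(2000)]

# Structured setting: hold hunger fixed so the remaining variability
# reflects only the weather factor and idiosyncratic noise.
controlled = [judgment(random.choice(["sunny", "gloomy"]), "fed")
              for _ in range(2000)]

# With hunger controlled, the weather effect can be estimated directly.
gloomy = [judgment("gloomy", "fed") for _ in range(2000)]
sunny = [judgment("sunny", "fed") for _ in range(2000)]
estimated_weather_effect = statistics.mean(gloomy) - statistics.mean(sunny)

print(f"variance, all factors free:  {statistics.variance(free):.2f}")
print(f"variance, hunger held fixed: {statistics.variance(controlled):.2f}")
print(f"estimated weather effect:    {estimated_weather_effect:.2f}")
```

Limiting which influences can vary shrinks the total variability, and comparing structured conditions recovers each hidden factor's size, which is exactly the isolate-and-quantify process the paragraph describes.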
Noise costs businesses double-digit percentages of their profits, costs governments their credibility and effectiveness, and costs virtually every person on the planet a lower quality of life. Some may pretend to be the exception, but those who confront the problem will gain a strong advantage over the "magical thinkers" of the world.