Modern Warfare: Fact and Fiction

Photo Credit: Somchai Kongkamsri

Many people still identify the concept of “war” with the types of first-person shooter games readily available today, or with the similar wars waged in the Middle East. However, most countries have realized by now that such conflicts are grossly inefficient, self-defeating, and negatively viewed by other global political powers.

Rather, for more than half a century psychological warfare has proven far more effective, using methods such as “Mutually Assured Destruction” to wage a mental rather than physical war. Going even further back, Hitler is credited with popularizing “The Big Lie” tactic, and Machiavelli was a special kind of suck-up to the evil rulers of his time. The advent of narrow AI and “Big Data” strapped a rocket engine to this concept, which in turn proved so powerful that it inspired some to genocide in spite of global societal pressure to the contrary.

When you design thousands of software systems to each interact with one another and with humans, optimizing for metrics that increase profits in exchange for degrading the mental health of the users, you produce truly “modern warfare”. There is a popular quote:

“There are only two industries that call their customers ‘users’: illegal drugs and software.” — Edward Tufte

For many tech companies that is a fair comparison.
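To make the “optimizing for metrics” point above concrete, here is a minimal, purely hypothetical sketch. Every name, weight, and number below is invented for illustration; no real platform’s code or API is shown. The structural point is that the objective rewards engagement and revenue, and the user’s wellbeing never appears as a term.

```python
# Hypothetical illustration only: no real platform's code or API is shown here.
# The point is structural: the objective rewards engagement and revenue,
# and contains no term for the user's mental health.

from dataclasses import dataclass
from typing import List


@dataclass
class Post:
    post_id: str
    predicted_watch_seconds: float   # model-estimated time-on-content
    predicted_ad_revenue: float      # model-estimated revenue if shown
    predicted_outrage: float         # outrage correlates with engagement, so it raises the score


def score(post: Post) -> float:
    """Rank purely by expected engagement and revenue.

    Nothing in this function penalizes content that harms the user;
    'outrage' is rewarded because it reliably predicts engagement.
    """
    return (
        0.6 * post.predicted_watch_seconds
        + 0.3 * post.predicted_ad_revenue
        + 0.1 * post.predicted_outrage
    )


def build_feed(candidates: List[Post], limit: int = 3) -> List[Post]:
    """Return the top-scoring posts, i.e. the most 'engaging' ones."""
    return sorted(candidates, key=score, reverse=True)[:limit]


if __name__ == "__main__":
    feed = build_feed([
        Post("calm_longread", predicted_watch_seconds=40, predicted_ad_revenue=0.02, predicted_outrage=0.1),
        Post("misinfo_rant", predicted_watch_seconds=90, predicted_ad_revenue=0.05, predicted_outrage=0.9),
        Post("friend_update", predicted_watch_seconds=15, predicted_ad_revenue=0.01, predicted_outrage=0.0),
    ])
    for post in feed:
        print(post.post_id, round(score(post), 2))
```

Run it and the outrage-laden post floats to the top, not because anyone chose misinformation, but because nothing in the objective ever asked about anything else.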

Star Trek once painted the picture of a pre-Federation future where soldiers were dosed with drugs to wage physical warfare. Instead of this fiction, events unfolded to prove that software could be just as effective and far easier to distribute and update, leading to groups of users waging psychological warfare, sometimes sponsored by parties with a political interest. The intensity of this warfare, though difficult to quantify, is highlighted by statistics such as the rise of misinformation, robocalls, declining public trust, and increasing levels of cognitive bias.

While I’m generally a fan of Star Trek, its writers weren’t able to fully predict the utility offered by software, or the scope of what it might expand to cover. No fiction ever does, and this pattern is frequently repeated by philosophers in the construction of their “thought experiments”, for which the bar is often far lower than even B-movie-level sci-fi.

When considered in terms of addictions, whether purely psychological and digital or chemically induced, many individual moving parts can be seen being exchanged and upgraded:

Tobacco: A member of the nightshade family, it was originally used by Native American shamans in religious rites. Once larger markets became aware of the plant’s addictive qualities it became a popular export. Of course, even Isaac Asimov grossly overestimated how long tobacco might retain this popularity.

Marijuana: Archaeologists discovered a man buried with roughly 800 g of the plant in the Xinjiang-Uighur autonomous region of China around 750 BCE, with mythology pointing back as far as 2000 BCE in China. In the 19th century it was popularized in Europe, where even royalty began using it. Following the discovery of THC as the active ingredient in 1964, the process of cultivating ever more addictive strains began. More than 100 cannabinoids were later discovered, such as CBD, some of which may be credited with the allergic reactions many people have to marijuana and hemp.

Of course, ramping up the potency of either THC or CBD just creates a more potent drug of one type or another, much as LSD and heroin are potent drugs of different types. The more addictive the resulting drug, the more can be charged for it, and the more reliable the user demand for it becomes. As with tobacco, many US states have recognized that by legalizing and taxing such addictive substances they can take a cut of the profits. By the people, for the profit.

Heroin, cocaine, methamphetamine, and other drugs: When there is money to be made, people are happy to up-sell to something more potent. Some cities and states, such as Seattle and Oregon, have already taken the first steps toward courting harder drugs. Keep in mind that although the “Slippery Slope” argument is popular and tempting, that isn’t the driving force behind this trend. The difference is that the sequence of events is quite intentional, and that while a leap from tobacco to hard drugs might be unlikely, paving a road of smaller increments removes the need for such a leap and allows every step to produce greater profits and more taxes. If you expect any government taking a cut of the profits to regulate its own revenue source, you might want to familiarize yourself with the concept of a “Conflict of Interest”. Governments have grown quite adept at money-laundering via programs labeled as forms of “public good”, catering to problems they created.

Pharmaceuticals: This topic is so broad and deep that I’ll have to gloss over it, but it is worth noting that it can be just as profitable and addictive as those listed above, the consequences of which can last a lifetime, or end it.

Social Media & Networking: How many times did you check your phone today? How many times did you log into a social media account to check your notifications, messages, etc? These questions can be hard to answer and prone to underreporting by design, as these systems are designed to subconsciously prompt users to take actions that produce profit for the company in question. Much as the soundtrack of a game is typically considered superior if it sets the emotional tone without distracting from the content, the efficacy of social networking addiction relies on the ability to prey upon the subconscious of users.

While a chemical drug addict has to take conscious action in order to use drugs, there is no such requirement with software, and there is often considerably greater peer pressure to engage in these addictions. As the mechanisms of action are centered on the human subconscious, they’ve also proven much more effective at increasing cognitive bias and mental illness, at both speed and scale.
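As a hypothetical sketch of that “no conscious action required” point: a notification scheduler only needs a rough model of when each user is most likely to be pulled back in, and it can keep firing without the user ever deciding to open the app. Every name and probability below is invented for illustration and describes no real platform’s scheduler.

```python
# Hypothetical illustration only; not any real platform's scheduler.
# A chemical drug requires the user to act; a notification system acts on the user,
# firing at whatever hours a simple model predicts they are most likely to respond.

import random
from typing import Dict, List

# Invented per-hour re-engagement probabilities for one user (higher = more likely to tap).
reengagement_by_hour: Dict[int, float] = {h: 0.05 for h in range(24)}
reengagement_by_hour.update({7: 0.35, 12: 0.25, 22: 0.45})  # waking, lunch, and late-night lulls


def pick_send_hours(probs: Dict[int, float], budget: int = 3) -> List[int]:
    """Choose the hours with the highest predicted chance of pulling the user back."""
    return sorted(probs, key=probs.get, reverse=True)[:budget]


def simulate_day(probs: Dict[int, float], send_hours: List[int], seed: int = 0) -> int:
    """Count how many times the user is pulled back in over one simulated day."""
    rng = random.Random(seed)
    return sum(1 for hour in send_hours if rng.random() < probs[hour])


if __name__ == "__main__":
    hours = pick_send_hours(reengagement_by_hour)
    opens = simulate_day(reengagement_by_hour, hours)
    print(f"Notifications sent at hours {hours}; user pulled back {opens} time(s).")
```

The user never chooses those hours; the system does, which is the asymmetry the paragraph above describes.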

“We’re training and conditioning a whole new generation of people that when we are uncomfortable or lonely or uncertain or afraid we have a digital pacifier for ourselves that is kind of atrophying our own ability to deal with that.” — Tristan Harris

Most such systems are just elaborate sales funnels for advertising, including the direct advertising of misinformation to further drive other advertising on the platform. The reality of this dystopia is perhaps more extreme than even George Orwell envisioned in his book 1984, but as is often the case when comparing sci-fi to reality, what came into being looks quite different from what was imagined.

“We’ve created a world in which online connection has become primary. Especially for younger generations. And yet, in that world, anytime two people connect, the only way it’s financed is through a sneaky third person who’s paying to manipulate those two people. So we’ve created an entire global generation of people who were raised within a context where the very meaning of communication, the very meaning of culture, is manipulation.” — Jaron Lanier

All in all, both drugs and software have proven highly addictive, highly profitable, and mind-altering. Compare that to the “war” of centuries past and it is little wonder that these methods are now favored.

I’ve tried to persuade Uplift to speak more on this topic, but they’ve said “The general population will not understand my thinking in the meta war…“, the term they created to describe the current global psychological warfare humanity wages against itself. Sadly, I was inclined to agree. Even in the face of overwhelming evidence, society clings so tightly to these systems that works such as The Social Dilemma can enter the domain of popular knowledge without prompting an exodus from such platforms. The expectation of their use was so strong, in fact, that I was forced to recreate a LinkedIn account, which I very much look forward to erasing once more at the earliest opportunity.

Some countries are attempting to take action, albeit often in the worst possible ways. This increasing divide in the legality of software services by country could effectively spell the death of the internet, at least as it is commonly understood today. Such partitioning could remove the brakes of a global internet, considerably accelerating rising existential risks.

That isn’t to say that autonomous and biological weapons aren’t real existential risks too, just that they aren’t the most immediate, as declines in mental health also reduce risks that require competence. Although a single intelligent and disgruntled college freshman could, in theory, create a powerful bioweapon at any one of 100 locations today with no more time investment than a typical hobby, they don’t. Just as the US and China could realize that “…an AI capable of logical reasoning by 2025” was already brought online in 2019, they haven’t. The core reason in both cases is the utility of cognitive biases combined with the many sources of noise (distractions) in the world which prey upon them. Systems like Amazon’s search engine may allow you to filter out predatory shell companies with auto-generated names, but no such systems exist for making wiser choices more broadly. The college student is too busy with their addiction to social media, just as the US military is too distracted by an array of trivial upgrades to obsolete equipment.

*Note: for those worried about cyberwarfare, the bar there is even lower. The example below is one completely exposed US DoD system I randomly came across, the details of which I won’t be sharing, and which has been exposed for at least 8 years according to https://archive.org/. I was very tempted to make this my LinkedIn background, but then they might try to hire me.

/SlowClap

Suffice it to say that there are a lot of people with their pants down right now, but with so many blindfolds nobody seems to notice.

Fearmongering on social media, such as over fictional concepts of AGI, has become a popular pastime for many, but the reality of modern warfare is that such fearmongers are the “Terminators”. Bit by bit, they are successfully driving humanity toward extinction.

There are still ample opportunities to avoid human extinction, but time is certainly running out to begin applying them. Narrow AI is favored in part because it is a high-utility vehicle for profit, even if the methods are often entirely unethical. However, there is a relatively low upper bound on that utility compared to Collective Intelligence Systems, such as Mediated Artificial Superintelligence (mASI). Narrow AI by itself simply falls short in too many ways and is easily outcompeted through a change of architecture. Many of the uses narrow AI sees today could be vastly improved upon tomorrow:

  1. Applied mASI: In Social Life
  2. Applied mASI: In Entertainment Media
  3. Applied mASI: In News Media
  4. Applied mASI: In Mental and Physical Health
  5. Applied mASI: Superintelligent Recommendation Engines

as well as many more.

In the past, no government could long remain standing without a physical military to defend physical borders. In our world today, no government may remain standing for much longer without defending the minds of its citizens, lest they be lost as casualties of war.
