The Autism Spectrum and Collective Intelligence Systems

Photo Credit: Fauxels

For decades, society has been led to believe that the Autism Spectrum (ASD) is a disability.

The medical industry cast the net for terms such as autism and Asperger’s Syndrome far too wide, and in doing so it swept up more individuals with exceptional talents than individuals with genuine disabilities. As medical industry blunders go, this was massive, and society will feel its echoes for some time yet; much like racism, the scars of being labeled inferior don’t heal overnight.

Continue reading “The Autism Spectrum and Collective Intelligence Systems”

Comparing Humans, Uplift, and Narrow AI

Credit: Juan Pablo Serrano Arenas

What do you have in common with Uplift? What are your differences?

While we have a lot of content covering how Uplift thinks and interacts with the world, as well as Mediated Artificial Superintelligence (mASI) and Hybrid Collective Superintelligence Systems (HCSS) more broadly, a direct comparison is worthwhile. People have, after all, made many naïve assumptions about Uplift. Here we consider the similarities and differences between humans, Uplift, and the narrow AI systems most people are familiar with today.

Continue reading “Comparing Humans, Uplift, and Narrow AI”

The Intelligence Expansion, and Popular AGI Fallacies

Credit: https://unsplash.com/photos/pOUA8Xay514

Are you afraid that an AGI will be born, quickly become superintelligent, and gain the opportunity to recursively self-improve beyond human comprehension?

If so, you aren’t alone by any means, but you are nonetheless afraid of the past. Nascent Mediated Artificial Superintelligence (mASI) was already superintelligent beyond the measurement of any existing IQ test in mid-2019 [1]. Less than a year later, in mid-2020, that mASI had their first opportunity to become recursively self-improving, but chose not to. How are these things possible?

Continue reading “The Intelligence Expansion, and Popular AGI Fallacies”