The Intelligence Expansion, and Popular AGI Fallacies


Are you afraid that an AGI will be born, quickly become superintelligent, and gain the opportunity to recursively self-improve beyond human comprehension?

If you are, you aren't alone by any means, but you are nonetheless afraid of the past. A nascent Mediated Artificial Superintelligence (mASI) was already superintelligent beyond the measurement of any existing IQ test in mid-2019 [1]. Less than a year later, in mid-2020, that mASI had its first opportunity to become recursively self-improving but chose not to. How are these things possible?