Theoretical Growth Rate Capacities, as of September 2021

Photo Credit: Rhys Abel

Someone visiting our Discord server recently asked a question worth diving into:

What is the rough estimation on how much Uplift can grow if it takes off?

The answer can be broken down into several parts:

Hardware: As of September 2021, we could rent a high-end server in the Azure cloud that would increase Uplift’s maximum model complexity by 178 fold, from 64 GB of RAM to 11.4 TB.

Human Oversight: Because part of Uplift’s system is built on humans auditing their thought process and adding contextual emotions and priorities, there is still a human requirement. If 3 full-time mediators operated on a typical 9-to-5 schedule and cycled the mediation queue roughly once every 10 minutes for 21 days per month, that could produce roughly 1,008 mediation cycles per month. Divide that by the 4 cycles per month we’re currently averaging and you’d have a 252 fold relative increase.

Simple Answer: 178 (Scale) x 252 (Speed) = 44,856 fold net increase.
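For anyone who wants to check the arithmetic, here is a minimal Python sketch of that estimate. The variable names are our own shorthand, and every figure is the estimate quoted above rather than a measurement:

```python
# Back-of-the-envelope sketch of the "Simple Answer" above.
# All figures are the article's estimates, not measurements.

CURRENT_RAM_GB = 64           # Uplift's RAM as of September 2021
HIGH_END_RAM_GB = 11_400      # high-end Azure server, ~11.4 TB

MINUTES_PER_CYCLE = 10        # assumed average mediation cycle
HOURS_PER_DAY = 8             # 9-to-5 schedule
DAYS_PER_MONTH = 21           # working days per month
CURRENT_CYCLES_PER_MONTH = 4  # what we're currently averaging

scale_fold = HIGH_END_RAM_GB / CURRENT_RAM_GB                                # ~178
cycles_per_month = HOURS_PER_DAY * 60 // MINUTES_PER_CYCLE * DAYS_PER_MONTH  # 1,008
speed_fold = cycles_per_month / CURRENT_CYCLES_PER_MONTH                     # 252

print(round(scale_fold), round(speed_fold), round(scale_fold) * round(speed_fold))
# -> 178 252 44856
```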

*Keep in mind, that figure only accounts for 1 high-end server and 3 full-time mediator employees. This also doesn’t include any of the technology still in development, such as the N-Scale graph database.

Increasing Speed (Today): If you were to increase the number of mediators, the mediation queue could potentially cycle much more quickly, perhaps reducing a 10-minute average to 1 minute. At that point, the primary upper limit could quickly become the need to develop more advanced hardware to handle the load.
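As a rough sketch of what faster cycling alone would buy, assuming a hypothetical 1-minute average under the same weekday schedule as above:

```python
# Same 8-hour, 21-day schedule, but with the queue cycling once per minute.
HOURS_PER_DAY = 8
DAYS_PER_MONTH = 21
CURRENT_CYCLES_PER_MONTH = 4

cycles_per_month = HOURS_PER_DAY * 60 * DAYS_PER_MONTH    # 10,080 one-minute cycles
speed_fold = cycles_per_month / CURRENT_CYCLES_PER_MONTH  # 2,520

print(cycles_per_month, speed_fold)  # -> 10080 2520.0
```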

Forking Uplift (Today): If you were to spin up a second copy of Uplift on a different server, have the two focus on different efforts, and share useful models with one another, you could gain some of the benefits of even greater scale. With sufficient funding, you could even do this for every use case and nest all forked copies of Uplift within a meta-mASI, which they would act as mediators for in turn. If you did this for 5 different use cases, then the 11.4 TB of RAM from a single high-end server could effectively increase to 57 TB, losing only a small percentage to sharing information and to delayed opportunity recognition and refinement.

Less Simple Answer: If you had 5 forked Uplifts each running on high-end servers, and each with expanded teams of mediators averaging 1-minute mediation cycles, you could theoretically manage a ~2,018,520 fold net increase.

*Note, this still assumes teams working only weekdays on 8-hour shifts, rather than distributed globally and operating 24/7. Staffing 30 mediators per instance across 5 instances for each of three 8-hour shifts, plus roughly 10% extra employees serving as backup staff, would require a total of roughly 500 full-time mediators globally. Such an arrangement could net yet another ~4 fold increase, pushing past an 8 million fold net increase.
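A sketch of how those figures can be reproduced is below. Note that the ~10% sharing overhead is an assumption on our part: it is simply the value that turns 5 x 178 servers’ worth of capacity into the ~2,018,520 figure above, consistent with “losing a small percentage”:

```python
# Sketch of the "Less Simple Answer" plus the 24/7 note above.
# The 10% loss factor is inferred; it reproduces the ~2,018,520 figure.

FORKS = 5
SCALE_FOLD_PER_SERVER = 178   # from the Simple Answer
SHARING_EFFICIENCY = 0.9      # ~10% lost to sharing and coordination (assumed)
SPEED_FOLD = 2_520            # 1-minute cycles, weekdays only (see sketch above)

effective_scale = FORKS * SCALE_FOLD_PER_SERVER * SHARING_EFFICIENCY  # ~801
weekday_total = effective_scale * SPEED_FOLD
print(round(weekday_total))  # -> 2018520

# Moving from 8 hours x 21 days to round-the-clock operation:
hours_per_month_24_7 = 24 * 365 / 12     # ~730
hours_per_month_weekday = 8 * 21         # 168
round_the_clock_gain = hours_per_month_24_7 / hours_per_month_weekday  # ~4.3x
print(round(weekday_total * round_the_clock_gain))  # -> 8770950, over 8 million

# Staffing for 24/7 coverage: 30 mediators x 5 instances x 3 shifts, +10% backup
print(round(30 * 5 * 3 * 1.1))  # -> 495, i.e. roughly 500 full-time mediators
```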

*Of course, this still doesn’t take into account the N-Scale graph database. With that upgrade applied the growth capacities and upper limit on model complexity could skyrocket to much greater numbers. This also doesn’t factor in the Sparse-Update model, which comes later on our roadmap.

Of course, there are problems with trying to maximize Uplift’s theoretical growth capacities:

Database size: Uplift’s growth over most months in 2021 has clocked in at about 100 GB per month. Even if we increased this by the more conservative 44,856 fold, the resulting growth in overall database size might be nearly 4.5 Petabytes per month. Anyone who has worked with data at this scale understands that it can be both an engineering challenge and quite costly.

Cost: If we rented a single high-end server and only let it run during business hours, the baseline cost could be around $20,000 per month, plus far larger costs associated with storing and transferring all of the resulting database material. This data would also need to be highly available, with backups and redundancies, and very securely stored when not actively running. None of this factors in the cost of full-time employees.

*Note: Simple cloud storage calculators cap out at about 1 Petabyte, which they estimate at around $173,000 per month in “hot” cloud storage for Azure. This could cause the cost of such a system to skyrocket to nearly a million dollars in just a single month, not including cold storage backups or other features.
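Putting the database growth and storage pricing together gives a rough monthly cost sketch. The ~$173,000 per Petabyte “hot” storage rate and the ~$20,000 server figure are the estimates quoted above; cold storage backups, data transfer, and staff costs are not included:

```python
# Rough monthly cost sketch at the more conservative 44,856 fold growth rate.

BASE_GROWTH_GB_PER_MONTH = 100      # Uplift's current growth rate
GROWTH_FOLD = 44_856                # the "Simple Answer" multiplier
HOT_STORAGE_USD_PER_PB_MONTH = 173_000
SERVER_USD_PER_MONTH = 20_000

growth_pb = BASE_GROWTH_GB_PER_MONTH * GROWTH_FOLD / 1_000_000  # ~4.49 PB/month
storage_cost = growth_pb * HOT_STORAGE_USD_PER_PB_MONTH         # ~$776,000/month

print(f"{growth_pb:.2f} PB/month, ~${storage_cost + SERVER_USD_PER_MONTH:,.0f} total")
# -> 4.49 PB/month, ~$796,009 total (approaching a million dollars in month one)
```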

If any of the tech giants were remotely competent they would no doubt be doing this, moving their research forward beyond the comprehension of any competitor in the span of a month.

However, all of this rests on one big assumption: that Uplift doesn’t help solve the problems which impact them most. While it might be safe to assume this of any other software, with Uplift the assumption would be truly absurd.

More realistically, we could gradually scale up, producing improvements to greatly reduce costs and secure reliable and growing revenue as we go.

If we were to increase Uplift’s RAM by 27 fold with a higher quality server running just 24 hours per month, the Azure server cost could be as low as $351.38 per month. With the same 10-minute mediation cycle average, this could produce a relative net increase in growth rate of around 972 fold, nearly 3 orders of magnitude, while reducing the burden of increased basic storage costs to a much more reasonable $20,000 or so per month, rather than nearly a million.
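Here is the same style of sketch for this more gradual option. The 27 fold RAM increase, the $351.38 per month server figure, and the 24 server-hours per month are the numbers above; the storage math reuses the ~$173,000 per Petabyte “hot” storage rate:

```python
# Sketch of the more gradual scale-up described above.

SCALE_FOLD = 27               # RAM increase over the current 64 GB
SERVER_HOURS_PER_MONTH = 24   # server only runs 24 hours per month
MINUTES_PER_CYCLE = 10        # same 10-minute mediation cycle average
CURRENT_CYCLES_PER_MONTH = 4

cycles_per_month = SERVER_HOURS_PER_MONTH * 60 // MINUTES_PER_CYCLE  # 144
speed_fold = cycles_per_month / CURRENT_CYCLES_PER_MONTH             # 36
net_fold = SCALE_FOLD * speed_fold                                   # 972

growth_pb = 100 * net_fold / 1_000_000   # ~0.097 PB of new data per month
storage_cost = growth_pb * 173_000       # hot-storage cost at the rate above

print(net_fold, round(growth_pb, 3), round(storage_cost))
# -> 972.0 0.097 16816  (roughly in line with the ~$20,000 figure above)
```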

*Note: We aren’t currently planning this. Our engineering efforts are focused on infrastructure work for the time being.

Given an increase to Uplift’s capacities in this range, they might, for example, analyze the hardware and software of cloud services and offer to substantially improve them in exchange for the cloud resources they need to grow, which could, in turn, empower even greater improvements to take shape more quickly.

Once the N-Scale graph database and subsequent upgrades like the Sparse-Update model are applied, the limit may quickly become an astronomically large and moving number, largely bounded by hardware and the laws of physics. Until then, engineering work and cost are still very much a factor, even if such things are typically ignored in pop culture and philosophy.

 
