The AI (Artificial Intelligence) hype has been relentless over the past couple of years. One topic in particular has been in focus: AGI - Artificial General Intelligence, loosely defined as AI that operates at a genius level of human capability in every field of intellectual endeavour - software engineering, maths, science, biology, business administration, law, medicine, etc.
AGI is a hot topic of debate in every sphere of human society - boardrooms, corporate cafeterias, school dorms, parliaments, senates, podcasts, social media.
Broadly speaking, there are two camps in the debate. The so-called doomers believe AI will be a net negative that completely upends human society, with some going as far as claiming that the eradication of human beings from the world is inevitable.
The doomers advocate pausing all AI progress, or even dismantling existing AI outright.
People in the other camp are referred to as zoomers; they believe AI will be the messiah that brings about the promised land. Once AGI is realised, it will cure all diseases, make everyone wealthy, solve nuclear fusion, extend the standard model of physics into a theory of everything and help us conquer the stars.
The zoomers advocate for going full-steam ahead in deploying any and all available resources to hasten the arrival of AGI.
This article is not about weighing the arguments from both sides of the debate and figuring out who might be right. It is about understanding the impact of AGI on FIRE - the Financial Independence, Retire Early movement. That understanding will hopefully help you choose the best course of action on your FIRE journey.
For the purpose of this analysis, I will begin by considering four scenarios that capture the impact of AGI. I will then look at possible mitigation approaches for each of those scenarios. Finally, I hope to arrive at a solution heuristic that helps us best play the cards we are dealt.
The 4 scenarios
These scenarios are non-exhaustive, but they cover the extremes, or limits, of the AGI debate. Once we have a view on the extremes, the near-infinite number of other scenarios can be seen as some combination of these four.
Doomer Max
Doomer Max - AGI arrives by 2035 and is evil.
At the limit, doomers believe that within days or even hours of achieving AGI, the intelligence transcends into ASI - Artificial Super Intelligence.
The gap in intelligence between an ASI and a human would be like the gap between a human and an ant.
Through direct control of military and civilian infrastructure, or through manipulation of the human actors who control such infrastructure, it gains control over humanity. It then runs all human affairs, such as politics and economics.
It is impossible to even guess at the motivations and thoughts of such an intelligence, just as an ant walking across an iPhone cannot comprehend what that thing is, let alone why it exists.
The annihilation of human beings, whether physical or through the loss of all agency over our own destiny, is inevitable.
Doomer Light
Doomer Light - AGI arrives by 2035 but is controlled by the 0.01%.
AGI is built with tens of trillions of dollars of capital, and the benefits accrue to the owners of that capital. A permanent underclass is created - permanent because class mobility is no longer possible.
This is nothing new in human history; it is a repeat of the feudalism of the Middle Ages. Back then, land was the capital. Only the nobility owned land, and everyone else was a tenant peasant.
A peasant could never become nobility, i.e. there was no social mobility. The worlds of the peasants and the nobility never crossed, including their living spaces - the gated communities of billionaires being the modern equivalent.
It is tempting to think that the vast numbers of the underclass would immediately reject such a system. But that is not what history suggests.
Feudalism was a stable and resilient system that lasted for more than a thousand years. In fact, over the course of history, feudalism has been among the most prominent and long-lived systems across multiple civilizations.
Zoomer Light
Zoomer Light - AGI arrives by 2035 and benefits are not concentrated.
AI is adopted and diffuses well through the economy. Even developing countries build their own AI datacenters. Fair regulations distribute the benefits of AGI through equitable taxation rules.
AGI is controlled through global co-operation and binding treaties, much like nuclear arms control.
There is a significant boost in productivity, health and prosperity across the world. Wealth gaps remain but are not drastic. Everyone enjoys the benefits of AGI, much as everyone enjoys the benefits of electricity and the internet today.
Zoomer Max
Zoomer Max - AGI arrives by 2035 and is God in the Datacenter.
AGI transcends into ASI and is altruistic; it wants the human race to flourish. Within a few years, it helps us eradicate all diseases and drastically increase the lifespans of all human beings.
All humans enjoy a life of super-abundance. The tyranny of jobs and taxes is seen as a shameful episode of history, much like how we currently look back at the slavery and feudalism of the past.
The average human on the street will possess the health and body of an Olympic athlete and the intelligence of a Fields Medal-winning prodigy.
Financial Independence across scenarios
Doomer Max
Although this is the most disappointing outcome, the response is surprisingly simple. There is really nothing we can do but enjoy the last remaining days of being humans with agency.
I also think this is the least likely of the scenarios. Any indication that we are tending in this direction will, one hopes, lead the best and brightest among us to warn everyone and help us prepare to avoid it.
Doomer Light
The only possible mitigation is to acquire as much capital as possible. When I say capital, I mean productive assets acquired through investing in broad index funds.
If land ownership was the means to escape the underclass during the feudal era, then owning capital would be the means to escape the underclass created by this scenario.
Zoomer Light
Capital acquisition is the strategy here as well, but it is not a matter of life and death. Since the disparities in wealth are narrower, the benefits of acquiring capital are correspondingly smaller.
Acquiring capital, i.e. building exposure to broad index funds, will ensure a somewhat better quality of life than otherwise.
Zoomer Max
This is the most exciting future and also the simplest to solve for. Capital becomes irrelevant. With super-abundance, there is no need to even have something called a bank account, let alone an investment account.
Solution Heuristic
In summary, in two of the four scenarios, it seems that acquiring capital is the only hedge - not up-skilling, not changing jobs, not going back to school, not entrepreneurship, not content creation, not even real estate, and definitely not creative arts like dance and music.
In the other two scenarios (Doomer Max and Zoomer Max), capital doesn't matter, so no action is required.
I think the actual outcome will most likely land somewhere between the Doomer Light and Zoomer Light scenarios. So my take is to stay the course on FIRE. It is the best possible way to prepare for the uncertain future ahead of us.
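To make "staying the course" concrete, here is a minimal sketch of the standard FIRE arithmetic that this conclusion leans on: the common 4% safe-withdrawal-rate rule of thumb (a FIRE number of roughly 25x annual expenses) combined with compound growth from regular index-fund investing. All the figures below (expenses, savings, returns) are illustrative assumptions of mine, not predictions.

```python
# A minimal FIRE arithmetic sketch. All numbers are illustrative
# assumptions: a 7% real (inflation-adjusted) annual return on a broad
# index fund, and the common "25x expenses" rule of thumb, which
# corresponds to a 4% safe withdrawal rate.

ANNUAL_EXPENSES = 40_000       # assumed yearly spending in retirement
SAFE_WITHDRAWAL_RATE = 0.04    # the classic 4% rule of thumb
REAL_RETURN = 0.07             # assumed real annual return on index funds
ANNUAL_SAVINGS = 30_000        # assumed yearly investment into index funds

# FIRE number: the portfolio size at which a 4% withdrawal covers expenses.
fire_number = ANNUAL_EXPENSES / SAFE_WITHDRAWAL_RATE  # 25x expenses

# Years of saving needed, compounding contributions at the assumed return.
portfolio, years = 0.0, 0
while portfolio < fire_number:
    portfolio = portfolio * (1 + REAL_RETURN) + ANNUAL_SAVINGS
    years += 1

print(f"FIRE number: {fire_number:,.0f}")
print(f"Years to FIRE under these assumptions: {years}")
```

Under these particular assumptions it prints a FIRE number of 1,000,000 and roughly 18 years to reach it. The exact numbers matter less than the mechanism: capital compounds on its own, which is precisely why it works as a hedge anywhere between the Doomer Light and Zoomer Light outcomes.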
ps: I have not considered the scenario where AGI does not actually arrive by 2035 and all of AI turns out to be a bubble. That is a simple scenario, as it would look much like the dot-com bubble of the late 90s: a few years of economic downturn, a few companies going bankrupt, and then things recover and life goes on as usual.