oumuamua writes:
Many people disagree on when AGI/ASI is coming, but few doubt that it is coming. It is interesting to analyze what happens if nothing is done beforehand to prepare for it, that is, if it takes the ‘default’ course in the current system.

The key economic effect of AI is that it makes capital a more and more general substitute for labour. There is less need to pay humans for their time to perform work, because that work can be replaced with capital (e.g. data centres running software replace a human doing mental labour).
As jobs get replaced, the State will have to enact UBI, but what lifestyle will it support? Currently the State looks after citizens because they make the State strong; that incentive could go away after AGI.
With labour-replacing AI, the incentives of states (in the sense of what actions states should take to maximise their competitiveness against other states and/or their own power) will no longer be aligned with humans in this way. The incentives might be better than during feudalism. During feudalism, the incentive was to extract as much as possible from the peasants without them dying. After labour-replacing AI, humans will be less a resource to be mined and more just irrelevant. However, states will still have an incentive to spend fewer resources on humans and more on the AIs that sustain their competitive advantage.
Sufficiently strong AI could make entrepreneurship and startups obsolete.
VC funds might be able to directly convert money into hundreds of startup attempts all run by AIs, without having to go through the intermediate route of finding human entrepreneurs to manage the AIs for them.
This would make AI, and the capital to buy it, the main driver of wealth, eliminating upward mobility in society.
In a worse case, AI trillionaires have near-unlimited and unchecked power, and there is a permanent aristocracy, locked in based on how much capital its members held at the time of labour-replacing AI. The power disparities between classes might make modern people shiver, much like modern people consider feudal status hierarchies grotesque. But don't worry: much like the feudal underclass mostly accepted their world order due to their culture even without superhumanly persuasive AIs around, the future underclass will too.
It could be a dire future; however, you don’t need to wait for the default outcome to unfold: you can shape how it unfolds.
But it's also true that right now is a great time to do something ambitious. Robin Hanson calls the present "the dreamtime", following a concept in Aboriginal myths: the time when the future world order and its values are still liquid, not yet set in stone.