labels: philosophy_of_science
AI can never fully displace us because it is a tool, so insofar as it is available for us to "wield" (i.e. delegate literally anything to), we necessarily exist at a level of complexity above it. However advanced it gets, we will float above it and continue to be lifted by progress, as we always have. Progress is destabilizing, but it allows us to exist at a higher level of complexity as a species.
...and for some reason I can't quite put my finger on right now, that is necessarily a good thing?
The fundamental issue here is availability. There is a significant technological divide in many parts of the world, and people cannot rise above tools they don't have access to. "Third world" rural farmers obviously can't compete with automation. The economic divide is actually a symptom of a more pernicious gap: a gap in the levels of complexity at which we fundamentally exist. This is a weird way to frame it, but it's a special kind of barrier to opportunity. Those farmers aren't just handicapped from competing in the agricultural market; they are disconnected from all of the opportunities available at the higher levels of abstraction they lack access to. It sounds stupid, but consider how "influencer" is a career in much of the world now. It's not just an unlikely and difficult career option for someone from one of these communities: it's probably something it wouldn't even occur to them is an option, if they're aware that this job (and everything around it) exists at all.
Stacking complexity like this isn't strictly uniform. I'm characterizing the level sets here as concentric bubbles, but obviously the geometry is more complex than that, with overlaps and branches. Regardless, a "rank" parameter describing degree of complexity necessarily exists. Rank gaps like this are a source of Kuhnian incommensurability. The wider these complexity gaps become, the harder they are to bridge, and the more likely it becomes that the people left behind will experience systemic pressures/attractors toward stable equilibrium states in which they occupy roles of systemic exploitation, much like how the complexity growth from evolution resulted in the permanent exploitation of mitochondria by eukaryotes.
The takeaway here is that technology needs to be available for there to be justice. There is no zero-sum game here; the sun provides our system with more energy than we know what to do with. But it is entirely possible for bubbles of society to get "left behind" by development, and the gap gets bigger as the levels of complexity get higher, making it harder to bridge. If we don't bring the whole world along with us as we ride the bubble now, it'll just be a lot harder to lift them up later (which we will always be incentivized to do, because of human empathy). Consider, for example, how over 150 years after the end of slavery, the descendants of those communities continue to be significantly hindered by systemic injustices.