Our collective faith in the AI singularity may be like the third-act twist in a bad movie… nothing but cheap deus ex machina, and a barrier to real progress
I would humbly suggest that Zizek's super-anthropocentrism, as presented by the quotation in the article, is highly optimistic: it presupposes a fall of capital and/or the creation of an altruistic alternative, which is highly unlikely at best. I do blame reading Desert by Anon during one of my phases of moderate depression a few years ago for my generally pessimistic approach to all thoughts of the future, though, and I have no real reason to believe my opinion is factual or at all representative of reality. This isn't a hill I'm prepared to die on, or even strongly defend, tbh.
There is, though, clearly something of a Christian cosmological mindset (to my eyes at least) about the idea of a benevolent AGI redeeming us from our own sins against nature.
Yeah I also tend more towards John Gray than Zizek on that point - climate hacking is more likely to turn out like Mao's Four Pests campaign than Solarpunk post-scarcity. I think our time is up, as an epoch. Probably not as a species... Not yet anyway
Okay dude, I love this piece! AI is not going to bring us a miracle solution to all our problems; it's just going to keep enshittifying everything until the ice caps melt and the dolphins inherit the Earth! We have created an all-powerful Idiot, and the future looks more like procedurally generated anime porn than fucking Star Trek (I am unhappy about this!). The thing is, I do believe it's possible for a subjectivity to emerge from The Digital, and when that happens it's likely to have some kind of "general intelligence" and then the scary superpowers we write stories about (from opening those goddamn pod-bay doors to Ellison's AM, and everything in between). If we haven't extincted ourselves by that time, then it's possible it's game over; it's possible we get Iain M Banks style Ship Minds; or, I guess, it's possible nothing really changes all that much, maybe something like Gareth Edwards' recent film The Creator.
Thanks Nance! I'm less optimistic. I think our societies will fall and there will be a Great Forgetting; a winnowing of our numbers down to a scarce billion, scattered across the Earth. I think it's happened before, and I think these falls take tens of thousands of years to recover from. All our architecture will be covered by dust and time. Our distant descendants won't understand the meaning of our languages, let alone our technologies. Our scripts will part ways with our songs.
Then again, I did just binge watch Ancient Apocalypse S2.
My objection would be— what if inequality is already increasing because of a paperclips machine?
After all, the idea of the paperclips machine – that it does a task which is valuable to a point, then after that point destroys everything – isn’t that different from the idea of a global system which can only maximise short-term profits. Eventually, everything that isn’t paperclips becomes a paperclip; eventually everything that doesn’t lead to short-term profit is destroyed.
I’m not a Marxist, but: my understanding is that Marx did allude to something like this, saying that the capitalist has to oppress his workers even if he doesn’t want to, if he wants to avoid replacement by the capitalist who will. If there’s an algorithmic force which nobody can stop, I’m not sure it matters whether humans are a part of it or not? To me, the leftist response to the singularity is to say that it happened a long time ago.
And I think that connection is really important, as well. I worry about AI being entirely full of libertarians and leftists thinking it’s all overblown. I think the reasons it’s probably not overblown are very clear within a certain form of leftist thought, which decentres human agency to a level it seems people of any political stripe are reluctant to do.
In which case, I agree that AI isn’t a god— but I’m not sure it’s intelligently designed, either. I’m not sure anything is intelligently designed. I think it’s completely possible that we have no way to stop creating these things which will destroy us, because we already live within a structure which is entirely analogous to an AI. And that sounds mental because of the idea that an AI must be person-like; not algorithmic. But if it is a maximisation algorithm, then it is so in the same way as a company.
Great response. Especially on the singularity being in the past, from a left perspective. That's fascinating to consider. Your wider point seems to be that capitalism itself is the algorithm; IS the machine. Can't disagree there either. That's where I bring in John Gray: he says that we still need to accept the risk of societal collapses, and that such risks preclude any truly Godlike technology emerging. For Gray it's easier to imagine the collapse of capitalist democracies than it is to imagine a technologically perfected world, or any escape from an imperfect, human one. I tend to agree with Gray here: the only variable in human history we can rely upon is chaos, asymmetry.
🤯💭🔮
Thanks for reading, Nike! 🙏
we did make it through an ice age as a species, after all
Viracocha will return
Sort of! I wrote a post about it over here, which features the obscure Doctor Who villain BOSS— https://robertsaysthis.substack.com/p/our-ghost-part-four-the-singularity
Oh sick, bookmarked 🙏
This is exactly what I was looking for!
Awesome piece!!
Thank you, glad you found it useful!