AI Anxiety

Reading articles about the future of AI, you can’t help but feel a mixture of excitement and helplessness.

Apparently, “AI Anxiety” is a thing.

Vast swathes of people are feeling anxious about the future as AI models get better and better by the day. It seems like every month a new model comes out beating the last on some benchmark, and every year a new capability appears.

First it was text, then it was image, then voice, then video, now AI is “agentic” or whatever that means. All of this until we get to the “final frontier” – no I’m not talking about AGI, I’m talking about super-intelligence.

An AI as smart as a human being would be a wonderful, though still unconfirmed, thing. But one smarter than all humans combined, and possibly smarter than any human could ever become or hope to become, is a whole other can of worms.

I mean, what happens after that? How does the world look? How do we even function?

We can’t hope to answer these questions. Everyone assumes we will be like pets to AI, but that’s the best-case scenario.

In reality, the problem is indifference, not evil. We are not in danger of being hunted by AI because it will want to kill us for whatever human reasons we project onto it in the various movies covering these plot lines.

We are in danger of being forgotten by AI, of being so irrelevant to it that it doesn’t even care whether we are there or not. It’s not that we’ll become birds it either hunts for fun or keeps in a cage for company – it’s that we will be like ants to it, and it won’t even notice if it steps on us on the sidewalk of the grand cities it stands to build out of the universe.

We can’t begin to fathom what that means.

But I digress, this is too far into the future for what this article is about. In a way, AI anxiety is not about that – it’s a lot closer and a lot more “real”.

This whole scenario that I wrote out is like the Sun “exploding” in 5 billion years – it’s so far away that it only bugs little kids who first learn about it in school. Adults know that it’s so far away that thinking about it is pointless.

Well, the one difference between this possible future and the Sun exploding is that we know, roughly, when the Sun “explodes”; we have NO IDEA when super-intelligence will arrive.

And of course, it is the uncertainty that is keeping us up at night. But by that point, everything will be so fucked (from a human perspective – in terms of technology and society, I think we’ll be fine) that you won’t even have time to worry about it.

And so that is the far end of the AI anxiety spectrum. Most of it, though, has a lot more to do with humans being made irrelevant in the modern job market.

As we moved away from the industrial economy and towards the knowledge economy, we thought we would stay in that phase a lot longer than we did. I mean, when you think about it – we entered the agrarian society around 3300 BCE, when we started farming and creating tools and techniques to look after crops, and stayed there until the industrial society arrived around 1750.

That’s 5000 years in ONE PHASE.

Then we moved from an agrarian society to an industrial one, from 1750 until about 1950, when we switched to manufacturing systems and created economies of scale by producing goods that can’t be grown or found naturally. That’s a lot shorter, but still 200 years.

And lastly we entered the knowledge economy, or “information age”, around 1950, when we started treating information as the primary asset of society instead of the physical things it represents. This phase became shorter still, lasting only around 70 years until we hit the AI age around 2023.

We’re currently here and this one is the weirdest one of all because we can’t even fathom how society will change after it.

With all of the other transitions, we saw crazy jumps, but most of the changes were fairly predictable. There were first-order consequences that were easy to spot and second-order consequences that were hard to spot – but overall you had an idea of what would happen.

With AI, especially super-intelligence, we have ABSOLUTELY NO IDEA. And that is the scary part.

I’m not gonna pretend like I have the answers. I keep myself calm because it’s what I need to do to survive. That instinct is one thing that will keep us human, and one thing we don’t know if we can count on the machines to have – a survival instinct.

Like I said, we tend to humanize the AIs in movies because that’s all we know – but who can guarantee that AIs will act like that? Who can guarantee that they even want to live, or survive, or be conscious?

Nobody.

That is a human characteristic. And as long as we have it, we have a weapon against the anxiety.

Be present and survive.

Enjoy the decline.
