The over-anthropomorphization of text prediction machines continues
The thing I hate most about "AI" is that reporting on it ranges from a deluded Sam Altman talking about Dyson spheres to this doomer Terminator-baiting.
All of it grants agency to something that agency doesn't apply to.
LLMs are not AI; they're algorithms that find the most likely next word given a set of training data. We've fed them a pile of the shit we say, and they're feeding us back the shit. They don't plan, think, have opinions, or anything. Now we write stupid shit like this. We are yelling into a canyon and spooking ourselves out with the echoes of the shit we said.
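The "most likely next word" idea is easy to see with a toy sketch. This is just bigram counting over a made-up corpus, not how real LLMs work (those use neural networks over subword tokens), but it shows the core point: the model can only echo back statistics of what it was fed.

```python
from collections import Counter, defaultdict

# Made-up "training data" for illustration only
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word` seen in training."""
    followers = bigrams.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("the"))  # "cat" - it followed "the" most often
```

There's no plan or opinion anywhere in there; it's frequency lookup. Scaling that idea up with a neural net makes the echoes far more convincing, which is exactly the anthropomorphization trap.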
True AGI might or might not have a self-preservation instinct. Our instincts don't come from the neocortex, and the neocortex is the area of the brain a true AGI is most likely to imitate.
In a decade we're going to be calling anything a computer does "AI". It'll be the new "call everything an app".
I get where you're coming from - but I will eat my hat if that ever happens.