"Coding" was never the source of value, and people shouldn’t get overly attached to it. Problem solving is the core skill. The discipline and precision demanded by traditional programming will remain valuable transferable attributes, but they won’t be a barrier to entry. - John Carmack
"Guy who was fed a pay-to-win degree at a nepotism practicing school with a silver spoon shares fantasy, to his fan base that own large publications, about replacing hard working and intelligent employees with machines he is unable to comprehend the most basic features of"
They've been saying this kind of bullshit since the early 90s. Employers hate programmers because they are expensive employees with ideas of their own. The half-dozen elite lizard people running the world really don't like that kind of thing.
Unfortunately, I don't think any job is truly safe forever. For myriad reasons. Of course there will always be a need for programmers, engineers, designers, testers, and many other human-performed jobs. However, that will be a rapidly changing landscape and the number of positions will be reduced as much as the owning class can get away with. We currently have large teams of people creating digital content, websites, apps, etc. Those teams will get smaller and smaller as AI can do more and more of the tedious / repetitive / well-solved stuff.
It's worth noting that the new CEO is one of the few people at Amazon to have worked their way up from PM and sales to CEO.
With that in mind, while it's a hilariously stupid comment to make, he's in the business of selling AWS and its role in AI. Take it with the same level of credibility as that crypto scammer you know telling you that Bitcoin is the future of banking.
Lol sure, and AI made human staff at grocery stores a thing of the....oops, oh yeah....y'all tried that for a while and it failed horribly....
So tired of the bullshit "AI" hype train. I can't wait for the market to crash hard once everybody realizes it's a bubble and AI won't magically make programmers obsolete.
Remember when everything was using machine learning and blockchain technology? Pepperidge Farm remembers...
Just the other day, the Mixtral chatbot insisted that PostgreSQL v16 doesn't exist.
A few weeks ago, ChatGPT gave me a DAX measure for an Excel pivot table that used several DAX functions in ways they simply cannot be used.
The funny thing was, it knew and could explain why those functions couldn't be used when I corrected it.
But it wasn't able to correlate and use that information to generate a proper function. In fact, I had to correct it for the same mistakes multiple times and it never did get it quite right.
Generative AI is very good at confidently spitting out inaccurate information in ways that make it sound like it knows what it's talking about to the average person.
Basically, AI is currently functioning at the same level as the average tech CEO.
The job of CEO seems far easier to replace with AI. A fairly basic algorithm with weighted goals and parameters (chosen by the board) + an LLM + a character avatar would probably perform better than most CEOs. Leave out the LLM if you want it to spout nonsense like this Amazon Cloud CEO.
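For the sake of the joke, here's a sketch of that "fairly basic algorithm with weighted goals and parameters"; every weight, goal, and option below is invented:

```python
# The board picks the weights; the "CEO" just maximizes the weighted score.
BOARD_WEIGHTS = {
    "quarterly_revenue": 0.6,
    "stock_price_optics": 0.3,
    "employee_wellbeing": 0.1,
}

def ceo_decide(options):
    """Pick the option whose projected effects score highest under the board's weights."""
    def score(effects):
        return sum(BOARD_WEIGHTS.get(goal, 0.0) * value for goal, value in effects.items())
    return max(options, key=lambda name: score(options[name]))

print(ceo_decide({
    "announce AI will replace programmers": {"quarterly_revenue": 0.2, "stock_price_optics": 0.9},
    "invest in the engineering org":        {"quarterly_revenue": 0.5, "employee_wellbeing": 0.8},
}))
```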
I just want to remind everyone that capital won't wait until AI is "as good" as humans, just when it's minimally viable.
They didn't wait for self-checkout to be as good as a cashier; they didn't wait for chatbots to be as good as human support; and they won't wait for AI to be as good as programmers.
It's really funny how AI "will perform X job in the near future," yet you rarely, if ever, see articles saying that AI will replace CEOs in the near future.
'Soon' is a questionable claim from a CEO who sells AI services and GPU instances. A single faulty update caused worldwide downtime recently. Now imagine all infrastructure being written with today's LLMs, which sometimes hallucinate so badly they claim the 'C' in CRC-32C stands for 'Cool'.
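(The 'C' actually stands for Castagnoli, after the author of the polynomial. For anyone curious, a minimal, admittedly slow sketch; real implementations use a lookup table or a hardware instruction:)

```python
def crc32c(data: bytes) -> int:
    # CRC-32C uses the Castagnoli polynomial 0x1EDC6F41
    # (0x82F63B78 in the reflected form used here).
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ (0x82F63B78 if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF

assert crc32c(b"123456789") == 0xE3069283  # published check value for CRC-32C
```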
I wish we could also add a "Do not hallucinate" prompt to some CEOs.
AI is terrible at solving real problems through programming. As soon as the problem is not technical in nature and needs a decision made based on experience, it falls flat on its face.
Well, that would be the 3rd or 4th thing during my career that was supposed to make my job a thing of the past or at least severely reduce the need for it.
(If I remember correctly, OO design was supposed to reduce the need for programmers, as were various languages; then there was outsourcing, visual programming, and on the server side I vaguely remember various frameworks being hailed as reducing the need for programmers because people would just be able to wire modules together with config or some shit like that. Additionally, many libraries and frameworks out there aim to reduce the need for coding.)
All of them, even outsourcing, have made my skills even more in demand. Even when they did reduce the amount of programming needed without actually increasing it elsewhere (a requirement most of them already failed), the market for software responded by expecting software to do more things, in fancier ways, with data from more places, effectively wiping out the coding-time savings and then some.
Granted, junior developers sometimes did suffer because of those things, but anything more complicated than monkey-coder tasks has never been successfully replaced, fully outsourced, or had the need for it removed, at least not without the need popping up somewhere else or the expected feature set of software growing to take up the slack.
In fact I expect that in a decade or so AI, like outsourcing before it, will have really screwed the market for Senior Software Engineers from the employers' point of view (but created a golden age for employees with those skills) by removing the first part of the career path that produces that level of experience. And this time around they won't even be able to import the guys and gals in India who got to learn the job because the junior positions were outsourced there.
I taught myself Python in part by using ChatGPT. Which is to say, I coaxed it through the process of building my first app, while studying from various resources, and using the process of correcting its many mistakes as a way of guiding my studies. And I was only able to do this because I already had a decent grasp of many of the basics of coding. It was honestly an interesting learning approach; looking at bad code and figuring out why it's bad really helps you to get those little "Aha" moments that make programming fun. But at the end of the day it only serves as a learning tool because it's an engine for generating incompetent results.
ChatGPT, as a tool for creating software, absolutely sucks. It produces garbage code, and when it fails to produce something usable you need a strong understanding of what it's doing to figure out where it went wrong. An experienced Python dev could have built in a day what took me and ChatGPT a couple of weeks. My excuse is that I was learning Python from scratch, and had never used an object oriented language before. It has no excuse.
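To illustrate the kind of lesson I mean, here's a classic of the genre it loves to produce. This is a hypothetical example, not one of its actual mistakes from my project:

```python
# A mutable default argument is created once and shared across every call,
# the sort of subtle bug that generated Python regularly contains.
def add_item_buggy(item, items=[]):
    items.append(item)
    return items

def add_item_fixed(item, items=None):
    if items is None:   # create a fresh list on every call instead
        items = []
    items.append(item)
    return items

print(add_item_buggy("a"))  # ['a']
print(add_item_buggy("b"))  # ['a', 'b']  <- the "empty" default was shared!
print(add_item_fixed("a"))  # ['a']
print(add_item_fixed("b"))  # ['b']
```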
I'd believe AI will replace human programmers when I can tell it, in a single prompt, to produce the code for an entire video game that stands up to the likes of New Vegas, has zero bugs, and offers hundreds of hours of content on a first playthrough thanks to vast exploration.
In other words, I doubt we'll see human programmers going anywhere any time soon.
Edit:
Reading other replies reminded me how I once, for fun, tried using a jailbroken Copilot to do Python stuff slightly above my already-basic coding skill, and it gave me code that tried importing something that absolutely doesn't exist. I don't remember what it was called, since I deleted the file while cleaning up my laptop the other day, but I sure as hell looked it up before deleting it and found nothing.
That guy has never seen AI code before. It regularly gets even simple stuff wrong. Where it's especially bad is when it gives you made-up crap. It tells you about a method or function you can use but doesn't tell you where it got it. You think "oh wow, I didn't realize that was available," then you try it, realize it's not part of the standard library, ask it "where did you get that," and it's like "oh yeah, sorry about that, I don't know."
20 years ago at a trade show, a new module-based visual coding tool was introduced in my field, claiming "You'll never need another programmer."
Oddly enough, I still have a job.
The tools have gotten better, but I still write code every day because procedural programming is still the best way to do things.
It is just now reaching the point that we can do some small to medium scale projects with plug and play systems, but only with very specific equipment and configurations.
Let me weigh in with something. The hard part about programming is not the code. It is understanding all the edge cases, designing flexible solutions, and so much more.
I have seen many organizations with tens of really capable programmers who can implement anything. Meanwhile, most management barely knows what it wants or what the actual end goal is. If managers can't deliver perfect products every time with really skilled programmers, then subtracting the programmers and substituting a magic box that delivers code whenever they ask won't make them do much better. The biggest problem is not knowing what to ask for, and even when they DO know what to ask for, they typically ignore all the fine details.
By the time there is an AI intelligent enough to coordinate a large technical operation, AIs will be capable of replacing attorneys, congressmen, patent examiners, middle managers, etc. It would really take a GENERAL artificial intelligence to be feasible here, and you'd be wildly optimistic to say we are anywhere close to having one of those available on the open market.
I don't get why the framing isn't that AI would help programmers build way better things. If it can actually replace a programmer, I think it's probably just as capable of replacing a CEO. I'd bet the CEO is the better use case to replace.
That is what happens when you mix a fucking CEO with tech: "How many workers can I fire to make more money and boast about my achievements at the annual conference of mega-yacht owners?" Whereas the correct question should obviously always have been (unless you are a psychopath): "How can I use this tech to boost the productivity of my workers so they can produce the same amount of work in less time and have more personal time for themselves?"
Also, these idiots always forget the "problem solving" part of most programming tasks, which is still beyond the capability of LLMs. Sure, have LLMs do the mundane stuff so that programmers can spend time on work that's more rewarding? No, instead let's try to fire everyone.
And anyone who believes that should be fired, because they don't understand the technology at all or what is involved in programming for that matter. At the very least it should make everyone question the company if its leadership doesn't understand their own product.
To predict what jobs AI will replace, you need to know both of the following:
What's special about the human mind that makes people necessary for completing certain tasks
What AI can do to replicate or replace those special features
This guy has an MA in industrial engineering and an MBA, and has been in business his whole career. He has no knowledge of psychology, and only whatever knowledge of AI he's picked up on the side as part of his work.
He's not the guy to ask. And yet, I feel like this is the only kind of guy anyone asks.
The sentiment on AI over the span of 10 years went from "it's inevitable it will replace your job" to "nope, not gonna happen." The difference is that back then, the jobs it was going to replace were not tech jobs. Just saying.
I managed to get an AI to build Pong in assembly. They are pretty cool things, though not sci-fi level just yet. But I didn't just say "build Pong in assembly"; I had to hand-hold it a little bit. You need to be a programmer to understand how to guide the AI through the task.
That was something very simple; I doubt you could get it to do more complex tasks without a lot more back and forth.
To give you an example, I had a hard time getting it to understand that the ball needed to bounce off at an angle if intercepted at an angle; it just kept snapping it to 90° increments. I couldn't fix it myself because I don't know assembly well enough to really get into the weeds, so I was stuck until I was finally able to get the AI to do what I wanted. I sort of understood the problem: there was a number somewhere in the system and it needed to negate that number, but it just kept setting the number to a fixed value. A non-programmer wouldn't understand that that was the problem, so they wouldn't be able to explain to the AI how to fix it.
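In Python rather than assembly (for readability; this is a hypothetical reconstruction, not the actual code), the difference was essentially this:

```python
class Ball:
    def __init__(self):
        self.y = 12.0
        self.vy = 0.75  # vertical velocity: a shallow, non-90-degree angle

    def step(self, top=0.0, bottom=24.0):
        self.y += self.vy
        if self.y <= top or self.y >= bottom:
            # What the AI kept generating was an assignment to a constant,
            # e.g. `self.vy = 1.0`, which snaps every rebound to the same
            # fixed angle. The fix is to negate the value, so the outgoing
            # angle mirrors the incoming one:
            self.vy = -self.vy
```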
I believe AI is going to become an unimaginably useful tool in the future and we probably don't really yet understand how useful it's going to be. But unless they actually make AGI it isn't going to replace programmers.
If they do make AGI all bets are off it will probably go build a Dyson Sphere or something at that point and we will have no way of understanding what it's doing.
I admit that I work faster with AI help and if people get more stuff done in less time there might be less billable hours in the future for us. But AI did not replace me, a 10 times cheaper dude from India did.
Most companies can't even give decent requirements for humans to understand and implement. An AI will just write whatever it thinks they want, and they won't have any way to really know if it's right.
They would have more luck trying to create an AI that takes whimsical ideas and turns them into quantified requirements with acceptance criteria. Once they can do that, they may stand a chance of replacing developers, but it's gonna take far more than the simpleton code generators they have at the moment, which at best are like bad SO answers you copy and paste and then refactor.
This isn't even factoring in automation testers who are programmers, build engineers, devops etc. Can't wait for companies to cry even more about cloud costs when some AI is just lobbing everything into lambdas 😂
Everyone was always joking about how AI should just replace CEOs, but it turns out CEOs are so easily led by the nose that AI companies practically already run the show.
How much longer until cloud CEOs are a thing of the past? Wouldn't an AI sufficiently intelligent to solve technical problems at scale also be able to run a large corporate division? By the time this is actually viable, we are all fucked.
Don't worry guys. As long as project managers think "do the thing ... like the thing ... (waves hands around) ... you know ... (waves hands around some more) ... like the other thing ... but, um, ..., different" constitutes a detailed spec, we're safe.
The argument I see most is that AI is dumb and can't do it yet, so we don't need to worry about this.
To me, it’s not about whether it can or not. If the people in charge think it can, they’ll stop hiring. There is a lot of waste in some big companies so they might not realize it’s not working right away.
Source: I work for a big company that doesn’t do things efficiently.
It's not like jobs will disappear in a single day. Incremental improvements will render lower-level tasks obsolete; to a degree, they already have.
Someone will still need to translate the business objectives into logical structure, via code, language, or whatever medium. Whether you call that a "coder" or not, is kind of irrelevant. The nerdy introverts will need to translate sales-douche into computer one way or another. Sales-douches are not going to be building enterprise apps from their techbro-hypespeak.
I don't think AI will replace my job any time soon when its first thought for going through a 2D matrix was to scan it 500 thousand times, checking each element with a CPU-intensive process that brought my PC to a halt until I force-stopped the script.
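Roughly this pattern, reconstructed in hypothetical Python (names and numbers invented; the real script was worse):

```python
matrix = [[(i * 37 + j * 11) % 1000 for j in range(100)] for i in range(100)]
targets = range(500_000)

def find_naive():
    # What the AI produced, roughly: one full matrix scan per target value,
    # i.e. 500k passes over 10k cells. Running this is the grind-to-a-halt.
    hits = {}
    for t in targets:
        for i, row in enumerate(matrix):
            for j, value in enumerate(row):
                if value == t and t not in hits:
                    hits[t] = (i, j)
    return hits

def find_indexed():
    # One pass builds an index; every subsequent lookup is O(1).
    index = {}
    for i, row in enumerate(matrix):
        for j, value in enumerate(row):
            index.setdefault(value, (i, j))
    return {t: index[t] for t in targets if t in index}
```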
I worked at a different MAANG company and saw internal slides showing that they planned on being able to replace junior devs with AI by 2025. I don't think it's going according to plan.
At the end of the day, one thing people forget is that even once you hit a point where an AI is capable of writing a full piece of software, a lot of businesses will still pay money to have a human read through it, validate it, and take ownership if something goes wrong. A lot of engineering is not just building something for a customer, but taking ownership of it and providing something they can trust.
I don't doubt that eventually AI will do almost all software writing, but the field of software companies isn't about to be replaced by non-software people blindly trusting an AI to do it right (and in legally compliant ways) anytime soon.