Retool, a development platform for business software, recently published the results of its State of AI survey. Over 1,500 people took part, all from the tech industry:
Over half of all tech industry workers view AI as overrated.
I think it will be the next big thing in tech (or "disruptor" if you must buzzword). But I agree it's being way over-hyped for where it is right now.
Clueless executives barely know what it is; they just know they want to get ahead of it in order to remain competitive. Marketing types reporting to those executives oversell it (because that's their job).
One of my friends is an overpaid consultant for a huge corporation, and he says they are trying to retrofit AI onto things where it barely makes any sense... just so they can say it's "powered by AI".
On the other hand, AI is much better at some tasks than humans. That AI skill set is going to grow over time. And the accumulation of those skills will accelerate. I think we've all been distracted, entertained, and a little bit frightened by chat-focused and image-focused AIs. However, AI as a concept is broader and deeper than just chat and images. It's going to do remarkable stuff in medicine, engineering, and design.
It is overrated. At least when people treat AI as some sort of brain crutch that spares them from actually learning things.
My boss now believes he can "program too" because he lets ChatGPT write scripts for him that, more often than not, are poor BS.
He also pastes chunks of our code into ChatGPT whenever we file a bug or aren't finished with everything in five minutes, as some kind of "gotcha" moment, ignoring that the solutions he then provides don't work.
Too many people treat LLMs as authorities, which they just aren't.
Many areas of machine learning, particularly LLMs, are making impressive progress, but the usual Y Combinator techbro types are overhyping things again. Same as every other bubble, including the original Internet one, the crypto scams, and half the bullshit companies they run that add fuck-all value to the world.
The cult of bullshit around AI is a means to fleece investors. Seen the same bullshit too many times. Machine learning is going to have a huge impact on the world, same as the Internet did, but it isn't going to happen overnight. The only certain thing that will happen in the short term is that wealth will be transferred from our pockets to theirs. Fuck them all.
I skip most AI/ChatGPT spam on social media with the same ruthlessness I skipped NFTs. It isn't that ML doesn't have huge potential, but most publicity about it is clearly aimed at pumping up the market rather than genuinely informing people about the technology.
It is overrated. It has a few uses, but it's not a generalized AI. It's like calling a basic calculator a computer. Sure, it's an electronic computing device, and it makes a big difference in calculating speed for finance or retail cashiers or whatever. But it's not a generalized computing system that can compute basically anything it's given instructions for, which is what we think of when we hear that something is a "computer". It can only do basic math. It could never be used to display a photo, much less run a complex video game.
Similarly, the current thing that's called "AI" can learn only within the narrow subject it was designed for. It can't learn just anything, can't make inferences beyond its training material, and doesn't understand. It can't create anything totally new; it just remixes things. It could never actually create a new genre of games with some kind of interface that has never been thought of, or discover the exact mechanism of how gravity works, because those things aren't in its training material; they don't exist yet.
I remember when it first came out, I asked it to help me write a MapperConfig custom strategy, and the answer it gave me was so fantastically wrong, even with prompting, that I lost an afternoon. Honestly, the only useful thing I've found for it is getting it to spot potential syntax errors in Terraform code that the plan might miss. It doesn't even complement my programming skills the way a traditional search engine can; instead it assumes a solution that is usually wrong, and you're left trying to build your house on the boilerplate sand it spits out at you.
I have a doctorate in computer engineering, and yeah, it's overhyped to the moon.
I'm oversimplifying, and someone will ackchyually me, but once you understand the core mechanics, the magic is somewhat diminished. It's linear algebra and matrices all the way down.
We got really good at parallelizing matrix operations and storing large matrices, and the end result is essentially "AI".
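To make that concrete, here's a minimal sketch of a two-layer network forward pass in Python/NumPy. The sizes and random weights are made up for illustration, but structurally this is the whole trick, just repeated at enormous scale:

```python
import numpy as np

rng = np.random.default_rng(0)

# The "model" is nothing but stored matrices (toy sizes here).
W1 = rng.standard_normal((512, 768))  # layer 1 weights
W2 = rng.standard_normal((768, 512))  # layer 2 weights

def forward(x):
    """One forward pass: matrix multiply, nonlinearity, matrix multiply."""
    h = np.maximum(0, x @ W1)  # ReLU(x @ W1)
    return h @ W2

x = rng.standard_normal((1, 512))  # one input vector
print(forward(x).shape)            # (1, 512): linear algebra all the way down
```

GPUs are just very good at doing those multiplications in parallel; scale the matrices up to billions of parameters and you get today's "AI".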
That's because it is overrated, and the people in the tech industry are actually qualified to make that determination. It's a glorified assistant, nothing more. We've had these for years; they're just getting a little bit better. It's not gonna replace a network stack admin or a programmer anytime soon.
There is a lot of marketing about how it's going to disrupt every possible industry, but I don't think that's reasonable. Generative AI has uses, but I'm not totally convinced it's going to be this insane omni-tool just yet.
It is currently overhyped, and so much of it just seems to be copying the same three generative AI tools into as many places as possible. This won't work out, because running the AI models is expensive. I can't believe nobody talks about this cost.
Where AI shines is when something new is done with it, or there is a significant improvement in some way to an existing model (more powerful or runs on lower end chips, for example).
Of course, because the hype didn't come from tech people but from content writers, designers, PR people, etc., who all thought they didn't need tech people anymore. The moment ChatGPT started getting popular, I started getting debugging requests from a few designers. They went and asked it to write a plugin or a script they needed. The only problem was that it didn't really work like it should. Debugging that code was a nightmare.
I've seen a few clever uses. A couple of our clients made a "chat bot" whose reference material was their poorly written documentation. So you'd ask the bot something technical related to that documentation, and it would decipher the mess. I still claim writing better documentation would have been the smarter move, but what do I know.
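For the curious, that kind of docs bot is usually a small retrieval-augmented generation loop: index the documentation in chunks, fetch the chunk closest to the question, and let the model answer grounded in it. A minimal sketch; the toy bag-of-words "embedding", the sample chunks, and the stubbed `llm` callable are hypothetical stand-ins, not the clients' actual stack:

```python
from collections import Counter
import math

def embed(text):
    """Toy 'embedding': a bag-of-words count vector. Real bots use a trained model."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = lambda v: math.sqrt(sum(x * x for x in v.values()))
    na, nb = norm(a), norm(b)
    return dot / (na * nb) if na and nb else 0.0

# 1. Split the documentation into chunks and index them.
doc_chunks = [
    "To rotate the API key, open Settings > Security and click Rotate.",
    "Webhook retries use exponential backoff, up to five attempts.",
]
index = [(chunk, embed(chunk)) for chunk in doc_chunks]

def answer(question, llm):
    # 2. Retrieve the chunk most similar to the question.
    best = max(index, key=lambda item: cosine(embed(question), item[1]))[0]
    # 3. Have the model answer grounded in that chunk instead of from memory.
    prompt = f"Answer using only this documentation:\n{best}\n\nQ: {question}"
    return llm(prompt)

# `llm` would be whatever model you call; stubbed here so the sketch runs.
print(answer("How do I rotate my API key?", llm=lambda p: p.splitlines()[1]))
```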
I'll join in on the cacophony in this thread and say it truly is way overrated right now. Is it cool and useful? Sure. Is it going to replace all of our jobs and do all of our thinking for us from now on? Not even close.
I, as a casual user, have already noticed some significant problems with the way it operates, such that I wouldn't blindly trust any output I get without some serious scrutiny. AI is mainly being pushed by upper-management types who don't understand what it is or how it works, but they hear that it can generate stuff in a fraction of the time a person can, and they start to see dollar signs.
It's a fun toy, but it isn't going to change the world overnight.
On a podcast I listen to where tech people discuss security topics, they finally got to something related to AI. The speaker hesitated, snickered, and said, "'Artificial Intelligence' I guess is what I have to say now instead of 'Machine Learning'," and then both the host and the guest just belted out laughs for a while before continuing.
On one hand, there's the emergence of the best chat bot we've ever created. Neat, I guess.
On the other hand, there's VC money scurrying around for the next big thing to invest in, lazy journalism looking for a source of new content to write about, talentless middle management looking for something to latch onto so they can justify their existence through cost cutting, and FOMO from people who don't understand that it's just a fancy chat bot.
I use github copilot. It really is just fancy autocomplete. It's often useful and is indeed impressive. But it's not revolutionary.
I've also played with ChatGPT and tried to use it to help me code, but never successfully. The reality is I only try it if Google has failed me, and then it usually makes up something that sounds right but is in fact completely wrong. Probably because it's been trained on the same insufficient data I've been looking at.
As with all tech: it depends. It's another tool in my toolbox, and a useful one at that. Will it replace me in my job? Not anytime soon. However, it will make me more proficient at my job, and my 30+ years of experience will keep its bad ideas out of production. If my bosses decide tomorrow that I can be replaced with AI in its current state, they deserve what they have coming. That said, they are willing to pay for additional tooling, providing me with multiple AI engines, and I couldn't be more thrilled. I'd rather give AI a simple task to do the busy work than work with overseas developers who get it wrong time and time again and take a week to iterate while asking how for loops work in Python.
In my experience, well over half of tech industry workers don't even understand it.
I was just trying to explain to someone on Hacker News that no, the "programmers" of LLMs do not in fact know what the LLM is doing, because it's not being programmed directly at all (which, even after several rounds of several people explaining, still doesn't seem to have sunk in).
Even people who understand the tech pretty well more generally are still remarkably misinformed about it in various popular BS ways, such as claiming it's just statistics and a Markov chain, completely unaware of the multiple studies over the past 12 months showing that even smaller toy models are capable of developing abstract world models, as long as they can be structured as linear representations.
It's gotten to the point that, unless it's a thread explicitly about actual research papers where explaining nuances seems fitting, I don't even bother trying to educate the average tech commentators regurgitating misinformation anymore. They typically only want to confirm their biases anyway, and they have such a poor understanding of the specifics that it's like explaining nuanced aspects of the immune system to anti-vaxxers.
That is a terrible graph. There's no y-axis, there's no indication of what the scale is, and I don't know how many people they asked, who these people were, or what tech companies they worked in.
Just over 23% believe it is rated fairly, while a quarter of respondents were presumably proponents of the tech as they said it was underrated. However, 51.6% of people said it was overrated.
That sentence is a fantastic demonstration of how bad this article is. The article says that a quarter say the technology is underrated, but it looks more like half to me. Not that it matters, because, as I said, the scale is useless. They also quote 51.6%, and I don't know how they came up with that number, because again we don't know what the total was, just that it was more than 1,500. You can't calculate a percentage without knowing the total.
The graph has 11 options, so were they rating it on a scale of 1 to 11? What kind of scale is that?
I work at an AI company. 99% of our tech relies on tried-and-true standard computer vision solutions rather than machine-learning-based ones. ML is just that unreliable when production use requires pixel precision.
We might throw in gradient descent here or there, but not for any learning ops.
The best use I've found for AI is getting it to write me covering letters for job applications. Even then I still need to make a few small adjustments. But it saves a bit of time and typing effort.
Other than that, I just have fun with it making stupid images and funny stories based on inside jokes.
I asked ChatGPT to generate a list of 5 random words and then tell me the fourth word from the bottom. It kept telling me the third. I corrected it, and it gave me the right word. I asked it again, and it made the same error. It does amazing things while failing comically at simple tasks. There is a lot of procedural code added to plug the leaks. That doesn't mean it's overrated, but when something is hyped hard enough as being able to replace human expertise, any crack in the system becomes ammunition for dismissal. I see it more as a revolutionary technology going through evolutionary growing pains. I think it's actually underrated in its future potential, and worrisome in the fact that its processing is essentially a black box that can't be understood at the same level as traditional code. You can't debug it or trace the exact procedure that needs patching.
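For contrast, that same task is a one-liner in the kind of procedural code you can debug and trace; the word list here is just an illustrative example:

```python
words = ["apple", "river", "candle", "orbit", "thunder"]  # any five words
# Counting from the bottom: thunder (1st), orbit (2nd), candle (3rd), river (4th).
print(words[-4])  # "river", deterministically, every time
```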
Well, it depends on your bubble, I guess. But personally I'd say it's underrated and overrated at the same time, though mostly underrated.
It depends on your expectations and on how you use it in your toolbox, I'd say. It keeps surprising me weekly how fast progress is. But perhaps we just get used to it.
Overrated? Compared to what, AGI that doesn't exist yet? Overhyped, though? Absolutely.
We went from very little AI content making its way to your eyeballs and ears to it occurring daily, if not during your very session here today. So many thumbnails and writeups have used AI that calling it overrated is a bit absurd. If you were expecting it to be AGI, then yes, today's AI is overrated, but it doesn't matter, because you're consuming it anyway.
It's an effective tool for providing introductory information on well-documented topics. A smarter Google Search, basically. And that's all I really want it to be. Overrated? Probably not. It's useful if you use it correctly. Overhyped? Yeah, but that's more a fault of marketing than of the technology.
Meh. Roughly 90% of what I know about baking is from ChatGPT. There just wasn't a comparable resource. "Oh God, the dough is too dry," "Can I sub in this fat for that fat, and if so, how?", "If I change the bath, do I have to change the scoring method?"
It's like having a professional baker I can just talk to whenever. I'm sure that as I get better at baking I will exceed its ability to help, but I can't deny that what I've accomplished so far I could not have in the same timeframe without it.
As a college student, I agree with the idea and the statement that AI is overrated. It'll definitely have its place in this world, but I don't foresee us being able to utilize it to the fullest before we end up in a nuclear hellhole.
Generative AI is great, but I don't expect it to fully replace human workers, or at least not entirely. The current tech has a lot of limitations. I can see it greatly improving work productivity and handling some monotonous, mundane tasks, but it would still require some sort of human supervision.
The overhype is a double-edged sword. On the one hand, we have to deal with shitty chatbots being shoved into every product, and every marketing campaign telling you how AI in your toaster is going to revolutionize the world.
On the other hand, it being oversold might mean we aim ahead of the target for once on regulations. I just want us to actually get laws in place before capabilities for misinformation, deep fakes, etc. get truly scary.
Using the "Mistral instruct" model to select your food in the canteen works like a charm.
Just provide it with the daily option, tell it to select one main-, side dish and a dessert and explain the selection. Never let me down. Consistently selects the healthier option that still tastes good.
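If you want to try the same trick, a minimal sketch might look like this. It assumes a locally hosted model behind an OpenAI-compatible chat endpoint (many local runners expose one); the URL, model name, menu, and prompt wording are all made up for illustration:

```python
import json
import urllib.request

menu = """Mains: schnitzel, lentil curry
Sides: fries, steamed vegetables
Desserts: pudding, fruit salad"""

payload = {
    "model": "mistral-instruct",  # whatever name your local server registers
    "messages": [{
        "role": "user",
        "content": f"Today's canteen options:\n{menu}\n\n"
                   "Pick one main dish, one side dish, and one dessert. "
                   "Prefer the healthier choices that still taste good, "
                   "and briefly explain your selection.",
    }],
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",  # hypothetical local endpoint
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)
print(reply["choices"][0]["message"]["content"])
```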
"Half of tech workers" is exactly the expected number, completely unsurprising. The absolute null hypothesis. Why? Well, since the days of ENIAC the number of programmers doubled roughly every three years. That means that at any point in time, half have less than three years of experience and thus no idea what they're doing. In the past roughly decade, maybe one and a half, most new devs went into webdev which explains the javascript ecosystem, now AI is getting a taste of that.
Even though it's true, these articles still make me laugh; they read like they were written by the popular girl in class when the new, prettier girl shows up. Sure, Becky, we get it: Kelly dated a 38-year-old and snorts like a pig when she eats.
Yeah, and my mom's a tech worker who just tried ChatGPT for the first time today. I wouldn't go listening to her opinion on the topic, but she also wouldn't offer it with no knowledge or experience of it.
It's very overrated, and devs out there, if you are using it, the rest of us can tell, and not in a good way.
Edit: Some of you guys are really butthurt, and it's fairly amusing. I'm sure the guys like you that I work with, the ones who produce unusable code, would also be butthurt if I said that to them.