Everybody is stuck on how it can make text/video/pictures, which is neat but not nearly as useful. That side of it will overwhelmingly be used for porn, scams, and ads.
However, parsing huge datasets, extracting patterns that would otherwise be impossible to detect, and then correlating seemingly unrelated phenomena is going to be lit 🔥
Absolutely, and when you also consider the ethical challenges (copyright, the livelihood of artists), sustainability challenges (energy use), etc., the use cases you describe are not nearly as controversial as LLMs like ChatGPT.
The issue with that is that it's not as cool as generating AI slop from a few words, especially for boomers, who always thought art should just be a weekend hobby, done purely for the sake of self-enjoyment, all because art doesn't involve getting muddy, oily, or picking up "cool workplace injuries", and is therefore a fake job.
I'm working for a company that's using it for sheet metal forming: you just upload an STL and robots form the part out of a blank sheet.
Eventually we won't need dies anymore (good for the environment) and we'll make sheet metal parts more efficiently (Jevons paradox territory, but we need to reduce total consumption anyway).
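For anyone curious what "upload an STL and the robots form it" might look like under the hood, here's a minimal sketch of the slicing idea behind incremental sheet forming. The file name, step size, and library choice (trimesh) are my own assumptions for illustration; the real pipeline is obviously far more involved.

```python
# Rough sketch of the core idea: slice an uploaded STL into horizontal
# contours that a forming robot could trace pass by pass.
# "part.stl" and the 0.5 mm step are hypothetical; real incremental-forming
# toolpaths also deal with tool radius, wall-angle limits, springback, etc.
import numpy as np
import trimesh

mesh = trimesh.load("part.stl")        # the uploaded part geometry
z_min, z_max = mesh.bounds[:, 2]       # height range of the part
step = 0.5                             # forming depth per pass, in mm (assumed)

contours = []
for z in np.arange(z_min + step, z_max, step):
    section = mesh.section(plane_origin=[0, 0, z], plane_normal=[0, 0, 1])
    if section is not None:
        contours.append(section)       # closed curves the tool would follow at this depth

print(f"{len(contours)} contour levels to form")
```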
IMHO we should have Pigouvian pollution taxes and then let the market decide which ideas are worth pursuing.
AI is a great tool for VFX and plenty of other things. It's been the norm for decades. Fake techies turned it into a buzzword. It's a great way to enable up-and-comers without resources to grow. If you're lazily replacing people in a workflow, of course, that's predatory. That's what a lot of CEOs *want*, but they won't achieve it. In reality, these tools are being created by the people they're supposed to "replace" to make their jobs easier. I have a passion for art, programming, and a lot of other things that are "affected" by AI. So far, it seems like fear mongering. Traditionalists always get fucked in the art world. You just kinda shoot for it. (I work in graphite and animation)
You just need to proofread stupidly long, overly polite emails to make sure they're actually overly polite and don't tell the recipient random made-up bullshit.
The issues with those use cases are the normalization of such technologies on a larger scale, and the eventual reduction of the artistic process to just having a single idea.
If we were in a post-capitalistic world, I wouldn't be as concerned about the normalization part. However, one of my biggest fears is that the anti-AI movement gets tired out, and then with better AI technologies and sneakier uses, it gets normalized even more.
When I'm creating, I'm also interested in the implementation of the idea, not just the idea itself. Generative AI simply reduces the creative process to "coming up with ideas". And a "good idea" does not guarantee "good outcomes". I cannot count the number of good ideas wasted on bad execution, including AI-generated stuff. In those cases, many good ideas were just put into a generator instead of actually going through the creative process.
Sure, AI could become better, and many "AI prompters" could graduate into "AI art directors". There's one problem with that: it could also kill AI art, as its biggest selling point to its customers and fans is its reduction of the artistic process to coming up with ideas.
You'd know how much the means of creating art have changed over the centuries. Different or more time-efficient does not mean worse.
Also, if you've been an artist for a few decades now, you'd have been around when digital art was introduced and remember the complaints it raised among traditional artists.
The complaints here are very similar to those. It's just a new tool. It can be used to make good or bad art, same as a Photoshop brush. And Adobe is as bad and as big a corporation as OpenAI (probably bigger and worse).
And no, making AI art is not instant, nor is it just typing "make me a nice bunny" and enjoying the result. It also has a process, with many steps and iterations, and if what you're aiming for is something good, a lot of the time it needs to be finished with traditional digital art. Once again, it's just a tool; how it's used is up to the artist.
I know perfectly well that this is not about the "integrity of art". This is mostly about "commission art" or "industrial filler art" (like unimportant video game assets, backgrounds, etc.) that was paying the bills for many people and has been seriously threatened by generative AI, because for the people paying for that type of art, the results of an AI model are good enough at a fraction of the price.
But again, it's the same thing that happened before with digital art. Back then, far more traditional artists were needed to get the same result that fewer digital artists deliver now.
Progress has always killed jobs, and people have needed to learn new skills. That's why we need social protection systems, so people can stay employed despite that.
Genning is still power hungry and expensive, but it has its applications. The problem is that industrialists want to do with it what they've wanted to do with every previous step of automation, which is to replace workers with it.
And the problem is, the way people justify their existence to the societies we have is through employment or profits. If you don't have those, you go homeless, and now, according to SCOTUS, you are an unperson.
And so now it is conspicuous any time an employer lays someone off or removes them from a job, even to maximize profits. That is a life-threatening action, and it raises the question of whether institutions exist for humankind, or vice versa. If it's vice versa, then Viva la revolución! Party like it's 1789! The ownership class will tremble!
But for now we seem happy to let billionaires put all their resources into making their number go up and stopping us from resisting this impulse by force. Including robot dogs with guns.
When La Résistance started organizing in occupied Paris, it was because the German garrison picked that fight. The Germans couldn't help themselves (despite orders to police gently) but be brutal and abusive against the French, and individuals in the public felt compelled to misbehave in small acts of resistance (slashing tires, defacing propaganda posters, cutting phone lines). Things escalated from there.
I don't get this massive hate for AI. I am running it on my PC locally and have been using it as a toy to make some funny images, videos, and voice clones to make my family and friends laugh. I might be naive, but why the hate towards a tool that nobody forces you to use? It has its problems, sure, but to me it looks just like painters being presented with a camera.
"why the hate towards a tool that nobody forces you to use?"
Right here is the problem, because many companies like Adobe and Microsoft have made obtrusive AI that I would really like to not interact with, but don't have a choice at my job. I'd really like to not have to deal with AI chatbots when I need support, or find AI written articles when I'm looking for a how-to guide, but companies do not offer that as an option.
Capitalists foam at the mouth imagining how they can replace workers, when the reality is that they're creating worse products that are less profitable by not understanding the fundamental limitations of the technology. Hallucinations will always exist because they happen in the biology they're based on. The model generates bullshit that doesn't exist in the same way that our memory generates bullshit that didn't happen. It's filling in the blanks of what it thinks should go there; a system of educated guesswork that just tries to look convincing, not be correct.
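That "educated guesswork" point can be shown with a toy sketch: at each step the model just produces a probability distribution over plausible next tokens and samples from it, and nothing in that loop checks whether the result is true. The tokens and logits below are invented purely for illustration.

```python
# Toy illustration: an LLM picks the next token by sampling from a probability
# distribution over "plausible" continuations. Nothing in this loop checks
# facts, so a fluent-but-wrong option can easily win the draw.
# The tokens and logits are made up for the example.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical candidates for "The capital of Australia is ..."
tokens = ["Canberra", "Sydney", "Melbourne", "Auckland"]
logits = np.array([2.0, 1.6, 0.8, -1.0])        # pretend model scores

probs = np.exp(logits) / np.exp(logits).sum()   # softmax -> probabilities
choice = rng.choice(tokens, p=probs)            # sample one continuation

print(dict(zip(tokens, probs.round(2))), "->", choice)
# "Sydney" is wrong but plausible, and it gets sampled roughly a third of the time.
```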
In a lot of cases you are forced to use AI. Corporate "support" chatbots (not new, but still part of the cause for fatigue), AI responses in search engines that are shown without you asking for them and tend to just be flat out incorrect, Windows Recall that captures constant screenshots of everything you do without an option to uninstall it, etc.
And even if you're not directly prompting an AI to produce some output for you, the internet is currently flooded with AI-written articles, AI-written books, AI-produced music, AI-generated images, and more that tend to not be properly indicated as being from AI, making it really hard to find real information. This was already a problem with generic SEO stuffing, but AI has just made it worse, and made it easier for bad actors to pump out useless or dangerous content, while not really providing anything useful for good actors in the same context.
Pretty much all AI available right now is also trained on data that includes copyrighted work (explicitly or implicitly, this work shouldn't have been used without permission), which a lot of people are rightfully unhappy about. If you're just using that work for your own fun, that's fine, but it becomes an issue when you then start selling what the AI produces.
And even with all of that aside, it's just so goddamned annoying for "AI" to be shoved into literally everything now. AI CPUs, AI earbuds, AI mice, AI vibrators, it never ends. The marketing around AI is incredibly exhausting. I know that's not necessarily the fault of the technology, but it really doesn't help make people like it.
I can code 3 or 4x faster using an LLM than I can without. Granted, most of the stuff I have to write is under 200 lines; AI becomes significantly less useful when the codebase is any larger than that.
I realize I'm also an outlier. Most people didn't get such a productivity boost.
80% of my programming work is solving problems and designing stuff. The only productivity boost I got is when working with proprietary libraries that have most of their documentation in customer support tickets (wouldn't be a problem if I could just read the bloody source code, or if our company didn't think that paying UNHOLY AMOUNTS OF MONEY for shit makes it better), or when interacting with a new system where I know exactly what I want but just don't know the new syntax or names. It's handy, but definitely not a game changer.
I think the problem is capitalism: AI is used to make things better for the capital owner without making anything realistically better for the worker, and sometimes makes things worse because the owner is an out-of-touch idiot.
Template writers and clipart generators are peachy. They save us time. The people who develop those mostly have positive intentions. There's nothing wrong with sustainable research and progress in software of this sort.
You should be focusing on the salespeople, the investors (speculators), the marketers, the corporate buyers. These people have mostly bad intentions.
I had to admonish a couple of people for using it for shits and giggles. It felt a bit off, but I managed to salvage the situation with an explanation, which felt neater in the end because both times we agreed AI needs less attention rather than more (i.e. don't play around with it; if you want to see it "in action" or need an example, there is already enough generated crap in the sink).
(Just to give people more reason to downvote my comment: China is a fake communist state collaborating with far-right dictatorships, Tiananmen was real, free Tibet, stop the Uyghur reeducation camps, Putin is an imperialist, all glory to Ukraine, and gatekeeping of fandoms sucks!)
yes (the environmental angle is a complete distraction and red herring that diverts attention from moving towards more sustainable energy production generally; the ethical one is just plain nonsense spread by people with absolutely no idea how these things work)
yes (people use and like them, people have fun with them and create great art with them. You might not but that’s a you problem)
and … well actually probably no tbh (but that’s a problem with capitalism not technology).
Stop making shit up to dismiss new technology because you’re a luddite
In theory a valid point, and I approve, but I'd disagree on the sustainability factor. Even with renewables we won't have infinite energy, so we still need to think about what we want to use the available energy for. The training is already ridiculously expensive, and should the model become extremely popular, it might still pull a lot of energy.
It needs to be evaluated on a per-model usefulness / energy consumption basis.
And I doubt most current models would score very highly on that (but please tell me about those that do).
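For what it's worth, the comparison could be as simple as a ratio; here's a trivial sketch. Every number in it is an invented placeholder, not a measurement of any real model.

```python
# Trivial sketch of a "usefulness per unit of energy" comparison.
# All figures below are invented placeholders, not real benchmarks.
models = {
    # name: (benchmark_score, kWh_per_1k_queries) -- both hypothetical
    "small-local-model": (62.0, 0.4),
    "big-hosted-model": (78.0, 5.0),
}

for name, (score, kwh) in models.items():
    print(f"{name}: {score / kwh:.1f} score points per kWh")
```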
The amount of solar energy that hits the Earth every day is enough to power it for a year, and that's not including wind, wave, or anything else. (Yes, I know we can't gather it all, but the amount of energy available is absolutely gigantic.)
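The claim checks out on the back of an envelope, and is actually an understatement. A quick sketch, using round public figures for the solar constant and world primary energy consumption:

```python
# Back-of-envelope check of "a day of sunlight vs. a year of consumption".
# The solar constant and world energy use are round public figures.
import math

solar_constant_w_m2 = 1361                       # W/m^2 at the top of the atmosphere
earth_radius_m = 6.371e6
cross_section_m2 = math.pi * earth_radius_m**2   # disc the Earth presents to the Sun

power_intercepted_w = solar_constant_w_m2 * cross_section_m2   # ~1.7e17 W
energy_per_day_j = power_intercepted_w * 86_400                # ~1.5e22 J

world_annual_consumption_j = 6e20                # ~600 EJ of primary energy per year

print(f"Daily solar input ≈ {energy_per_day_j:.1e} J")
print(f"≈ {energy_per_day_j / world_annual_consumption_j:.0f}× humanity's annual energy use")
# Even before conversion losses, one day of sunlight dwarfs a year of consumption.
```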