AI rule
AI CEOs be like
Online Communism 😃
Real Life Communism 😠
Communism for fascists and fascism for the commons. It's the American way.
More like theft of data = "communism"
I remember when copying data wasn't theft, and the entire internet gave the IP holders shit for the horrible copyright laws...
I work as an AI engineer, and any company that has a legal department that isn't braindead isn't stealing content. The vast majority of content posted online is done so under a creative commons licence, which allows other people to do basically whatever they want with that content, so it's in no way, shape, or form stealing.
If it's published online, it's not theft. That's like saying that if I publish a book and someone uses it to learn a language, they're stealing my book.
Well, theft of the labor of contributors to give to non contributors is communism ... So, your statement is true, it's just more broad than that.
Online communism unless you actually want something for free.
Intellectual property is fake lmao. Train your AI on whatever you want
"Artists don't deserve to profit off their own work" is stupid as shit. Complain about copyright abuse and lobbying a la Disney and I'll be right there with you, but people shouldn't have the right to take your work and profit off it without either your consent or paying you for it.
Artists and other creatives who actually do work to create art (not shitting out text into an image generator) should take every priority over AI "creators."
No you don't understand, the machine works exactly like a human brain! That makes stealing the work of others completely justifiable and not even really theft!
/s, bc apparently this community has a bunch of dumbass tech bros that genuinely think this
Equating training AI to not being able to profit is stupid as shit, and it's the same bullshit argument big companies use to say "we lost a bazillion dollars to people pirating our software." Someone training their AI on an artwork (that is probably under a creative commons licence anyway) doesn't suck money out of an artist's pocket that they would have otherwise made.
Artists and other creatives who actually do work to create art (not shitting out text into an image generator) should take every priority over AI "creators."
Why are you the one that gets to decide what is "work" to create art? Should digital artists not count because they are computer assisted, don't require as much skill and technique as "traditional" artists and use tools that are based on the work of others like, say, brush makers?
And the language you use shows that you're vindictive and angry.
"I can" and "you can't" are not opposites.
No. Fuck that. I don't consent to my art or face being used to train AI. This is not about intellectual property, I feel my privacy violated and my efforts shat on.
Unless you have been locked in a sensory deprivation tank for your whole life, and have independently developed the English language, you too have learned from other people's content.
Then don't post your art or face publicly. I agree with you if it's obtained through malicious means, but if you post it publicly, then expect it to be used publicly.
If the large corporations can use IP to crush artists, artists might as well try to milk every cent they can from their labor. I dislike IP laws as well, and you can never use the masters' tools to dismantle their house, but you can sure as shit do damage and get money for yourself.
Luckily, AI aren't the master's tools, they're a public technology. That's why they're already trying their hand at regulatory capture. Just like they're trying to destroy encryption. Support open source development, it's our only chance. Their AI will never work for us. John Carmack put it best.
Based take.
Me, literally training a neural net to generate pictures of carrot cakes right now:
Best not be including my carrot erotica in your training data.
I feel the current AI crawling bots + "opt-out your data" tactic is ingeniously evil.
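For what it's worth, the main opt-out mechanism right now is robots.txt. The user-agent tokens below are the ones the major crawlers publish (GPTBot for OpenAI, CCBot for Common Crawl, Google-Extended for Google's AI training), but this is a sketch, and honoring it is entirely voluntary on the crawler's part:

```text
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Which is exactly the problem: the burden is on every site owner to know each crawler's name and opt out, and nothing forces compliance.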
It's hilarious really
Companies have been stealing data for so long, and then when another company comes and steals their data by scraping it, they go surprised Pikachu.
The best part is the end result, not where the data comes from. Tired of hearing about AI models "stealing" data. I put all my art, designs and code online and assume it'll be used to train models (which I'll be able to use later on)
You're fine with a corporation making money off your copyrighted work? Without seeing a cent of it?
Why not? Same as a person being inspired to reuse certain aspects. Artists reuse other artists work constantly and usually more blatantly than what AI does.
I wouldn't want to copyright every visual pattern conceivable, everything would be a copyright violation of some sort.
So long as it's correctly attributed it's no more immoral than what a corpo inherently does.
I interpreted this less as being about art models and more about predictive models trained off of data with racial prejudice (i.e. crime prediction models trained off of old police data)
You're putting your stuff out there so some cryptobro cracker can steal it, claim credit for it, and then call you a peasant for helping him make it.
It's sort of an accelerationist take on broken IP law, I'm sure of it.
One thing I've started to think about for some reason is the problem of using AI to detect child porn. In order to create such a model, you need actual child porn to train it on, which raises a lot of ethical questions.
Cloudflare says they trained a model on non-cp first and worked with the government to train on data that no human eyes see.
It's concerning there's just a cache of cp existing on a government server, but it is for identifying and tracking down victims and assailants, so the area could not be more grey. It is the greyest grey that exists. It is more grey than #808080.
Well, many governments had no issue taking over a CP website and hosting it for months to come, using it as a honeypot. Still, they hosted and distributed CP, possibly to thousands of unknown customers who could redistribute it.
I'm pretty sure those AI models are trained on hashes of the material, not the material directly, so all you need to do is save a hash of the offending material in the database any time that type of material is seized
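The hash-database approach described here is just set membership, not machine learning. A minimal sketch of the idea (using SHA-256 for illustration; real systems like PhotoDNA use perceptual hashes that survive resizing and re-encoding, and the hash value below is hypothetical):

```python
import hashlib

# Hypothetical database of hashes of previously seized material.
# (This example value is the SHA-256 of the empty byte string.)
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_material(data: bytes) -> bool:
    """Flag an upload if its hash matches a known-bad entry."""
    return file_hash(data) in KNOWN_HASHES
```

The catch with exact hashes is that changing a single pixel produces a completely different digest, which is why production systems use perceptual hashing instead.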
That wouldn't be ai though? That would just be looking up hashes.
You absolutely do not need real CSAM in the dataset for an AI to detect it.
It's pretty genius actually: just like you can make the AI create an image with prompts, you can get prompts from an existing image.
An AI detecting CSAM would have to be trained on nudity and on children separately. If an image-to-prompts conversion results in "children" AND "nudity", it is very likely the image was of a naked child.
This has a high false positive rate, because non-sexual nude images of children, which quite a few parents have (like images of their child bathing), would be flagged by this AI. However, the false negative rate is incredibly low.
It therefore suffices for an upload filter for social media but not for reporting to law enforcement.
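The combine-the-labels logic described above boils down to an AND over two independent classifier outputs. A toy sketch, with the scores as plain numbers (in practice they would come from separately trained detectors; the label names and threshold are illustrative assumptions):

```python
def should_flag(scores: dict[str, float], threshold: float = 0.8) -> bool:
    """Flag an image only when BOTH labels score above the threshold.

    `scores` maps label names to confidences in [0, 1], e.g. the
    outputs of separate image classifiers for each concept.
    """
    return (
        scores.get("child", 0.0) >= threshold
        and scores.get("nudity", 0.0) >= threshold
    )
```

Because the flag requires both scores to clear the threshold, lowering the threshold trades false negatives for false positives, which is why it works as an upload filter but not as evidence.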
This dude isn't even whining about the false positives, they're complaining that it would require a repository of CP to train the model. Which, yes, some are certainly being trained with the real deal. But with law enforcement and tech companies already having massive amounts of CP for legal reasons, why the fuck is there even an issue with having an AI do something with it? We already have to train mods on what CP looks like; there is no reason it's more moral to put a human through this than a machine.
This is a stupid comment trying to pass as philosophical. If your website is based in the US (like 80 percent of the internet is), you are REQUIRED to keep any CSAM uploaded to your website and report it. Otherwise, you're deleting evidence. So all these websites ALREADY HAVE giant databases of child porn. We learned this when Lemmy was getting overrun with CP and DB0 made a tool to find it. This is essentially just using shit any legally operating website would already have around the office, and having a computer handle it instead of a human who could be traumatized or turned on by the material. Are websites better for just keeping a database of CP and doing nothing but reporting it to cops who do nothing? This isn't even getting into how moderators that look for CP STILL HAVE TO BE TRAINED TO DO IT!
Yeah, a real fuckin moral quandary there, I bet this is the question that killed Kant.
Never forget: businesses do not own data about you. The data belongs to the data subject, businesses merely claim a licence to use it.
Legally, businesses very much own the data about you unfortunately.
No, they very explicitly don't. They claim a licence in perpetuity to nearly all the same rights as the data owner, but the data subject is still the owner.
Also, that licence may not be so robust. A judge should see that the website has no obligation to continue hosting the data, and that they offer nothing in return for it, so the perpetual licence is not a reasonable term in the contract and should be struck down to something the data subject can rescind. In some respects we do have this with "the right to be forgotten" and the right to have businesses delete your data, however the enforcement of this is sorely lacking.
Laws change over time, though. Everyone is the victim of this practice, so eventually the law should catch up.
For Stable Diffusion, it comes from images on Common Crawl through the LAION-5B dataset.
Jesus Christ who called in the tech bro cavalry? Get a fucking life losers you're not artists and nobody is proud of you for doing the artistic equivalent of commissioning an artist (which you should be doing instead of stealing their art and mashing it into a shitty approximation of art)
It's like photography. Photography + photoshop for some workflows. There's a low barrier to entry.
Would you say the same thing to someone proud of how their tracing came out?
These are not comparable to AI image generation.
Even tracing has more artistic input than typing "artist name cool thing I like lighting trending on artstation" into a text box.
Annoying and aggressive about it to the point where you'd like to wring their necks? Yeah, that's exactly what that's like.
it's funny, but still, where did you get the data? XD
From Reddit comments.
Content platforms could start to poison their data with random AI generated garbage
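The poisoning idea amounts to content negotiation on the User-Agent header: serve garbage to known AI crawlers, real pages to everyone else. A hypothetical sketch of the routing decision (the agent list is illustrative, and real crawlers can trivially spoof their User-Agent):

```python
# User-agent substrings published by major AI crawlers.
AI_CRAWLER_AGENTS = ("GPTBot", "CCBot", "Google-Extended")

def choose_response(user_agent: str, real_page: str, garbage_page: str) -> str:
    """Return the garbage page for known AI crawlers, the real page otherwise."""
    if any(bot in user_agent for bot in AI_CRAWLER_AGENTS):
        return garbage_page
    return real_page
```

The obvious weakness is that it only catches crawlers that identify themselves honestly; anyone scraping with a browser User-Agent gets the real content.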
All damn time as a project committee.
found it