Joke aside, every time people gush over AI, I always have to remind them that AI is just a puppy that learnt how to maximise treats, not one that actually understands shit. And this is a perfectly good example.
Right??? I’m continually floored by how many genuinely smart people I come across who ignore this concept, which is one of the biggest reasons I just don’t trust LLMs in a general sense. Like sure, I can use them fairly effectively, but the vast majority of people who interact with LLMs don’t approach them with an appropriate level of caution.
And that doesn't even touch on the huge ethical (and legal) issues around how LLM devs acquire and use training data.
Dogs are way more intelligent than that. LLM tech is basically a way to quickly breed fruit flies to fly right or left when they see a particular pattern.
You do? Because I don't. There is nothing racist about the concept of master. Is a masterpiece racist? Are master tapes? Are post-graduate degrees racist? We may as well declare "work" insensitive because slaves had to work.
Don't get me wrong, there are many terms we should adjust. I just can't see how "master" is one of them.
At work we're not allowed to use master/slave (maître/esclave) anymore because of the history; this mostly comes up around deployment systems. On the upside, our Git master branch is now the shorter 'main'.
The other day I used the JetBrains AI to write some boilerplate code for me. The JetBrains AI code analyser then kicked in to tell me how poorly written the code was.
I thought I was so clever once. I taught a word filter about "th", thinking that would solve the problem, but it still got stuck on Scunthorpe. mfw.
Had to step through what it was doing. It had hit a rule that treated 'oo' the same as 'u', which, at least sound-wise, is valid for some words in some dialects: consider "book", which sounds identical to "buck" for many people. You can imagine why you'd want to catch that.
To save you the head-scratching: it had spotted a 'c', then a double 'o', then an 'n', and threw the string out as containing a known racial slur.
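For anyone curious how a filter ends up doing that, here's a minimal Python sketch of a normalise-then-substring-check filter, assuming a rule that collapses 'oo' to 'u'. The actual code and blocklist aren't in this thread, so the names are illustrative and the single blocklist entry is assembled in pieces rather than spelled out:

    # Illustrative only: the real filter's rules and blocklist are assumptions.
    # The blocklist entry is built obliquely, mirroring how it's described above.
    BLOCKLIST = {"c" + "oo" + "n"}

    def canonicalise(text: str) -> str:
        # The 'oo' ~ 'u' rule: collapse both spellings to one canonical form.
        return text.lower().replace("oo", "u")

    # Canonicalise the blocklist once, up front.
    CANONICAL_BLOCKLIST = {canonicalise(word) for word in BLOCKLIST}

    def is_flagged(candidate: str) -> bool:
        # Substring check against the canonicalised blocklist.
        canon = canonicalise(candidate)
        return any(bad in canon for bad in CANONICAL_BLOCKLIST)

    print(is_flagged("Scunthorpe"))  # True: "S-c-u-n-t..." contains the collapsed entry
    print(is_flagged("raccoon"))     # True: literally contains 'c' + 'oo' + 'n'
    print(is_flagged("carbon"))      # False

The substring match combined with the collapsing rule is exactly what turns innocent town names into collateral damage.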
The filter was for a random string generator so that it wouldn't generate strings with bad words in them. Seemed like a good idea at the time.
Since it was unlikely that it was going to generate "Scunthorpe" anyway, the problem remained unfixed.
It's incredible that this is such a big point of debate. This kind of thing is really ignoring the material reality of racism in favor of the minutiae. Let's have some 40 acres and a mule, then we can start talking about race conditions.