I don’t like toxic VC, but devs have proven time and time again that they don’t know what “toxic” is.
If this just banned people screaming slurs and shit over the mic, ok, sure. But what happens when it starts booting people for harmless shit-talk and banter with friends?
Also this really calls into question what exactly Activision is going to determine as “harmful language”…
I also have the feeling that tone and noise level will play a role, and people will start throwing slurs gently and that'll be enough, while loud but innocent victory celebrations will end in sudden bans.
And of course, as the article states: not everyone is a native English speaker, and not everyone has the clearest voice or mic. Many false positives incoming, indeed.
On Wednesday, Activision announced that it will be introducing real-time AI-powered voice chat moderation in the upcoming November 10 release of Call of Duty: Modern Warfare III.
The company is partnering with Modulate to implement this feature, using technology called ToxMod to identify and take action against hate speech, bullying, harassment, and discrimination.
While the industry-wide challenge of toxic online behavior isn't unique to Call of Duty, Activision says the scale of the problem has been heightened due to the franchise's massive player base.
ToxMod is an AI-powered voice moderation system designed to identify and act against what Activision calls "harmful language" that violates the game's code of conduct.
On its surface, real-time voice moderation seems like a notable advancement to combat disruptive in-game behavior—especially since privacy concerns that might typically come with such a system are less prominent in a video game.
The full rollout of the moderation technology, excluding Asia, is planned to coincide with the launch of Modern Warfare III, beginning in English, with additional languages added over time.