Press "Enter" to skip to content

Call Of Duty Videogame Will Now Use AI To Monitor Voice Chat ‘Toxicity’ And Hate Speech During Online Matches

winepressnews.com

by Jacob M. Thompson


Activision, the publisher behind the hugely popular Call of Duty video game franchise, announced on Wednesday that, in a bid to reduce “toxic and disruptive behavior with in-game voice chat moderation,” the company has partnered with tech company Modulate to deploy AI-powered, real-time voice chat moderation.

This new feature will roll out in the upcoming “Call of Duty: Modern Warfare III,” set for release on November 10th.

Call of Duty’s new voice chat moderation system utilizes ToxMod, the AI-powered voice chat moderation technology from Modulate, to identify and enforce against toxic speech in real time, including hate speech, discriminatory language, harassment, and more.

This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which include text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system…

READ FULL ARTICLE HERE… (winepressnews.com)
