AI-Powered Voice Chat Moderation System Introduced in Call of Duty

TL;DR Summary
Activision has partnered with Modulate to bring an AI moderation system called ToxMod to Call of Duty, aiming to identify and flag hate speech, discrimination, and harassment in real time during voice chats. A beta rollout of ToxMod has begun in North America, with a worldwide release scheduled for November 10th alongside Modern Warfare III. The AI tool will not take direct action against players; instead, it will submit reports to Activision's moderation team. Human review remains central, in part to address potential biases in speech recognition systems.
- Call of Duty will use AI to moderate voice chats (The Verge)
- Call of Duty’s new AI voice chat technology detects toxic players in real time (CharlieINTEL.com)
- Call of Duty voice chat to get AI moderation to catch & punish “toxic” players (Dexerto)
- Call of Duty: Modern Warfare 3 Has an AI-Powered Voice Chat Moderation System (IGN)
Want the full story? Read the original article on The Verge.