Activision is turning to AI to make your Call of Duty matches a bit less toxic
A new voice moderation tool, powered by AI, has been added to Call of Duty to protect players from “toxic or disruptive behaviour they may encounter.” This tool is said to help increase Activision’s ability to “identify and enforce against bad behaviour that has gone unreported.”
The initial rollout of the voice chat moderation tool begins today in North America for Call of Duty: Modern Warfare 2 and Call of Duty: Warzone. It’ll then roll out globally (excluding Asia) on November 10, when Call of Duty: Modern Warfare 3 arrives.
How exactly does this tool work, and what behaviour will it be looking out for? In a comprehensive FAQ shared on Activision’s website, the developer explains that voice chat moderation is “managed and operated by Activision and uses the AI-powered model ToxMod from Modulate.” The system will focus on “detecting harm within voice chat versus specific keywords”, and players who violate Call of Duty’s Code of Conduct will be punished for their behaviour.