
Yet another online game is taking extra measures to combat the rising tide of toxicity that (unfortunately) goes hand-in-hand with gaming. Call of Duty will begin using AI-powered voice moderation software to detect when someone is being abusive.
The tool in question is called ToxMod, billed as “the only proactive voice chat moderation solution purpose-built for games” and designed to “identify in real-time and enforce against toxic speech – including hate speech, discriminatory language, harassment and more”. All those times you wondered if someone else was listening in? Well, your paranoia is now well-founded, because someone actually is listening to you.
What could possibly go wrong, we ask while side-eyeing Instagram and its incredibly questionable moderation rules, which often target the wrong users. As much as we’d like to think all will go well, moderation comes with plenty of pitfalls, especially when people are asked to use their best judgement.
If you’re feeling a little dejected, maybe because you know this software is going to catch you out, just remember that you can watch Nicki Minaj step on enemies to your heart’s content. There, don’t you feel all better now?
The new moderation tool will roll out in beta across North America from 31 August. As for when it’ll reach all players, Activision says it will update its player base in due course. Just behave, play nicely and you shouldn’t run into any issues.
Topics: Call Of Duty, Call Of Duty Modern Warfare, Call of Duty: Modern Warfare II, Activision, PlayStation, Xbox, PC