Blizzard Entertainment President J. Allen Brack addressed the community earlier this week (November 2) in a Fireside Chat that announced BlizzConline and explained their ongoing commitment to moderating problematic players.
The chat went into detail about how the company has utilised machine learning advancements to reduce toxic behaviour in their most popular online games. Both Overwatch and Heroes Of The Storm have been updated to include this intelligent moderation system, and the results seem to speak for themselves.
Using this new system, players can expect a far more responsive reporting process, with individual reports reviewed much more quickly and appropriate penalties applied automatically. According to Brack, this has already produced a marked decrease in toxic behaviour (including in text chat and reoffending) across these two games.
The system is also set to be introduced into World Of Warcraft. Brack said that the amount of time disruptive players stick around has already been halved, which could have significant implications for the player base. As the company extends its focus to creating more inclusive atmospheres, it will continue to adapt the system to new threats and solutions.
Overwatch has also seen an increase in penalty severity, while the WoW version is being tweaked for an even higher rate of responsiveness. A text-chat filter offering three different levels of profanity moderation has also been implemented, which should, in theory, make the games more accessible to everyone.
During his presentation, Brack explained that “combatting offensive behaviour and encouraging inclusivity in all of our games and workplaces will always be an ongoing effort for us”. He referred to the changes as “small steps” and indicated that more work will be done to ensure the gaming environments Blizzard offer remain safe and open.