Valorant Tackles the Industry's Most Toxic Trend
Why isn’t keeping players safe a top priority everywhere?
Toxicity in online video games is a massive problem that isn’t going away. That’s especially apparent in competitive games like Valorant, where players face the threat of abusive behavior from teammates and enemies alike. Too many players treat the high-stakes feel of matches as an excuse to lash out. To coincide with the start of Season 5, Valorant developer Riot Games shared an update on its most recent moderation efforts and what players can expect in the future.
In a new developer update video, Riot Games runs through a lot of information on Season 5, from gameplay changes to insights on competitive play. Although the video spends relatively little time on player behavior, that section provides some interesting details on how the developer is fighting back against online toxicity.
Last year, Valorant launched updates to how it handles disruptive behavior. Allegations of abusive chat are evaluated manually, with players receiving warnings for breaking the rules and bans for repeating the same behavior. According to Stephen Kraman, head of Valorant’s competitive systems team, 500,000 warnings or bans were handed out last year, and the warnings alone cut repeat disruptive communications by 25 percent.
Kraman says that Riot is continuing to iterate on its new moderation efforts, hinting that the developer may reveal an update to its manual evaluation system soon. In particular, he says the team is working on improving how information about bans and warnings is communicated to players, and how players can give feedback on the systems in use. Valorant’s system for automatically evaluating voice communications is now in place in every region except South Korea, the last country where it has yet to be rolled out.
Finally, remaking a match when a player drops is getting easier. A vote to reset and start the match with full teams will now begin automatically when one player drops, and only three votes will be needed to confirm the remake. That should make it a little easier to avoid being stuck in a losing match, or being penalized for leaving, once a dropped player has already left the teams unbalanced.
Riot’s Valorant update doesn’t reveal any groundbreaking changes to how it moderates toxicity, but it’s heartening to see the developer shining a light on the more troubling parts of online gaming. Abusive behavior from players is present in every online game to some extent, yet it’s an element of the multiplayer world that often goes under-discussed. No company wants to be associated with the worst parts of its fanbase, after all, and there may be some fear that talking about moderation publicly could invite criticism that the developer is doing too much or not enough. But taking a public stance that abusive behavior won’t be tolerated, and celebrating when moderation policies get repeat offenders banned, sends the signal that toxic players aren’t welcome. It also helps fight the perception that developers aren’t doing enough for their communities.
Riot has been increasingly public with its anti-toxicity efforts since an incident last year when a streamer shared a clip of some extreme misogynistic abuse she’d received in chat. Since then, executive producer Anna Donlon released a lengthy statement condemning toxic players and pledging to do more to keep the community safe. That was followed by improved and more visible moderation efforts, including using hardware bans to prevent disruptive players from creating new accounts to evade suspensions and bans.
The kind of behavior that Riot’s policies are designed to crack down on is never going to disappear entirely. As much as anyone might want it, there’s no way to completely insulate a community from people who want to destroy it. Because of that, it’s important not only to take a hard stance against toxicity, but also to promote that stance publicly and be clear about your intentions. A note in a developer update doesn’t solve the problem on its own, but if more developers made a point of giving moderation successes as much attention as any other aspect of their games, it could go a long way toward shifting attitudes around what’s acceptable in online games.