The 'War on Toxicity' Will Fail

We build games to bring players together, but what happens when they don't get along? Zero tolerance bans, AI-assisted moderation, and even player-run tribunals all continue to fail. But why?

On a fall afternoon in Plymouth, England, a 46-year-old unemployed father of three named Mark Bradford was letting off steam playing Call of Duty when his character was killed. The killer? A 13-year-old boy and “loose friend” who happened to live nearby. As teenage boys tend to do, he jeered and mocked Bradford over voice chat again and again, causing him to log off in anger. But when Bradford disconnected, he didn’t just rage quit - he drove over to the boy’s house and strangled him. The boy’s mother had to intervene, tearing Bradford off her son and leaving the boy with “scratches and reddening to the neck.” Asked later what caused him to lose his cool, Bradford remarked: “I'd been playing the whole day and he was baiting me and baiting me and just would not shut up. He went on and on and I just lost it. I hold my hands up, I lost the plot. In a moment of madness I went round to his house. I didn't know what I was going to do”.

Any gamer who’s played competitive online multiplayer games can relate to the teenage boy (or even Bradford). When I was a kid my younger brother and I would play Mortal Kombat against each other, and I learned that if I abused Raiden’s Torpedo ability a little too much my brother would explode and Torpedo me IRL. Today, my brother is (like Bradford) a father of three, and by all accounts the kind of father you’d hope your own child, or sibling, would grow up to be. His behavior as a kid was nothing unique or abnormal, but today it would be called something gamers and the industry like to call “toxic.”

Toxicity is “dangerous,” says Wired magazine, which defines it as including “sexual harassment, hate speech, threats of violence, doxing (publicizing others’ private information), spamming, flaming (strong emotional statements meant to elicit negative reactions), griefing (using the game in unintended ways to harass others), and intentionally inhibiting the performance of one’s own team,” and points out that the perpetrators tend to be “younger, male, and high in emotional reactivity and impulsivity.” Who knew?

Toxicity in games has been a prime focus of the games industry for the past decade, spurring over 180 game studios and publishers, including Blizzard Entertainment, Riot Games, EA, and Ubisoft, to band together in a coalition called the Fair Play Alliance, whose vision is a world “where games are free of harassment, discrimination, and abuse, and where players can express themselves through play.”

The Fair Play Alliance was co-founded by Kimberly Voll, who holds a PhD in computer science and a degree in cognitive science. Before founding the FPA, she worked at Riot Games for over five years on the “player behavior team.” Riot has been on the frontlines of the “War on Toxicity” and is regarded as the origin of the term (at least in an official, professional sense). But the term has been controversial for its implication that the player intends to “be toxic,” when bad player behavior is, at its root, a lot more complicated. Indeed, even the FAQ on the FPA’s website backs away from the term - you won’t find the word “toxic” anywhere on the Fair Play Alliance’s website.

The FPA suggests game developers adopt a “Disruption and Harms in Online Gaming” framework. The framework, which runs 48 pages in its PDF format, suggests developers adopt penalty and reporting systems like the ones you see in so many online games today, but admits that “an increasing attempt to moderate and control behavior is not a sustainable path.” Yet the industry persists on that front, adding ever more systems of control and moderation in an effort to realize the FPA’s goal of a world “free of harassment.”

But as a game systems designer of over 13 years, and a competitive online multiplayer gamer for at least twice that long, I can tell you the FPA is right: the war is unwinnable. It’s my belief that “toxicity,” or disruptive behavior, is misunderstood - it isn’t just a player problem but a developer problem.

To Be Toxic Is to Be Human

I have a confession to make: I’ve been toxic in games before.

As a teen playing WarCraft III I used to “shit talk” the other team, especially when they used “cheesy” tactics like tower rushing me. In StarCraft, I would sometimes drag out the game by hiding some of my buildings, forcing the other player to explore every pixel of the map to find where my floating command center was hiding. Sometimes it was because I felt the match “wasn’t fair,” but other times I just did it for fun. Looking back, I’m not really proud of it, but I also don’t think I’m a sociopath or a “toxic” person. I’m just, I don’t know, normal?

The dirty truth is that the data (which you’ll only see from the inside) shows that the vast majority of players are flagged as “toxic” at least once by another player, and a majority have been flagged multiple times. So branding someone as “toxic” is a bit dehumanizing, and it isn’t helpful when thinking about disruptive behavior.

When you dig into the reasons why someone may be a dick online, it turns out to be more about them than about you. Imagine someone who makes sandwiches at Subway from 9 to 5, comes home from a stressful week of being yelled at for putting mayo in someone’s footlong when they said no mayo, and tries to let off some steam in PUBG. Maybe the teammate they were matched with isn’t on their A-game tonight, and that ticks them off, so they say some mean things or engage in friendly fire. While that doesn’t excuse the behavior, the point is to understand that these situations happen, and they’re normal.

It’s only recently that the industry has been “clutching its pearls” (as the kids say) over problematic player behavior. As recently as 2002, Microsoft poked fun at the competitive culture. Like many things now, what was normal just 18 years ago is cringe today. Social norms - what society views as acceptable conduct - are always shifting, and often we can’t come to an agreement over what those norms should be. In fact, developers (or behavioral scientists) and players oftentimes can’t agree on what is “toxic” and what isn’t.

Toxicity Comes from Design

The Price for Matchmaking

In 2002, Bungie was riding high on its hit game Halo: Combat Evolved, which had earned the coveted Game of the Year title and universal acclaim. Not one to rest on its success, Bungie looked to push the game further in Halo 2. Although Halo was already massively popular at LAN parties, Bungie sought to streamline the experience for the new Xbox Live service by introducing what it called “matchmaking.” Matchmaking broke from the traditional approach of a list of servers or “lobbies,” where players chose from different games managed by a creator; instead, the game would put players together in a match based on their geographic location and skill level. This proved a massive success - so much so that matchmaking is ubiquitous today as the de facto method for online multiplayer games.
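Stripped to its essentials, the idea is simple: group waiting players by region, then pair the ones closest in skill. The sketch below is a hypothetical greedy pairing, not Bungie’s actual algorithm; the function and field names are my own.

```python
from collections import defaultdict

def match_players(queue, max_gap=100):
    """Greedily pair queued players by region and skill proximity.

    queue: list of dicts like {"name": ..., "region": ..., "rating": ...}
    max_gap: largest allowed rating difference within a match.
    """
    by_region = defaultdict(list)
    for player in queue:
        by_region[player["region"]].append(player)

    matches = []
    for players in by_region.values():
        # Sort by rating so adjacent players are the closest in skill.
        players.sort(key=lambda p: p["rating"])
        i = 0
        while i + 1 < len(players):
            a, b = players[i], players[i + 1]
            if b["rating"] - a["rating"] <= max_gap:
                matches.append((a["name"], b["name"]))
                i += 2  # both players consumed by this match
            else:
                i += 1  # no close opponent; this player keeps waiting
    return matches
```

A real matchmaker typically widens `max_gap` as queue time grows, which is exactly the trade-off players end up distrusting: fast queues mean looser skill matches.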

But little did Bungie know that it would pay a price for that convenience and streamlined experience. On forums and newsgroups, players grew skeptical of the integrity of the matchmaking system. Players believed they were being unfairly placed in games alongside players less skilled than themselves, and worse, some players began what was coming to be known as “trolling”: sending “flaming” messages to make sure others knew they sucked, or that their mom was fat. The age of toxicity had begun.

But the trolling and flaming didn’t keep developers from pushing forward with matchmaking. Just eight years later Blizzard released StarCraft 2, and to the dismay of players it had no chatrooms or lobbies - just a new feature: “Quick Play,” or matchmaking. And as with Halo 2, StarCraft 2 players would question the integrity of the matchmaking.

The Elephant in the Room

Today, most game studios employ dedicated matchmaking designers and engineers. And although much progress has been made to improve the matchmaker by very well-meaning and smart people (including one of my favorites, Josh Menke), it is still distrusted and often the centerpiece of player angst. But the matchmaker can’t take all the blame, because much of that angst is amplified by something else seemingly innocent: ranked player rewards.

Part of the systems design of competitive games is to create a “progression path” that keeps the player engaged. Although PvP games usually attract fewer total players than PvE games, their engagement (how long players stick with the game over its lifetime) is much higher. Part of the alchemy of crafting engagement is a good progression path. The idea is to create a “story” for the player: begin in bronze league, get better at the game, progress to silver, then eventually to gold, platinum, and so on. This ranked design is usually considered part of matchmaking, since matchmaking also assumes a bell-curve distribution of player skill (e.g., Elo, which comes from chess), but what’s unique about online games and the bell curve is that players are rewarded for their skill. The designer manifests a power hierarchy (the ranked progression) and artificial scarcity of rewards (gatekept behind ranks).
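The Elo system mentioned above is simple at its core: each player has a rating, the gap between two ratings implies an expected score, and the winner takes points from the loser in proportion to how surprising the result was. A minimal sketch (the `k=32` factor is one common choice, not a universal constant):

```python
def expected_score(r_a, r_b):
    # Probability that player A beats player B under the Elo model.
    return 1 / (1 + 10 ** ((r_b - r_a) / 400))

def elo_update(r_a, r_b, score_a, k=32):
    # score_a is 1.0 for a win, 0.5 for a draw, 0.0 for a loss.
    return r_a + k * (score_a - expected_score(r_a, r_b))
```

Two equally rated players each have an expected score of 0.5, so a win moves the winner up by k * 0.5 = 16 points, while upsetting a much higher-rated player pays out more. Zero-sum transfers like this keep the population pinned to a fixed distribution: by construction, not everyone can climb.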

Although it’s traditional to reward higher skill, as in sports or the Olympics, athletes have years of education, training, and coaching in sportsmanship. Joe, who plays PUBG after work? Probably less so.

In competitive games, players usually opt to play in groups rather than solo (when solo play is even offered at all). Solo play, like 1v1 ranked in StarCraft, is notoriously hardcore; players can quickly feel their inadequacy and drop out. But in group matches, responsibility is less clear. Dunning-Kruger is in full effect, and most players don’t blame themselves but instead blame the matchmaker for pairing them with (in their view) lesser-skilled players, or worse, blame the other players directly for their loss.

It is my belief that most online toxicity today comes from the designer gating extrinsic rewards behind a rank the player is desperately trying to achieve, and in most cases never will. The truth is that the vast majority of players will never obtain the highest rank in a competitive game, and their position on the distribution curve is borderline genetic.
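The “vast majority will never get there” claim is baked into the math. If ratings sit on a roughly normal curve, the share of players above any high cutoff is fixed by the distribution itself, no matter how hard everyone grinds. A quick back-of-the-envelope check (the two-standard-deviation cutoff is an illustrative choice, not any game’s real threshold):

```python
from math import erf, sqrt

def fraction_above(z):
    # Share of a standard-normal population above z standard deviations.
    return 0.5 * (1 - erf(z / sqrt(2)))

# A top rank set at mean + 2 sigma admits only ~2.3% of players;
# mean + 3 sigma admits roughly 0.1%.
top_rank_share = fraction_above(2)
```

Wherever the reward thresholds are tuned, zero-sum ratings guarantee that most players sit below them - the scarcity is a design decision.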

So now imagine Joe, who comes home stressed out to play Rocket League the night before the season ends. He’s 50 points away from Gold, but his teammate (courtesy of the matchmaker) misses the last goal in the final seconds of the match. What are the expected outcomes in this situation? If the player demonstrates toxic behavior after the match, is it just the player’s fault, or also the fault of the designer who created the situation? It’s interesting that it’s in vogue to blame Facebook and “social media” for their effect on social capital, but game designers get a pass.

Theorycrafting Solutions

A few years ago I wrote a tweetstorm on toxicity, MMR, and ranked rewards. Although it had support, there was also skepticism about solutions. Is the solution really to remove the matchmaker and/or remove ranked rewards? The formula has been massively successful and lucrative for online games, and it’s unlikely that developers will give up their cash cow no matter how much they dislike toxicity.

Player Curated Communities

Although skill-based matchmaking has made finding a game fast and convenient, it has crowded out all other ways to play. In an era of Roblox, the “Metaverse,” and the “democratization of gaming,” giving players options and variety is increasingly important. Not all players want to play “skill-based ranked,” and not all players in ranked have the same values or goals. Some want to let off steam and “chill,” some want to play to win, and some are just trying to figure out what the hype is about. In the “dedicated server” era, players could curate and craft different experiences in their own servers. No fancy AI moderation needed. It turns out players like having others to play with and being part of a community, and they will maintain and police their own communities.

Important sidenote: when you do add other game modes, make sure they have some sort of progression (it doesn’t have to reward skill); otherwise there’s still only one “real” way to play, since players see the mode that grants rewards as the one that counts.

Pro-Social Design

Take a cue from MMOs like World of Warcraft, where communities are built within guilds. A community of players creates a culture of trust, reducing the chance of disruptive behavior among guildmates. And when disruptions do happen, players self-regulate the community. In my WoW guild, I would often mediate disagreements and conflicts among guild members. That experience is very unlikely to happen in a random matchmade party.

Games that reward individuals over teams can become anti-social; instead, focus on rewarding groups. Rather than treating players as disposable (the next best team is only a queue away), create interdependence among players: to succeed, we need to work together.

Player Education

Rather than “actioning” an account the first time a player breaks the rules, educate them about what the rules are and how they broke them. Tell them that acting out in a competitive game is normal, and have them self-reflect on what really caused them to break the rules. A lot of the time players don’t even know why they were banned in the first place, and if the system is too sensitive it can create false positives. By allowing players to relive their toxic moment, they can begin to learn what’s right and what’s wrong.

On the other side of the coin, educate players on how to deal with bad-mannered players. As mentioned earlier in this post, individuals who bully or inflict pain on others are usually feeling pain themselves. Rather than thinking of yourself as a victim, think of yourself as an audience to a troubled person. Or think of it in the Stoic sense: you can’t control what others say or do to you, only your reaction to it.

Mute, Don’t Perma Ban

Permanently banning players is rarely effective, because oftentimes the most “toxic” players are also the most engaged players. They will simply make a new account (a “smurf” account), and the game will have to re-seed the player’s social profile all over again, giving them fresh victims to prey upon.

Don’t Reward Skill (Too Much)

What is the mission of your company and your game? Is it to bring players together and foster community, or to figure out who has the highest skill and celebrate them? Maybe it’s okay to give the most skilled players just a rank or an icon, and distribute the skins, titles, and other fancy stuff for the other things you value.

Remember that we live in a world where people increasingly feel irrelevant (automation, the wealth divide) - why make them feel that way in your game as well?

Final Thoughts

It sometimes is lost on us as designers that we’re creating games for real people - for humans. And humans are flawed; if we weren’t, “human nature” would be called “human science.” In The Happiness Hypothesis, psychology researcher and professor Jonathan Haidt coined the idea of the “elephant and the rider”:

“The image I came up with for myself, as I marveled at my weakness [of willpower], was that I was a rider on the back of an elephant.

I’m holding the reins in my hands, and by pulling one way or the other I can tell the elephant to turn, to stop, or to go. I can direct things, but only when the elephant doesn’t have desires of his own. When the elephant really wants to do something, I’m no match for him.”

Mark Bradford, like many players, regretted his behavior and struggled to make sense of his own actions. We’re social and emotional beings, and as we learn about the self it’s clear we have less agency over our actions than we think - especially when we’re immersed in fast-paced combat in an online game. In situations like that, the elephant is running the show. As game designers, we should see players more as the elephant (emotional) and less as the rider (rational).

⚡ Bonus Hot Takes!

  • We’re going to have a Clubhouse Chat on this topic this Wed @ 6PM PST - Visit for more details!

  • I wanted to review in detail all of the different ways studios have been combating toxicity, but the piece is too long as it is… so here’s my quick take on each:

    • Muting - Allowing players to mute others is good and works.

    • Team muting - Muting players based on player reports is good.

    • Auto-muting - Muting players based on AI or keyword detection can work, depending on the robustness of the system.

    • No text/voice chat - Some games turn off all text chat. While this can work, players notoriously spam emotes (like “Well Met!” in Hearthstone or “What a Save!” in Rocket League) to troll anyway. In my view, turning off player communication is dehumanizing. At that point, why have players play against each other at all?

    • Auto-banning - Banning players automatically in a match (or kicking them out of a session) based on things they said (text or voice) is not good - really not good if your machine learning isn’t robust enough, like this example of a player being booted out of an Apex match for using the Japanese word “nigero” (which means “run away”).

    • “Karma” Incentives - Giving extrinsic rewards to players (like skins, titles etc.) for good standing in the game (or by a social score given by other players) sounds good in theory, but can be gamed by players and can drive collectors who really want that reward crazy.

    • Tribunals - Riot tried it, and Riot said it didn’t really work. Valve is sort of trying it now with Dota 2 (“Overwatch”), and anecdotally I hear it isn’t really working either.

    • Constantly talking about Toxicity - Counterproductive. It just makes your game and brand look more toxic than it probably really is. Celebrate good behavior rather than platforming bad (this is why sportsmanship in esports is important).

    • Forcing toxic players to play with other toxic players - Though funny, doesn’t work. You want the player to be encouraged to learn and grow out of their behavior, not wallow in it.

    • Bully Hunting - Yea, no.

  • I often am asked about why WoW PvP doesn’t have solo queue. While it’s not my decision to add it or not add it, I tend to be skeptical solo queue would be healthy for the game.

  • There are many studies that show that anonymity amplifies toxic behavior, but as Facebook toxicity demonstrates, it’s not a hard and fast rule.