Broken Windows Theory: Why a Toxic Reputation is Self-Fulfilling

A look at online gaming toxicity through the lens of real-world problems

[Image: Broken windows]

While bad actors exist in just about every popular online game, some titles have become infamous for their toxic communities. Ask any seasoned gamer to name the most toxic games and the same names always come up: League of Legends, Call of Duty, World of Warcraft, and Counter-Strike.

Despite developers’ continued efforts to tackle this problem, the player bases for these games have remained incredibly toxic over the years. In this whitepaper, we explore how a game’s reputation of toxicity is analogous to ‘broken windows’ in a city, and devise a framework for discussing the factors that promote toxic behaviors in the long term.

Broken Windows Theory

Originating from criminology research, the ‘broken windows theory’ states that visible signs of anti-social behavior in an environment encourage further offenses, because of the perception that crimes go unpunished and disorder is the norm [1]. In a neighborhood where a broken window is left unrepaired, the theory suggests that more windows will soon be broken, followed by other forms of vandalism. The original authors of the broken windows theory cite an experiment in which a car sat untouched on the street for over a week, but after the experimenter intentionally smashed it, passersby soon joined in and the car was destroyed within hours [2].

For online games, toxic behaviors such as griefing and flaming are equivalent to smashing up windows, and the (perceived) lack of consequences for perpetrators reinforces the image of a derelict neighborhood. The result is a negative cycle of increasing toxicity, driven by four compounding factors:

[Figure: The cycle of toxicity]

Factor 1: Perception of Disorder

[Image: Car on fire]

When new players see offenders get away with bad behaviors, they gain the impression that there are no consequences for such actions. These initial impressions matter greatly: studies have found that individuals’ long-term behaviors are heavily influenced by their first impressions of a community [3], and that the absence of a threat of punishment is the biggest contributor to online anti-social behavior [4]. In other words, if the people around them are not punished for breaking windows, they are likely to join in and break windows soon thereafter.

Moreover, the perception of how a game handles toxicity can matter more than the reality. Compared to the physical world, the digital world offers fewer cues that someone has been punished for their actions. While in real life one may overhear police sirens and learn from neighbors about an arrest, gamers form an opinion based on the incomplete information available to them: how many toxic players are still active in game? Have any friends ever been punished? What are their favorite streamers saying? For games with a toxic reputation, the natural conclusion is that the developers do not care about player behavior.

Factor 2: Self-Selection

[Image: Exit sign]

If a neighborhood deteriorates and faces increased anti-social behavior, upstanding residents may choose to move out for greener pastures. Prospective newcomers, upon seeing broken windows and vandalism, may be less inclined to settle in the area. Only those who tolerate or partake in disorderly behaviors are content to stay behind, although in real life there are barriers (e.g. financial, social, ethnographic) that prevent people from leaving even if they want to.

Rarely do such barriers exist in the virtual world. A survey by the ADL on player retention found that around one in five players who experience toxicity have stopped playing a game entirely, and more than a quarter of respondents avoided certain games due to their reputation for hostile environments [5]. Conversely, research shows that players who themselves behave in a toxic manner tend to stay on longer in such environments [6]. Female gamers are affected by online toxicity in more ways than male gamers: they are harassed or insulted for their gender on top of gender-agnostic, performance-related abuse [7], and are more likely to withdraw from gaming as a result [8].

The self-selection of players who are willing to put up with toxicity creates an echo chamber where abusive, sexist, and racist language becomes a normal part of speech. Without a diverse community to call out inappropriate behaviors, trolling, sexism, and racism are passed off as “in-jokes” or “banter”, fostering a ‘frat culture’ that drives away outsiders and exacerbates the problem.

Factor 3: Behavior Normalization

[Image: Baboons fighting]

Over time, as players regularly witness toxic encounters, behaviors that would appall outsiders become normalized in the community. Due to desensitization from repeated exposure, veteran players are significantly more likely to use aggressive language and flame others [9]. In addition, gamers playing with friends tend to reinforce or imitate each other’s deviant behaviors as a result of social learning [6].

These effects lead researchers to believe that toxicity is contagious [10][11]. Players who encounter toxicity are more likely to exhibit similar incivility in subsequent weeks, causing a chain of contamination that affects the entire game community. As the broken windows theory postulates, people who grow up around window breakers learn that breaking windows is acceptable.

This factor is subtly distinct from the first one we discussed. Under a perception of disorder, players may know that toxic behaviors are undesirable but think they can get away with them. Once these behaviors become normalized, however, players feel indifferent because toxicity is “a part of life in the world of modern multiplayer games” [12]. Left unchecked, newer generations of gamers will internalize this status quo of disorder and join the ranks of jaded veteran players.

Factor 4: Low Community Ownership

[Image: Litter on a beach]

With everyone around them being toxic, individuals can feel disinhibited and join in. The commonplace nature of toxic behaviors induces a mob mentality and a sense of anonymity, causing individuals to feel less accountable for their actions, an effect psychologists call deindividuation [13]: if everyone else is littering, why shouldn’t I?

The anarchic, “free-for-all” atmosphere following persistent disorder leaves victims and bystanders with a sense of defeat, thinking that nothing can be changed. A large majority of gamers believe that in-game reporting systems are ineffective, citing that previous reports did not result in meaningful action or that moderators are uninterested [5][14][15]. While this belief may not necessarily be true, perceptions and impressions are important for shaping player behavior, as we discussed earlier.

Furthermore, a defeatist mentality reduces the community’s initiative to self-police. One study found that only a fifth of players stand up to harassment when they see it [16], and in another, participants said that they do not get involved in toxicity between other players [17]. Social sanctions, such as speaking out against offenders or ostracizing them from player communities, are an informal alternative to moderator-enforced punishments for creating consequences for undesirable behaviors. When these community-led penalties disappear, bad actors get off scot-free, which worsens the perception of disorder and perpetuates the cycle of toxicity.

Conclusion

We showed how a disorderly environment empowers bad actors and drives away newcomers, which subsequently normalizes anti-social behaviors and disillusions residents. For an online game with no physical space, players’ perception of the game is the closest approximation to an environment, and so the first step towards tackling toxicity is to repair its reputation. In our next whitepaper, Breaking the Cycle of Toxicity, we propose solutions to curb online gaming toxicity by addressing the four factors discussed here.

While we recognize that the broken windows theory has been associated with controversial policing tactics in New York City that persisted into the past decade, the theoretical framework it provides remains useful for understanding the prevalence of uncivil behavior in communities. By drawing parallels to the real world, we hope to make it easier for you to visualize the impact of your decisions, whether in developing a game, conducting experiments, or forming policy.

‘Breaking the Cycle of Toxicity’ will be published in late 2021, so stay tuned.


References

[1] Kelling, G., & Wilson, J. (1982). Broken Windows - The police and neighborhood safety

[2] Zimbardo, P. G. (1969). The Human Choice: Individuation, Reason, and Order versus Deindividuation, Impulse, and Chaos

[3] Engel, C., Beckenkamp, M., Glöckner, A., Irlenbusch, B., Hennig-Schmidt, H., Kube, S., Kurschilgen, M., Morell, A., Nicklisch, A., Normann, H. T., & Towfigh, E. (2014). First impressions are more important than early intervention: Qualifying broken windows theory in the lab

[4] Davis, J. P. (2002). The Experience of ‘Bad’ Behavior in Online Social Spaces: A Survey of Online Users

[5] Anti-Defamation League (2020). Free to Play? Hate, Harassment and Positive Social Experience in Online Games 2020

[6] Shores, K. B., He, Y., Swanenburg, K. L., Kraut, R., & Riedl, J. (2014). The identification of deviance and its impact on retention in a multiplayer game

[7] Bergström, H., & Ericsson, N. (2020). How toxicity differ between male and female players in competitive Overwatch

[8] Fox, J., & Tang, W. Y. (2016). Women’s experiences with general and sexual harassment in online video games: Rumination, organizational responsiveness, withdrawal, and coping strategies

[9] Elliott, T. (2012). Flaming and gaming–computer-mediated-communication and toxic disinhibition.

[10] Shen, C., Sun, Q., Kim, T., Wolff, G., Ratan, R., & Williams, D. (2020). Viral vitriol: Predictors and contagion of online toxicity in World of Tanks

[11] Lin, J., & Riot Games (2013). The Science Behind Shaping Player Behavior in Online Games

[12] Blackburn, J., & Kwak, H. (2014). STFU NOOB!: predicting crowdsourced decisions on toxic behavior in online games

[13] Postmes, T., Spears, R., & Lea, M. (1998). Breaching or building social boundaries? SIDE-effects of computer-mediated communication.

[14] Cook, C., & Kowert, R. (2021). Toxic Waste Removal: Toxicity in Games and the Perceived Effectiveness of Reporting

[15] Pohjanen, A. E. (2018). Report, please! A survey on players’ perceptions towards the tools for fighting toxic behavior in competitive online multiplayer video games

[16] Cary, L. A., Axt, J., & Chasteen, A. L. (2020). The interplay of individual differences, norms, and group identification in predicting prejudiced behavior in online video game interactions

[17] Adinolf, S., & Türkay, S. (2018). Toxic Behaviors in Esports Games: Player Perceptions and Coping Strategies