Detox - The First Line of Defense

Our Features

 

Deep Learning

Advanced neural networks understand the semantic difference between “I’m dumb” and “You’re dumb”.


Spell Check

Spelling correction ensures prediction accuracy is not hindered by typos made in the heat of battle.


Loss Detection

Proactive protection kicks in during dire situations so you can calm things down before tempers flare.

 
 

New Player Protection

Guard newbies against trolls (the bad kind!) to give them space to make mistakes and learn from them.


Adjustable Filters

Tailor filter strictness to your target audience, or let your users decide based on their personal preferences.

 

Case Study: World of Warcraft (WoW)

World of Warcraft is one of the most popular MMORPGs of all time, attracting gamers from all walks of life. Many of the game’s objectives are achieved by working with other players, requiring coordinated groups to take down dungeon bosses, tackle raid encounters, and defeat other groups in player-versus-player combat. However, despite efforts by the game developers to moderate anti-social behaviors, the WoW community has earned a reputation for being toxic and unwelcoming to newer players, causing many to quit the game.

A number of factors explain why toxicity is endemic in WoW. In many instances, groups are formed from complete strangers by the in-game matchmaking systems, or by random players banding together for a one-off encounter. Given the huge player base that WoW enjoys, it is highly unlikely that players will ever cross paths again once their interaction is over. As a result, every other player is a faceless character in a near-infinite sea of players. Like tourists littering in a foreign country, gamers face few social consequences for their actions and have no stake in their community.

“every other player is a faceless character in a near-infinite sea of players”

The age of the game also serves as a barrier to entry for newer players. Veterans who have played WoW across its 16-year lifespan adhere closely to an unspoken ‘meta’ and get frustrated when other players deviate from it, such as by not taking the most efficient path through a dungeon. These veterans have completed the activity dozens, if not hundreds, of times before, so they have little patience for players who are still learning the ropes and “wasting” their time.

Detox was created as a third-party add-on* to alleviate these problems. By automatically filtering toxic messages, Detox blocks out the negativity and makes playing with others a less stressful experience. Players using Detox no longer have to feel anxious about getting flamed over the tiniest of mistakes, and can fully enjoy the game instead.

* DetoxAI is neither affiliated with nor endorsed by Activision Blizzard.

 
In-game screenshot 1: Detox notifies the player of blocked toxic messages in chat, giving them the choice to view or ignore them. For the purposes of demonstration, all of the toxic messages have been revealed in these screenshots.

In-game screenshot 2: Angry player lashing out at less experienced gamers after a failed raid encounter.

In-game screenshot 3: Frustration from failing to complete a dungeon leads to a toxic player giving up and insulting the group leader.

FAQs

 

What is toxicity?

Toxicity in the online context refers to behaviors that adversely affect other individuals. Insults, harassment, excessive use of profanities, and severely negative remarks are examples of toxic communication that can worsen the mood of recipients, leading to a poor experience with your game. What is considered toxic is inherently subjective and frequently depends on the situation, so we provide the tools to tailor the strictness of Detox to your player demographics.

How does Detox recognize toxicity?

At the core of the Detox Chat Understanding Engine (CUE) is a deep neural network powered by machine learning and natural language processing. The baseline CUE network has been trained on millions of messages and data points to pick out toxic messages, so it can be deployed out of the box with reliable accuracy.

For Enterprise clients, we provide custom model calibration to tailor the network to your game community’s communication style and game-specific jargon.

How do we integrate with Detox?

Our devkit contains a software library/package that can be imported, loaded, or referenced in your game code, depending on the language you use (e.g. JavaScript, C#, or Lua). Because the functionality is embedded in your game client, all computation is done directly on your players' hardware, so you never have to worry about infrastructure, service uptime, or privacy issues.
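
To make that concrete, here is a minimal TypeScript sketch of what the integration could look like inside a game client. The package name, the createDetox factory, its option fields, and the isToxic method are illustrative assumptions rather than the actual Detox devkit API; the real names live in the devkit documentation.

    // Illustrative sketch only: the package name, options, and method below are
    // assumptions, not the documented Detox devkit API.
    import { createDetox } from "detox-chat-filter"; // hypothetical package name

    // Configure the filter once, when the game client starts up.
    const detox = createDetox({
      strictness: "medium",       // assumed option: how aggressively messages are flagged
      newPlayerProtection: true,  // assumed option: stricter filtering around new players
      lossDetection: true,        // assumed option: stricter filtering after a loss
    });

    // Run every incoming chat message through the filter before rendering it.
    function onChatMessage(sender: string, text: string): void {
      if (detox.isToxic(text)) {
        showBlockedNotice(sender);        // your game's UI: "a message was hidden"
      } else {
        renderChatMessage(sender, text);  // normal chat rendering path
      }
    }

    // Placeholder hooks standing in for your game's own chat UI.
    declare function showBlockedNotice(sender: string): void;
    declare function renderChatMessage(sender: string, text: string): void;

Because everything runs inside the client, the check above is an ordinary local function call; no network round trip or external service is involved.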

What does the Detox library do?

The library takes a chat message as input and outputs a yes/no value indicating whether the message is toxic. It determines this based on the message content as well as the rules you have configured, such as filter strictness, new player protection, and loss detection. You can then decide whether the message should be blocked, hidden from view, and/or escalated internally to your moderation team.
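
As an example of how that yes/no answer might be acted on, the sketch below layers a simple policy on top of it: allow clean messages, hide ordinary toxic ones, and escalate repeat offenders to the moderation team. The detox object and its isToxic method continue the hypothetical sketch above, and the escalation rule is only one possible policy.

    // Illustrative policy only; `detox` and `isToxic` are the same hypothetical
    // names used in the integration sketch above.
    declare const detox: { isToxic(message: string): boolean };

    type ModerationAction = "allow" | "hide" | "escalate";

    function moderate(message: string, senderIsRepeatOffender: boolean): ModerationAction {
      // The library only answers "toxic or not"; what happens next is your policy.
      if (!detox.isToxic(message)) {
        return "allow";
      }
      // Hide ordinary toxic messages; flag repeat offenders for human review.
      return senderIsRepeatOffender ? "escalate" : "hide";
    }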

 

Don’t just take our word for it.
Try it for yourself.