AutoMod is a machine learning system built to combat Twitch trolls
With a large audience comes an increase in racist and sexist abuse during streams. In some cases the vitriol is spewed at such a rate that human moderators can’t stem the tide. For those streamers, Twitch is offering a new tool called AutoMod, which tries to detect inappropriate messages and preemptively hold them in a moderation queue for approval or dismissal.
The AutoMod system has four levels of aggression and will crack down on four different categories: identity, sexual language, aggressive speech and profanity. While trolls will try to circumvent AutoMod’s filtering, Twitch says the tool goes beyond banned words and will “employ machine learning and natural language processing to identify and block inappropriate content from appearing in chat. Beyond identifying inappropriate words and phrases, AutoMod can detect potentially inappropriate strings of emotes and other characters or symbols that others could use to evade filtering.”
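The description above suggests a level-based filter: each category is blocked starting at a certain aggressiveness level, with an extra heuristic for runs of symbols or emotes used to dodge word lists. Here is a minimal sketch of that idea; the word lists, level thresholds, and symbol-run heuristic are illustrative placeholders, not Twitch's actual AutoMod implementation (which reportedly uses machine learning rather than simple pattern matching):

```python
import re

# Hypothetical mapping: each category is held for review at this
# aggressiveness level and above (1 = least aggressive, 4 = most).
CATEGORY_LEVELS = {
    "identity": 1,
    "sexual": 2,
    "aggressive": 3,
    "profanity": 4,
}

# Placeholder patterns standing in for real per-category term lists.
CATEGORY_PATTERNS = {
    "identity": re.compile(r"\b(identityslur)\b", re.I),
    "sexual": re.compile(r"\b(lewdterm)\b", re.I),
    "aggressive": re.compile(r"\b(threatterm)\b", re.I),
    "profanity": re.compile(r"\b(curseword)\b", re.I),
}

# Crude stand-in for detecting "strings of emotes and other characters
# or symbols" used to evade word filtering: a long run of punctuation.
SYMBOL_RUN = re.compile(r"[^\w\s]{5,}")

def needs_review(message: str, level: int) -> bool:
    """Return True if the message should be held for moderator approval."""
    if level >= 1 and SYMBOL_RUN.search(message):
        return True
    for category, min_level in CATEGORY_LEVELS.items():
        if level >= min_level and CATEGORY_PATTERNS[category].search(message):
            return True
    return False
```

At level 1 only the most severe category is held; raising the level to 4 holds all four, which matches the tiered "levels of aggression" the article describes.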
AutoMod is available in English, with beta versions in Arabic, Czech, French, German, Italian, Japanese, Korean, Polish, Portuguese, Russian, Spanish and Turkish.
Streamers can turn on the feature in their personal settings page.
Source: Ars Technica