
Google Battles Trolls Through Perspective

Updated: Jan 3, 2020

Written by: Evan Goodfellow


Fighting trolls is a real issue. Google has a new API called Perspective to help in the fight.



Fighting the Good Fight


In a recent Mashable article by Marcus Gilmer entitled "Google Rolls Out Its New Tool to Fight Disgusting Internet Trolls," the author looks at Google’s new API for dealing with internet trolls. The API, called Perspective, was developed by Jigsaw and Google’s Counter Abuse Technology team. It was built primarily for publishers who constantly have to monitor message forums rife with toxic comments from trolls. The article caught my attention because one of the main concerns new clients raise when buying market research online community software is how to control trolls on the qualitative discussion boards.


Luckily, our community software removes the anonymity that many discussion boards offer, which drastically cuts down on trolls. But where do you start when trying to program software to distinguish between positive and negative comments along a continuum? Gilmer writes that the programmers first needed to define what “toxic” means and create a scale ranging from “very healthy” to “very toxic.” For reference, the definition of “toxic” in this context was “a rude, disrespectful, or unreasonable comment that is likely to make you leave a discussion.” Jared Cohen, the president of Jigsaw, said in an interview with The Washington Post that millions of comments were collected from sites such as The New York Times, along with input from victims of online harassment, to help define and build the toxicity scale.
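To make that scale concrete, here is a minimal sketch of how a publisher might ask Perspective to score a single comment. It assumes you have been granted an API key; the endpoint and request shape follow Google’s published comments:analyze format at the time of writing, though the example key and the sample comments are my own placeholders.

```python
import requests

# Placeholder key -- you must request your own access from Google.
API_KEY = "YOUR_PERSPECTIVE_API_KEY"
ANALYZE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
    f"?key={API_KEY}"
)

def toxicity_score(comment_text: str) -> float:
    """Return Perspective's TOXICITY score (0.0 to 1.0) for one comment."""
    payload = {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(ANALYZE_URL, json=payload)
    response.raise_for_status()
    data = response.json()
    # summaryScore.value is the model's estimate of how toxic the comment is.
    return data["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

print(toxicity_score("Thanks for explaining your point of view."))  # low score
print(toxicity_score("Only an idiot would believe this."))          # high score
```

The number that comes back is what everything else is built on: a comment near 0 sits at the “very healthy” end of the scale, and one near 1 sits at the “very toxic” end.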


Gilmer writes that Jigsaw product manager CJ Adams said the purpose of Perspective was to empower publishers to “host robust debates and … help people stay online without having to read every bit of abuse hurled their way.” While some news agencies have given up and closed their comments sections altogether, Google sees the potential that debate and the exchange of ideas can have, and it sought to use technology to eliminate, or at least greatly reduce, the threat trolls pose. Perspective seeks to keep healthy debate and discussion alive.


Filter According to Your Preferred Level of Toxicity


You can go to the Perspective website and see how the tool works. Gilmer writes that the site demonstrates the tool on three highly charged topics: Brexit, climate change, and the 2016 U.S. election. “Using a slider tool, you can see how certain comments were filtered by toxicity. Slide all the way to the left and you get only the least ‘toxic’ comments. Slide to the right, and you get everything, including the most ‘toxic’ comments,” writes Gilmer.
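The slider behaviour is easy to picture in code. The sketch below is not Google’s implementation, just an illustration of the idea: once comments have been scored (for example with the function above), a threshold between 0 and 1 plays the role of the slider position. The example comments and scores are invented for illustration.

```python
# Hypothetical pre-scored comments: (text, toxicity score from Perspective).
scored_comments = [
    ("I disagree, and here is a source that says why.", 0.04),
    ("That policy will never work.", 0.18),
    ("Only an idiot would believe this.", 0.87),
]

def visible_comments(scored, max_toxicity):
    """Slider analogy: far left = a small max_toxicity, so only the mildest
    comments survive; far right = 1.0, so everything is shown."""
    return [text for text, score in scored if score <= max_toxicity]

print(visible_comments(scored_comments, 0.2))  # only the two milder comments
print(visible_comments(scored_comments, 1.0))  # all three, trolls included
```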


The article goes on to state that while Perspective is still in its infancy, sites can now request to use it. Publications that have already adopted the tool include The New York Times, The Economist, and The Guardian. Each publisher can use the API as they see fit. They can use it to flag toxic comments for a moderator to review and delete, or, as Gilmer writes, “they can even set up their forum so that each user can select their toxicity threshold.”
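Both workflows come down to comparing the same score against a cutoff. The sketch below is my own illustration, not code from any of these publishers: high-scoring comments go to a moderation queue, and each user’s chosen threshold decides what the rest of the thread looks like for them. The scores and the moderation cutoff are assumptions for the example.

```python
# Hypothetical scores, as Perspective might return them.
scored = [
    ("Here is a source that supports my point.", 0.03),
    ("Your argument is weak.", 0.35),
    ("Get lost, you moron.", 0.92),
]

MODERATION_THRESHOLD = 0.8  # publisher-chosen cutoff for human review

def triage(scored_comments, user_threshold):
    """Return (comments this user sees, comments queued for a moderator)."""
    shown, flagged = [], []
    for text, score in scored_comments:
        if score >= MODERATION_THRESHOLD:
            flagged.append(text)   # a human moderator decides whether to delete
        elif score <= user_threshold:
            shown.append(text)     # within this user's chosen tolerance
        # scores in between are simply hidden for this particular user
    return shown, flagged

shown, flagged = triage(scored, user_threshold=0.4)
print(shown)    # the mild and middling comments
print(flagged)  # the worst comment, waiting for a moderator
```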


While the API is relatively new, it does seem to show promise, not just in stopping harassment but also in encouraging healthy debate. Gilmer writes, “Jigsaw is definitely taking an optimistic approach to the API’s implementation, focusing on how it could evolve discourse rather than simply shut down harassment.” It will be interesting to see how accurate software can become at recognizing healthy debate and detecting toxic comments.


Healthy Online Research Communities


We at Insightrix Communities know that discussion forums are vital for exchanging ideas about a city, an association or a company. That is why our software has been designed to encourage healthy discussions and includes safeguards for dealing with inappropriate posts: members fill out profile surveys with personal information, verify their email addresses, and moderators have tools to easily remove someone from the community or censor unwanted comments. Our discussion threads are often cited by members and clients alike for their positive, constructive discussions. Most members who use our community software are there because they are seeking change, whether in their city, their association, or with a brand they love, which helps foster positive, constructive comments.


If you would like to find out more about how our software can help you, please contact us.


Check us out on LinkedIn!
