I have enabled Google Perspective for posts on the community. Google Perspective checks posts for “toxic” text before allowing them to be added to the community.
We don’t currently have an issue with any kind of toxicity in posts, but I wanted this added to keep moderation of the community as automated as possible, allowing the @staff to focus on answering questions and helping members.
When you submit a post that may be toxic, you will see the following image and have a chance to change your wording to something more appropriate:
For more information:
Perspective is an API that makes it easier to host better conversations. The API uses machine learning models to score the perceived impact a comment might have on a conversation. Developers and publishers can use this score to give real-time feedback to commenters, help moderators do their job, or allow readers to more easily find relevant information. Its first model identifies whether a comment could be perceived as “toxic” to a discussion.
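For the curious, here is a rough sketch of what the request and response for a toxicity check look like. This is an assumption-laden illustration based on the public Comment Analyzer API, not the forum plugin's actual code; the endpoint URL, field names, and score shape are what Google's API publicly uses, but treat the details as illustrative.

```python
# Illustrative sketch of a Perspective (Comment Analyzer) toxicity check.
# The plugin's real implementation may differ; shapes here follow the
# publicly documented commentanalyzer.googleapis.com request/response.
ANALYZE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"

def build_analyze_request(text):
    """Build the JSON body sent to the Comment Analyzer endpoint."""
    return {
        "comment": {"text": text},
        "requestedAttributes": {"TOXICITY": {}},  # ask only for the toxicity model
    }

def toxicity_score(response):
    """Extract the summary TOXICITY probability (0.0-1.0) from a response."""
    return response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

# Example response shaped like the API's documented output (values made up).
sample_response = {
    "attributeScores": {
        "TOXICITY": {"summaryScore": {"value": 0.92, "type": "PROBABILITY"}}
    }
}
```

A moderation hook would compare the returned score against a threshold and, when it is exceeded, hold the post and show the warning above instead of publishing it.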
Let us know if you run into any issues with Perspective.