Part of a suite of new tools for managing Facebook communities
Facebook is launching a suite of new tools to help group admins get a grip on their communities. Some simply offer a clearer overview of posts and members, while others are designed to help admins tackle conflict — including an AI-powered feature that Facebook says can identify “contentious or unhealthy conversations” taking place in the comments.
This tool is called Conflict Alerts, and Facebook says it’s only testing it for now, so availability is unclear. It’s similar to the existing Keywords Alerts feature, which lets admins create custom alerts for when commenters use certain words and phrases, but uses machine learning models to try to spot more subtle types of trouble. Once an admin has been alerted, they can take action by deleting comments, booting users from the group, limiting how often individuals can comment, or limiting how often comments can be made on certain posts.
How exactly the feature will detect “contentious or unhealthy conversations” isn’t clear, though, and when reached for comment, Facebook offered little additional detail. A spokesperson said only that the company would use machine learning models to look at “multiple signals such as reply time and comment volume to determine if engagement between users has or might lead to negative interactions.”
Presumably, though, Conflict Alerts uses AI systems similar to those Facebook already deploys to flag abusive speech on the site. These sorts of models are far from 100 percent reliable and are often fooled by simple things like humor, irony, and slang. However, they should be able to pick up on the more obvious cues that an argument is happening — like someone calling people “IDIOTS!” as in the sample screenshot above.
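Facebook hasn’t published any details of how these signals are combined, but a toy heuristic over the two it did name, reply time and comment volume, might look something like this (all names and thresholds here are invented for illustration, not Facebook’s actual logic):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Comment:
    author: str
    seconds_since_parent: float  # how quickly this reply followed the comment above it

def looks_contentious(comments: List[Comment],
                      min_comments: int = 20,
                      fast_reply_s: float = 30.0,
                      fast_reply_share: float = 0.5) -> bool:
    """Flag a thread when comment volume is high and most replies arrive
    within seconds of each other -- a crude stand-in for the 'reply time
    and comment volume' signals Facebook describes."""
    if len(comments) < min_comments:
        return False  # low-volume threads are never flagged
    fast = sum(1 for c in comments if c.seconds_since_parent <= fast_reply_s)
    return fast / len(comments) >= fast_reply_share

# A long thread where half or more of the replies land within 30 seconds
# would trip the alert; a slow, low-volume thread would not.
heated = [Comment("a", 10.0) for _ in range(25)]
calm = [Comment("b", 300.0) for _ in range(25)]
print(looks_contentious(heated))  # True
print(looks_contentious(calm))    # False
```

A production system would of course learn these thresholds from labeled data rather than hard-code them, which is presumably what Facebook’s machine learning models do.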
Other new tools for admins being introduced today include a new admin homepage that will function as a dashboard, offering an overview of “posts, members and reported comments” as well as access to new member summaries, which compile “each group member’s activity in the group, such as the number of times they have posted and commented, or when they’ve had posts removed or been muted in the group.”
And there’s also a new Admin Assist feature for automated comment moderation. This will let admins restrict who is allowed to post comments (letting them block recently joined users, for example) as well as curb spam and unwanted promotions by banning certain links. The Conflict Alerts feature will be part of Admin Assist.