Facebook just gave group administrators access to more features, even as some admins abuse the platform and promote moderation evasion
Facebook unveiled these new features while refusing to address problems plaguing its groups, including that over 1,000 are dedicated to misinformation
Written by Kayla Gogarty
Facebook unveiled new features today for Facebook Groups that empower administrators to encourage anonymous posting and to create subgroups. The announcement entirely ignores the problems plaguing its groups, such as “harmful topic communities” that are at risk for violence and administrators who use tactics to evade content moderation.
On November 4, Facebook announced new tools for Facebook Groups at its Facebook Communities Summit and in an accompanying blog post. These features allow moderators to create a “customizable greeting message” for new members “to automatically receive,” and allow members to customize post formats and give other members community awards. Notably, the announcement revealed additional tools for group administrators that will let them unlock many features previously reserved for specific group types; Facebook is also letting them create subgroups, host community chats and recurring events, and use preset “feature sets,” including one with anonymous posting.
By giving administrators more power and access to more features, Facebook is ignoring the significant problems with its groups, particularly that “harmful topic communities” and their administrators already abuse the features they have.
Citation From the November 4, 2021, edition of the Facebook Communities Summit
TOM ALISON (HEAD OF FACEBOOK APP): Now we also know that communities engage in their own different ways. Previously, admins had to select a group type to unlock special features for their groups. So if you were to select, let’s say, the parenting group type, you’d be able to unlock features like mentorship from other members or give special badges. But as you can imagine not every group can be limited to just one type. For example, you could be a parenting group, but also be a [UNINTELLIGIBLE] group within your local community. So moving forward, I’m excited that we’re going to give admins the ability to select whatever features they want. So, if you want anonymous posting in your group, you can have it. If you want mentorship and sales listings, you can have it too. And if you might not know exactly what you want feature-wise for your group and are looking for guidance, you can soon use feature sets, a new tool in admin home that’s going to offer pre-set post formats, badges, and admin tools to help you get started so that you can curate the best experience for your members.
…
We’re also thinking about how to deepen the connections you’ve made within these groups. And what it comes down to is that we want to make it easier for groups to have different types of discussions that are most relevant to them. So right now groups have one main feed where all discussions live and it can be hard to find content you want to engage in when there are so many different topics covered in one place. So I’m excited that soon admins will be able to create subgroups within their groups that can be more focused spaces organized around a theme or occasion that admins can manage from one place. And when people want to connect in real time, they’ll also soon be able to use community chats in both Facebook and Messenger, as well as create recurring events when they want to get together more regularly, whether it’s virtually or in-person.
The company is reportedly aware that “harmful topic communities” -- such as those dedicated to the QAnon conspiracy theory, COVID-19 denial, “Stop the Steal,” and anti-vaccine efforts -- can lead to offline violence or harm. But Facebook has repeatedly failed to remove many groups that seemingly violate its policies related to COVID-19, vaccine, and election misinformation -- even when the violative groups are brought to its attention. In fact, Media Matters reported on over 1,000 groups dedicated to such misinformation that were active as of last week.
Facebook already gives administrators substantial responsibility, which some abuse. These administrators often use evasion tactics to avoid content moderation and encourage members to do the same. Such tactics include using code words, creating backup groups, building networks of groups, switching to more innocuous group names, and moving more extreme content either into the comments or onto alternative platforms like Telegram, MeWe, and Gab. Administrators acknowledge that Facebook has the biggest reach, so it is important for them to maintain a presence on the platform, even if more incendiary content has to live on other platforms with even less policy enforcement.
As but one example, The Unvaccinated Arms group has had at least four iterations, with Facebook removing and reinstating at least one group and eventually removing all restrictions on it. Although two versions of this group are now archived, there are two active private versions that encourage their over 36,000 combined members to use a Telegram channel for any links so that the group can “stick together and grow” without raising suspicion on Facebook.
Administrators are also able to make groups “private” and even “hidden” -- which renders them invisible to anyone not already a member -- making them harder to monitor and moderate. (Tools that researchers use to analyze content on the site, such as CrowdTangle, include only publicly available posts from public groups, pages, and profiles.)
Giving groups and administrators more tools and features seems poised to only make things worse.