Tom Alison, VP of Engineering for Facebook, posted a new change to the social media platform on the site’s Newsroom website on Wednesday. In order to curb what Facebook calls “hateful content” and “misinformation” being shared and discussed online, the company has made changes that will affect Facebook Groups and their members.

The update reads: “Today we’re sharing the latest in our ongoing work to keep Groups safe, which includes our thinking on how to keep recommendations safe as well as reducing privileges for those who break our rules. These changes will roll out globally over the coming months.”

This attempt to police groups applies not only to public groups but also to private groups on Facebook. AI already patrols the site nonstop, along with more than 30,000 people constantly reviewing flagged or reported content, but now even more censorship will be applied by hiding groups from the general public.

Recommendations for “low-quality groups” will now appear lower on the “Recommended Groups” list, so that fewer people will discover they exist on the platform, although the groups will still be searchable, according to Alison.

Given Facebook’s widespread censorship of conservatives, there will without a doubt be demand for a scale by which the public can understand what Facebook considers a high- versus low-quality group.

Facebook also recently removed civic and political groups from the Recommended lists.

“We are also adding more nuance to our enforcement. When a group starts to violate our rules, we will now start showing them lower in recommendations, which means it’s less likely that people will discover them. This is similar to our approach in News Feed, where we show lower quality posts further down, so fewer people see them.”

Source: Facebook Newsroom

As for those who repeatedly violate Facebook’s ever-changing rules, they’re going to pay a heavy price. How exactly are these violations policed? According to Facebook’s Community Standards page, violations can be picked up by AI or reported to Facebook by other users. Once a violation is flagged, the Facebook Community Operations team reviews the content and makes a determination.

Of course, this could all change at any time because… reasons.

“We believe that groups and members that violate our rules should have reduced privileges and reach, with restrictions getting more severe as they accrue more violations, until we remove them completely. And when necessary in cases of severe harm, we will outright remove groups and people without these steps in between.”

Violations on the platform are handled in a number of ways, including removing the content, slapping a warning label on it, fully disabling accounts, or contacting law enforcement agencies.

In an attempt to “minimize harm” and keep people “safe” on the platform, Facebook states that it has removed over 1.5 million pieces of content within groups it claims violated its policies on “organized hate,” and 12 million pieces of content within groups that it says contained “hate speech.” Both of these categories are arguably subjective, and the Supreme Court of the United States has ruled repeatedly that there is no “hate speech” exemption to the First Amendment.

Groups that Facebook has found to be “harmful” or claims “promote violence” (the Boogaloo Boys, QAnon-related groups, and “US-based militias”) were among those recently removed in droves from the platform. In contrast, violent Antifa-affiliated groups and groups affiliated with BLM remain, neither removed nor flagged, where they are able to plan and coordinate upcoming events that have resulted in rioting, looting, violence, and the burning of buildings.

For group administrators who repeatedly violate Facebook’s Community Standards, Facebook will remove the ability to create new groups similar to ones it has already deemed “dangerous” by its own evolving standards. Admins will also be unable to invite new users to the group once the group or an admin has violated Facebook’s rules. Admins found to be in violation will have to approve each post, and if any approved content violates the rules, Facebook will remove the group altogether.

Users who want to join a Facebook Group they find interesting, but which has violated the platform’s policies in the past, will now be greeted with a warning label explaining that the group has at some point violated Facebook’s Community Standards.

It seems as if each day Facebook comes out with a new way for users to potentially violate its Community Standards or TOS, while other, even more extreme content is allowed to remain and flourish. Unless challenged, Facebook will continue to tighten its grip on those with dissenting voices and those who want to gather online in the digital public square to exchange ideas and discuss topics that Facebook finds objectionable and hateful.

Haley Kennington