Far-right militias are returning to Facebook — and it’s looking the other way

Inattention from the platform's parent company, Meta, risks abetting another event like Jan. 6.

It’s happening again: Armed, antigovernment militias and other like-minded groups are organizing and recruiting on social media at a scope and pace not seen since the lead-up to Jan. 6. This time, according to a startling report by journalist Tess Owen for Wired magazine last week, Meta’s Facebook platform is their haven of choice.

I say “again” because we’ve been here before. Work done by the House Select Committee on the January 6 Attack demonstrated the breadth of social media’s contribution to rallying people to Washington, D.C., and stoking the violence and mob mentality that day. While only a hint of the Jan. 6 committee’s work was included in its public report, committee investigators prepared a 122-page memo detailing their research and their interviews with workers at Twitter, Facebook, YouTube and other platforms. According to a draft version of the report, first revealed by The Washington Post, the investigators’ key finding was that those tech companies ignored their own employees who raised red flags about increasingly violent chatter. Is history repeating itself — and if so, why Facebook, why now, and what can be done?

Facebook’s mainstream credibility and visibility provide extremist recruiters with a wider audience.

Three years after the Capitol insurrection, extremist, far-right militias — many already banned by Facebook — are back on the platform. This time, they appear more organized, smarter and emboldened by Meta’s indifference toward enforcing its previous bans. The groups are quite open about their niche ideologies, such as antigovernment Three Percenter beliefs, and about advertising meetups and combat training to prepare, as one recruiter wrote, for “what’s coming.” Whatever that is.

Wired reports that about 200 antigovernment, far-right extremist Facebook groups and profiles are using the platform’s “Groups” function to support or plan militia activity across the U.S. A typical example is the Free America Army, with 650 members. It uses the logo of the Three Percent militia network — an image of a man in tactical gear brandishing a long gun. These aren’t your garden-variety, weekend-warrior, beer-and-BBQ buddies shooting at cans in the woods. The group has drawn in members from the Kentucky Three Percenters, the Virginia Liberty Guard, and the Guardians of Freedom — a group with members arrested for their roles on Jan. 6. Despite being previously banned by Meta, Free America Army leaders have returned to Facebook, where they also serve as administrators of a larger, public group called Freedom Across America, with 2,000 members.

Why Facebook? As explained in the Wired piece, Facebook’s mainstream credibility and visibility provide extremist recruiters with a wider audience. Further, the Facebook group space helps previously decentralized militias coalesce and form networks with one another.

Katie Paul, director of the Tech Transparency Project, told Wired she has monitored hundreds of militias and related groups for the last three years and finds them “increasingly emboldened with more serious and coordinated organizing in the past year.” (TTP is a research initiative of Campaign for Accountability, a nonprofit watchdog group.) She notes, “Facebook remains the largest gathering place … where they can plan and coordinate with impunity.”

It’s the impunity part that’s as troubling as the networking and organizing. Facebook claims that it “bans paramilitary organizing,” and it listed the Three Percenters as “an armed militia group” alongside dozens of other militia groups on its 2021 “Dangerous Individuals and Organizations” list. The platform has recently removed some violent extremist groups, and it told Wired that it is “removing groups and accounts that violate our policies.” But in the past 12 months, other banned groups, like anti-immigrant border militias and the Boogaloo movement, have resurfaced on Facebook.

Perhaps, as election fervor ramps up, Meta executives think it best to simply stay out of the content moderation business.

Facebook is clearly not rising to meet the threat — but why not? Meta earns billions in profits and is loaded with smart people who know everything there is to know about social media, so it doesn’t ring true when the company claims that “we keep investing heavily in people, technology, research, and partnerships to keep our platforms safe.” It hardly seems coincidental that militias are returning to the site a year after Meta let go of more than 200 employees tasked with moderating content. Such cuts make it clear where the company’s priorities lie. And last week came reports of imminent layoffs at the Meta-funded Oversight Board, which reviews content moderation decisions across the company’s platforms.

Why is Meta seemingly backing off moderation now, in the face of warning signs and indicators of violence ahead of a presidential election? In part it’s precisely because of that upcoming election. After the 2016 election, social media companies were accused of allowing Russian disinformation designed to benefit the Trump campaign. Four years later, those platforms were accused of silencing the infamous Hunter Biden laptop story at the behest of the FBI and/or the Democratic Party. Meta CEO Mark Zuckerberg said in a podcast interview, “The background here is that the FBI came to us — some folks on our team — and was like, ‘Hey, just so you know, you should be on high alert. We thought there was a lot of Russian propaganda in the 2016 election, we have it on notice that basically there’s about to be some kind of dump that’s similar to that.’”

Zuckerberg conceded that the bureau never mentioned Hunter Biden’s laptop, but said that Facebook made a mistake in taking down the story: “When we take down something that we’re not supposed to, that’s the worst.”

Decision-making at Facebook is complex and layered with often competing objectives. Security can conflict with profit. Content moderation can dampen clicks and followers. It’s also quite possible that Facebook is once bitten, twice shy. Perhaps, as election fervor ramps up, Meta executives think it best to simply stay out of the content moderation business, lest they once again be accused of favoring one party or another.

But sitting idly by while armed militias prepare for battle isn’t a safe choice. It’s a dangerous decision that will ultimately be bad for business — and even worse for America.