The second impeachment trial of former president Donald Trump opened with a video montage, mostly drawn from social media accounts, showing the Jan. 6 attack on the Capitol. As the trial has proceeded, we’ve seen the House impeachment managers make heavier use of social media posts, a trend likely to continue.
These many posts across multiple platforms show that the domestic terrorists who violently breached the Capitol that day have a deep, synergistic relationship with their social media outlets. The platforms provide amplification of extremist messaging, disinformation, affirmation and a sense of belonging — all aspects of the radicalization process. And for those platforms, increased membership and traffic mean increased revenue from sponsors and advertisers.
But the fact that social media played such a prominent role in the radicalization of white supremacists and far-right extremists that led to the riots that day also means that those same platforms can, and must, play a part in getting us to a better, safer place.
The same Congress that is using social media clips in its prosecution of a former president must also play a much bigger role in regulating those forums. Without jeopardizing free speech or civil liberties, social media companies can be an asset in the ongoing fight against domestic terror groups. And there needs to be accountability when they fail — which may involve Congress making companies criminally liable for aiding and abetting, through posts on their platforms, the kind of violence we saw at the Capitol riots.
In preventing what could become a permanent insurgency by a politically disenfranchised and disillusioned violent movement, we need a holistic, all-hands-on-deck approach across society.
Since we still don’t have a domestic terrorism law, state and federal law enforcement agencies won’t be the star players in the larger solution to our growing problem online. Educators, legislators and businesses must all do their part in asserting a commitment to truth, transparency and the core American values of our democracy — like the rule of law, our Constitution, free and fair elections, and three equal branches of government.
Most of all, Big Tech social media platforms must lead the initiatives to take back control of the amplified echo chambers of violent extremism.
There is developing evidence that Big Tech leaders, if only out of pure survival instinct, increasingly understand that they must step up and own their role in having gotten us here, and carve out a space to help address the threat they likely helped facilitate.
During an election security news conference in 2020, FBI Director Christopher Wray cited his agency’s partnership with social media providers in the fight against election disinformation and foreign adversaries. Twitter and the FBI partnered to quickly take down 130 fake accounts linked to Iran that were suspected of trying to influence public discussion about the September 2020 Biden-Trump presidential debate as it was happening.
As for the domestic terrorism threat, it looks like a similar, but perhaps quieter, working relationship with law enforcement has already begun. In June 2020, Facebook took down accounts of two organizations it considers to be hate groups: the Proud Boys and the American Guard. That same month, Facebook removed hundreds of Boogaloo Bois accounts from its platform. One month later, the platform banned QAnon.
In the U.K., Facebook partnered with the police to add video of firearms training sessions into its algorithm to help identify violent attacks in its own system, with the protocol to take the videos down and notify authorities. And, late last year, we learned that Facebook had approached the FBI with its concerns about members of a Michigan "militia," whose conduct eventually resulted in 13 men being charged with planning to storm the state Capitol and kidnap the governor. Even Parler, the “alt-tech” social media platform favored by Trump supporters and the far right, reportedly provided information to the FBI about at least one Capitol riot suspect.
In fact, a review of the Capitol riot criminal cases by the George Washington University Program on Extremism warrants our attention: In over 150 of the roughly 200 arrests in connection with the riot, law enforcement included social media references as part of the criminal evidence.
The prevalence of social media content in the evidence used against the insurrectionists cuts two ways. It shows social media was part of the problem, but it also signals that, because information consumption via posts and tweets is so central to these people, those same platforms can be useful — if they want to be.
Whether driven by capitalistic motives, a desire to be good corporate citizens, or both, when it comes to domestic terrorism, we need social media providers to be less a part of the problem and more a source of the solution. But we shouldn’t trust just the private sector, or law enforcement, or a combination thereof, to weigh and preserve the myriad civil liberty and privacy concerns as they move forward. More is needed — particularly in the form of regulation and legislation. Here’s what that should look like.
Congress must set enforcement standards that social media companies are required to follow. Legislation should also hold these companies accountable for not only the dangerous content they stop, but for the bad content they miss. If the government can regulate airlines and track their accidents and “near misses,” the same can be done for an industry that could imperil our democracy. None other than Facebook CEO Mark Zuckerberg has advocated for more regulation of his own industry — perhaps it’s time Congress took him up on it.
Congress should also weigh in with laws aimed at preserving free speech rights in cyberspace so that neither the platforms nor the police can forge an unholy alliance that tramples the Constitution in the name of security. These laws should create uniform standards of acceptability so that content that gets you booted from Twitter also gets you suspended from Parler and the others. We shouldn’t allow terrorists to shop for the most permissible forum.
Let’s establish our community standards and protections now, before the good intentions of Silicon Valley and law enforcement — or the threat we’re trying to stop — get out ahead of us.