Days before a pro-Trump mob stormed the US Capitol, social media posts telegraphed the deadly January 6 riot.
“You better be ready, chaos is coming and I’ll be in DC on 1/6/2021 fighting for my freedom!” Maryland resident Andrew Ryan Bennett wrote in a public Facebook post on January 4, 2021, using the hashtag #StopTheSteal. Two days later, Bennett broadcast live video to Facebook from inside the Capitol. The footage showed Bennett, wearing a baseball cap with a Proud Boys slogan, chanting “Break it down!” outside the Speaker’s Lobby door, where a woman was shot and killed, according to the FBI.
Bennett was one of more than 700 people charged by federal prosecutors with crimes related to the January 6 attack on the Capitol. Some of them documented their participation in the melee on platforms like Facebook and Google-owned YouTube.
A year after the insurrection, lawmakers, researchers and journalists are still examining the role social networks played in an attack that left five people dead. Members of Congress have criticized the companies for downplaying their part in the riot. Prompted by the attack, social networks have taken a harder look at how they handle misinformation spread by politicians and public figures.
An investigation by ProPublica and The Washington Post published Tuesday found evidence that Facebook played a “critical role in spreading lies that fomented the January 6 violence.” At least 650,000 posts in Facebook groups attacked the legitimacy of Joe Biden’s presidential victory over Donald Trump, and many called for political violence, the publications reported.
Trump and his supporters continue to peddle baseless claims on social media that the election was stolen from him. Concerned about the risk of violence, Facebook, Twitter and other social media sites last year took the rare step of kicking Trump off their platforms.
On Thursday, Biden and Vice President Kamala Harris both spoke about the attack on the Capitol, calling on Americans to face the truth about the “brutal attack” that took place a year ago.
“We need to be absolutely clear about what is true and what is a lie,” Biden said, speaking from the US Capitol’s Statuary Hall. “Here is the truth. The former President of the United States of America created and spread a web of lies about the 2020 election.”
During his remarks, Biden also criticized Trump for his inaction on Jan. 6 as well as his role in inciting the mob. “They came here angry,” Biden said. “Not in the service of America, but rather in the service of one man.”
Trump canceled a scheduled press conference Thursday at his Mar-a-Lago resort, but a rally is scheduled for Jan. 15 in Arizona. Here is an overview of the impact of the attack on social networks:
Social media sites are changing their policies and rolling out new tools
Trump’s indefinite suspension from Facebook on Jan. 7, 2021, forced the social network, which rebranded itself as Meta in October, to examine how it moderates speech posted by public figures. Trump had 35 million followers on Facebook and 24 million on Instagram, the photo service owned by the social media giant.
In June, Facebook rolled out new enforcement protocols for content posted by public figures during times of civil unrest and violence. Facebook made the changes after its semi-independent oversight board upheld Trump’s suspension but noted in its decision that the company’s content policies don’t outline indefinite suspensions. Facebook later clarified that Trump would be suspended for two years, and the social network said it would assess the risk of violence near the end of the suspension period, which runs until at least January 2023.
Despite Trump’s suspension, Media Matters for America said Thursday that the former president’s January 6 posts continue to draw interactions and that his fundraising committee has run hundreds of ads.
The social network also said it will provide regular updates in 2022 on when it leaves up rule-breaking content because it’s newsworthy, and that it no longer presumes speech from politicians is inherently in the public interest. Before Biden’s inauguration, Facebook said it removed content containing the phrase “stop the steal,” banned US ads promoting weapon accessories and protective gear, and blocked the creation of new events taking place near the Capitol, among other steps.
Trump had a much larger presence on Twitter, with nearly 89 million followers. Twitter permanently suspended the former president for violating its rules against glorifying violence. Also in response to the attack, Twitter stopped allowing people to reply to, like or retweet tweets that violated its updated civic integrity policy, and it permanently suspended thousands of accounts that primarily shared QAnon content.
The company then began asking the public if they thought world leaders should be subject to the same rules as other users. It has launched a pilot program called Birdwatch where users can identify tweets they think are misleading and add more context and has teamed up with Associated Press and Reuters to elevate credible information on the platform.
“Our approach, before and after January 6, has been to take strong enforcement action against accounts and tweets that incite violence or are likely to cause harm offline. Engagement and attention from government, civil society and the private sector are also essential. We recognize that Twitter has an important role to play and we are committed to doing our part,” a Twitter spokesperson said in a statement.
US lawmakers still want answers from social media
In March, U.S. lawmakers quizzed Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai, and then-Twitter CEO Jack Dorsey on a variety of topics, including the insurrection.
“I think the responsibility here lies with the people who took the actions to break the law and carry out the insurrection,” Zuckerberg said.
While Zuckerberg downplayed Facebook’s role, Dorsey acknowledged that Twitter played a part. Rep. Mike Doyle, a Democrat from Pennsylvania, asked the executives to answer “yes” or “no” to a question about whether their platforms fueled the spread of misinformation and the planning of the Capitol Hill riot.
“Yes,” Dorsey answered. “But you also have to consider the broader ecosystem. It’s not just about the technology systems that we use.”
In August, a House of Representatives select committee investigating the Jan. 6 attack requested records from 15 social media companies, including Facebook, Twitter, Reddit, TikTok, YouTube, Gab and Parler. As part of the request, the committee also asked the companies about any policy changes made to address misinformation, messages condoning violent extremism and other offensive content.
The committee did not respond to a request for comment on the investigation.
Facebook whistleblower comes forward
Criticisms of Facebook’s role have also come from within the company. Internal documents collected by former Facebook product manager Frances Haugen, acting as a whistleblower, shed more light on the social network’s response to the January 6 attack. Facebook employees felt the company had not done enough to crack down on misinformation ahead of the 2020 U.S. presidential election, the documents show.
A complaint filed on behalf of Haugen with the United States Securities and Exchange Commission accuses Facebook of misleading public investors about its role in perpetuating misinformation and violent extremism related to the 2020 election and to the January 6 insurrection.
Haugen’s legal team disclosed redacted documents to Congress and the SEC. A consortium of news organizations, including CNET, also viewed the redacted versions.
In the complaint, Haugen accuses Facebook of failing to adopt or maintain measures to combat misinformation and violent groups, including content related to the January 6 insurrection, in order to “promote virality and growth on its platforms.” The social network, for example, could have done more to limit the sharing of posts containing misinformation, according to the complaint.
Since the documents were released, U.S. lawmakers and advocacy groups have pushed for more regulation, including a federal data protection law.
“It’s time for Congress to ensure that Facebook puts our safety before its profits,” said José Alonso Muñoz, deputy communications director for United We Dream, a nonprofit focused on the immigrant community.
CNET’s Carrie Mihalcik contributed to this report.