Friday, March 29th, 2024

Congressional scrutiny of social media company moderation grows in wake of U.S. shootings

Although the Internet was once deemed one of the last frontiers, its uses have changed and multiplied over the years. Now, a group of 13 members of Congress has written to social media companies asking for evidence of how they respond to violent threats and harmful content.

Traditionally, these companies have operated much like the town halls and forums of old, serving as venues where others exercise free speech without being held responsible for the content of that speech. That position has come under increased scrutiny in the past decade, as lawmakers and the public have drawn connections between violent content online and violent incidents in the physical world, with shootings at the top of the list.

In their letter, the lawmakers pointed to high-profile cases such as those in Highland Park, Ill.; Uvalde, Texas; and Buffalo, N.Y., where shooters assailed a crowd, an elementary school, and a grocery store, respectively. Though these have dominated the headlines in recent days, there have been more than 350 mass shootings in the United States this year.

“As Members of Congress, we continue to grieve the lives lost and the tragic impact to our communities, but it is also a call to action,” the representatives wrote. “As representatives, we must always be asking whether there are ways to prevent mass shootings like this. The more that we learned about the shooters, the more we learned that there were warning signs which, if properly heeded, may have prevented these tragedies. Sources revealed that the shooter in the Highland Park incident frequently used the social media platform, Discord, to share violent materials, including videos depicting him committing mass murders. The shooter in Buffalo also used Discord to document his plans for the shooting and live streamed it on Twitch.”

Shooters, they added, often post clues to their violent intentions online, but even when that content is reported, companies in many cases fail to act. YouTube, Twitter, Meta, Discord, TikTok, Twitch, and others were all singled out by the writers as platforms where users have flagged troubling content.

With that in mind, they asked representatives of the companies to describe their content moderation policies and dedicated staffing, their procedures and turnaround times for dealing with flagged content, how often cases have been referred to local law enforcement, and what actions they have taken to improve their reporting mechanisms and moderation in the wake of recent mass shootings, hate crimes, and other attacks.

The effort was led by U.S. Reps. Brad Schneider (D-IL), Tony Gonzales (R-TX), and Brian Higgins (D-NY).