
Social media companies must use AI to thwart extremist content, experts tell congressmen


Social media companies need to step up and use the technology they already have to keep extremist groups from spreading misinformation and conspiracy theories online, experts said Wednesday during a U.S. House Homeland Security Committee virtual forum on extremism during the COVID-19 pandemic.

And as extremists from both the left and right spread their messages of discord, legislators also need to take action, witnesses Jonathan Greenblatt, CEO and national director of the Anti-Defamation League, and Ali Soufan, CEO and chairman of The Soufan Group, said.

“Extremists always have capitalized on times of crisis and uncertainty, as we’ve seen in recent months, and the era of COVID-19 is no different,” Greenblatt said. “White supremacists and other trolls are kind of innovating the use of this technology in order to spread online and offline threats. Extremists are exploiting this moment.”

Soufan said efforts to contain COVID-19 would make it harder for the United States to fight ISIS and other groups because of the misinformation coming from various extremist groups. Additionally, extremist plots against facilities and infrastructure present another clear threat to the country. Soufan noted that anti-5G groups have attacked cell phone towers, while white supremacist groups have planned attacks on assisted living facilities and hospitals in Missouri and Boston.

U.S. Rep. Max Rose (D-NY), chairman of the Intelligence and Counterterrorism Subcommittee, and U.S. Rep. Emanuel Cleaver (D-MO), a member of the Homeland Security Committee, asked the witnesses what social media companies should be doing to stop the spread of misinformation and hate speech coming from extremist groups.

Greenblatt said that existing legislation exempts social media companies from liability even when their platforms help spread the views of extremist groups.

Social media companies need to innovate against extremism by using the technology they already have, such as artificial intelligence, to stop the spread of hate-filled misinformation, Greenblatt said. Companies also need to stop allowing extremist organizations to monetize hate by pulling their advertising revenues. Greenblatt said he would like to see a seven-second delay on content, which would give social media companies time to pull inappropriate material before it spreads.

Soufan said there were steps the government could take too, including aggressively using sanctions on individuals and organizations that are engaged in malicious disinformation campaigns.

First, though, the government needs to acknowledge that misinformation and conspiracy theories spread through social media channels are a real threat to Americans, Soufan said.

“What social media companies are doing is normalizing hate, and allowing these groups to capitalize on each other’s conspiracy theories, piggy-backing on them in an echo chamber,” Soufan said.

Rep. Rose has worked to make technology companies do more to thwart terrorism's spread on social media. In November 2019, he introduced legislation known as the Raising the Bar Act, which would establish an exercise program in which online terrorist content is flagged for removal by social media companies, testing how effectively those companies eliminate that kind of content from their platforms within 24 hours. He also previously worked with social media companies to formalize the Global Internet Forum to Counter Terrorism as a non-profit organization and pushed Facebook to ban extremist 4chan links on its platform.

“Terrorists’ use of the internet has been magnified by this public health crisis, with social media and gaming platforms increasingly being used for recruitment and propaganda here and around the world — particularly as people spend more and more time online at home,” Rose said.

Witnesses also testified that more needs to be done to ensure law enforcement has access to information while protecting individual citizens' rights. Terrorists and others can currently use Facebook's Oculus virtual reality platform to hold virtual training sessions, the witnesses said.

“That’s not something we’re theorizing – this is happening,” Soufan said. “If we keep looking the other way, it’s going to happen more and more.”