Posts Under: section 230

Congress Must Continue to Protect Good Samaritans Who Engage in Online Content Moderation

Today, the Senate passed legislation, the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA, H.R. 1865), that would give law enforcement and victims much-needed legal tools to prosecute online traffickers and rogue websites. SIIA congratulates Congress for its bicameral, bipartisan effort to enable full prosecution of those involved in sex trafficking, but we remain concerned about the potential unintended outcomes of any legislation that limits critical internet speech protections provided under Section 230 of the Communications Decency Act (CDA 230). CDA 230 enables many platforms and websites to work closely with law enforcement and to partner with other companies and outside groups to share signals of illegal activity, particularly acts of human trafficking. It has also become common among large internet platforms to invest in new technologies, such as machine learning, to proactively police and moderate content. Over decades, CDA 230 has pr ...


What are the Responsibilities of Tech Companies in an Age of International Terrorism?

Yesterday, at George Washington University, an energetic panel of government officials, scholars and policy advocates from business and civil society discussed the role of tech companies in an age of international terrorism. There is more to this thorny issue, but the panel began with a good outline of the issues at stake. The panel met at a sad time, as the world mourns the loss of life from the attacks in Brussels. Coming after attacks in Paris, San Bernardino and Istanbul, we are clearly at a critical juncture in the struggle against violent extremism. That made the panel's topic tragically timely and relevant. So what are the responsibilities of tech companies in an age of international terrorism? I'd say that they have three:


Social Media Should be Section 230 “Good Samaritans” in an Age of International Terrorism

This past Friday, February 5, Twitter announced, in a tweet, of course, that it had shut down more than 125,000 terrorism-related accounts since the middle of 2015, most of them linked to the Islamic State. The social media site removes accounts that are reported to it, and it also uses spam-fighting tools to identify and take down other violent accounts. It works with public interest groups to encourage counter-speech and reports violent accounts to the government.