U.S. DEPARTMENT OF JUSTICE

Section 230 — Nurturing Innovation or Fostering Unaccountability?

SUMMARY OF PUBLIC WORKSHOP AND PRIVATE ROUNDTABLE

June 2020

SUMMARY OF PUBLIC WORKSHOP PANELS

Panel 1 – Litigating Section 230

The first panel, moderated by Principal Deputy Associate Attorney General Claire McCusker Murray, focused on the state of the law at the time Section 230 was enacted, early cases interpreting the statute, and the challenges that litigants face today.

Professor Jeff Kosseff opened the panel by describing the dilemma early platforms faced: moderate content and risk being held liable for content posted by third parties, or avoid moderating and risk having the platform overrun with obscene or defamatory content. Two cases highlighted this dilemma. In 1991, in Cubby v. CompuServe, the victim of a defamatory posting sued CompuServe. The court found CompuServe to be a content distributor that did not review postings before publishing—the online equivalent of a bookstore—and thus not liable. But in 1995, in Stratton Oakmont v. Prodigy, Prodigy was found liable as the publisher of all content on its platform—the equivalent of a book publisher that made editorial decisions—because it moderated some content. Section 230 addressed this problem by providing immunity to internet platforms for hosting third-party content, and for removing third-party content in certain circumstances. In Professor Kosseff's view, Section 230 helped create the modern Internet by allowing new platforms to thrive without fear of liability. Professor Ben Zipursky described Section 230 as the digital equivalent of a Good Samaritan law, which precludes tort liability for those who help passers-by during emergencies.

Patrick Carome, an attorney who has represented various online platforms, described the first major case applying Section 230, Zeran v. America Online. AOL was sued by an individual falsely accused of selling t-shirts mocking the Oklahoma City bombing, but AOL invoked Section 230 and was found not liable. The Fourth Circuit interpreted Section 230 immunity broadly, and the decision continues to shape how courts view Section 230.

Professor Zipursky argued that subsequent cases have improperly expanded Section 230. One case he highlighted was Batzel v. Smith, in which, he argued, the Ninth Circuit made "a serious error when it stated that the active/passive distinction is irrelevant to Section 230; that reposting what someone else wrote is immunized by Section 230." According to Professor Zipursky, republication of defamation is a distinct tort that was not intended to be immunized under Section 230.

Carrie Goldberg, an attorney representing cyberstalking victims, also criticized the reach of Section 230. Ms. Goldberg described the challenges one of her clients faced when stalked and harassed by his ex-boyfriend through the Grindr app. Because the claims arose from content posted on the app by third parties, Grindr was immunized from tort liability, even though, Ms. Goldberg argued, Grindr had the means to help stop the harassment. Annie McAdams, an attorney who represents sex trafficking victims, shared similar stories and discussed a case currently on appeal in Texas in which the trial court initially rejected Facebook's claim of Section 230 immunity. Ms. McAdams argued that if Section 230 were read narrowly and more cases were allowed to proceed to discovery, platforms would behave differently. Other panelists, however, argued that weakening or narrowing Section 230 would simply lead to more litigation and more expense, with little gain for victims.
Panel 2 – Addressing Illicit Activity Online

The second panel, moderated by Assistant Attorney General Beth Williams, discussed whether Section 230 encourages or discourages platforms from addressing online harms such as child exploitation and terrorism, and its impact on state and federal law enforcement. The panel included victims' rights advocates, law professors, a state attorney general, and a tech industry representative.

The victims' rights advocates described the many online harms in today's world. Yiota Souras, Senior Vice President of the National Center for Missing & Exploited Children (NCMEC), discussed how the problem of child exploitation continues to grow: in 2019, her organization received nearly 17 million reports of suspected child sexual abuse material (CSAM), including over 69 million files with videos and images. Professor Mary Anne Franks addressed the problem of nonconsensual intimate imagery, sometimes referred to as "revenge porn," and how Section 230 provides immunity against state laws for companies that host this and other harmful material. Nebraska Attorney General Doug Peterson discussed how some platforms voluntarily provide state authorities with good cooperation, but argued that state attorneys general need an exception to Section 230 immunity so they have the power to protect their citizens against an acceleration of online crime and to act as a complement to federal enforcement.

Matt Schruers, the President of the Computer & Communications Industry Association, said his clients do not want their services used for illegal purposes. According to Mr. Schruers, bad actors who solicit or participate in illegal activity do not have Section 230 protection and can be prosecuted. The industry is devoting tremendous resources to trust and safety programs, but he acknowledged that more investment can and should be made. He also called for additional resources for law enforcement to prosecute more cases. Ms. Souras explained how some of the top few companies are partners in the fight against CSAM and do tremendous work, while others recklessly look the other way. Professor Kate Klonick noted that large companies are addressing problems on their platforms because maintaining their brands and keeping advertisers incentivizes them to remove bad content. Professor Franks stated that the law, not just fear of bad publicity, needs to provide incentives for companies to be Good Samaritans.

The panel also discussed encryption and its potential impact on internet crime. Mr. Schruers said encryption is needed to protect against fraud and against foreign adversaries who target protestors and opponents. A balance needs to be struck, he admitted, but encryption is a critical tool. Ms. Souras agreed there has to be balance, but, citing the 12 million reports of CSAM her organization received from Facebook last year, she expressed concern that end-to-end encryption would mean reports to NCMEC would dry up and fewer children would be saved.

The panel concluded by discussing whether Section 230 should be amended to reset incentives to address these harms. Mr. Schruers observed that Section 230 does not prevent state law enforcement from pursuing perpetrators of crime on the internet; it only prevents actions against the platforms. Attorney General Peterson ended the panel by asking for the ability to enforce state criminal laws against platforms, and implored lawmakers to allow state attorneys general to go after bad actors and clean up the industry instead of waiting on the industry to clean itself up.
Panel 3 – Imagining the Alternative

This panel, moderated by Associate Deputy Attorney General Ryan Shores, examined the implications for competition, innovation, and free speech of different approaches to addressing the problems with Section 230 discussed in the earlier panels. The panelists were Professor Eric Goldman (Santa Clara University), Neil Chilson (Charles Koch Institute), Julie Samuels (Tech:NYC), David Chavern (News Media Alliance), and Pam Dixon (World Privacy Forum).

Professor Goldman and Mr. Chilson set the stage by briefly outlining how Section 230 functions and the proposals being circulated to address problems. Professor Goldman highlighted the difficulty of defining third-party content versus first-party content and of distinguishing content from conduct. Mr. Chilson noted the challenges of creating incentives to moderate, protecting speech, and addressing concerns about over-removal of content, all while continuing to encourage technical innovation and economic growth.

Mr. Chavern described Section 230 as a market distortion, arguing that platforms have extraordinary editorial control in their commercial decisions and use of algorithms to decide what content gets distributed to whom and how it is tied to advertising. Ms. Samuels argued, on the other hand, that Section 230 is vital to allowing parties to compete in the online environment, where media allows, for the first time in history, distribution of ideas from "many to many," in contrast with traditional media, which is "one to many." Mr. Chilson noted that intermediary liability is unusual in our legal system and that, since Section 230 does not change liability for the content creator, it embodies a principle of personal responsibility. Professor Goldman explained that Section 230 solves the moderator's dilemma, in which one who tries and fails to filter out harm is liable for what one missed, and that this benefit is key to keeping markets open and allowing for innovation. Ms. Dixon advocated for the use of voluntary consensus standards as an approach to solving discrete, observable, definable problems in the online world. Mr. Chavern contended that we should focus on how to build back systems that incentivize quality while taking action to address abuse online. Professor Goldman suggested that any reform must weigh the costs against the many benefits that Section 230 provides.

Participants expressed concern about a flat carve-out for smaller businesses: Mr. Chilson suggested it might incentivize consolidation, Professor Goldman noted that size can be difficult to define, and Ms. Dixon had concerns about unintended consequences. Mr. Chavern contended that the largest platforms create the greatest risks and have the most resources, so starting with them represents a feasible, incremental approach.

With respect to First Amendment issues, Mr. Chavern said that "freedom of speech is not freedom of reach," meaning that a user's free speech is distinct from a platform making that speech reach millions of people. Amplifying unlawful speech, he suggested, is a separate act for which platforms should be held accountable. Professor Goldman argued that Section 230 is a positive tool that promotes speech and enables platforms to solve online harm.

SUMMARY OF THE AFTERNOON ROUNDTABLE

The afternoon roundtable convened a robust group of academic scholars, technology experts, civil society representatives, and industry representatives to discuss whether and how Section 230 could be reformed to mitigate the undesirable consequences discussed in the morning panels while preserving the law's benefits.
The morning panelists also attended the roundtable and participated at certain points in the discussion.

The first afternoon discussion was titled "Content Moderation, Free Speech, and Conduct Beyond Speech." Section 230 has generated complaints both that it allows online platforms to engage in too much censorship and that it does not require them to remove enough content. On the one hand, some fault Section 230 for allowing platforms to engage in politically driven censorship, removing content on politically or socially charged issues that expresses viewpoints disfavored by the platforms and their employees. On the other hand, some fault Section 230 for permitting platforms not to remove objectionable and even unlawful content, including content that is defamatory or that promotes violence or sexual exploitation. This roundtable discussion explored the bases for these criticisms as well as the benefits and pitfalls of proposed solutions.

Building on the morning discussions, the roundtable focused on two big issues. First, beginning with the problem of online defamation, participants debated whether it would be preferable to adopt a liability regime for platforms different from the absolute blanket immunity conferred by Section 230. Although some argued for a distributor liability regime, under which platforms would be liable if they knew or had reason to know that third-party content was defamatory, the weight of opinion was that such a regime would be unsound with respect to defamation. Participants emphasized the difficulty of determining whether speech is libelous and the risk of abuse of notice-and-takedown procedures.

Second, the roundtable addressed whether online platforms are engaging in content moderation based on political viewpoints under the shield of Section 230 immunity. Several participants expressed concerns that major online platforms moderate content with a bias against politically conservative speech. A number of participants highlighted the lack of rigorous data to substantiate that claim. In addition, other participants noted that many different groups, representing diverse and sometimes conflicting viewpoints, believed that they were victims of biased moderation. Participants suggested that these feelings may reflect the fact that, given the scale of large platforms, there may be many anecdotal examples that individuals can point to as evidence of bias, even if those examples represent a small fraction of overall moderation decisions. There was considerable support for greater transparency into how platforms enforce their content moderation rules to help address concerns over bias. Participants suggested that such transparency would be valuable in increasing understanding and building trust. At the same time, some noted that actual restrictions on how platforms moderate content could impinge on the platforms' First Amendment rights.

The second discussion of the afternoon roundtable was titled "Addressing Illicit Activity Online and Incentivizing Good Samaritans." Victims of nonconsensual intimate imagery, dating app stalking, online child sexual abuse material, and terrorism complain that Section 230, as currently written, both fails to incentivize companies to take down abusive or violent material and enables bad actors to proliferate these types of material with minimal consequence. Well-publicized cases include an individual who impersonated his former partner on dating apps and facilitated unwanted sexual and violent behavior toward that individual; women whose former partners have posted pornographic images of them online without their consent; and online forums that have enabled foreign terrorist organizations to target American citizens for recruitment and to organize acts of terror.
This roundtable topic began with a series of hypothetical scenarios designed to gauge the scope of Section 230 today, including the interaction of Section 230 with federal and state criminal law. The panel discussed incentives for companies to monitor and potentially take down content being posted online, as well as the complications companies face in taking down yet storing abusive material for investigative purposes, including in light of emerging privacy laws that demand deletion of certain data upon request. The panel also discussed the efficacy of existing sectoral frameworks designed to intercept criminal activity, such as "Know Your Customer" banking regulations. Finally, the panel discussed some potential changes to Section 230, proposed by experts and others, that would re-scope the law or provide additional incentives for tech companies to be Good Samaritans.