The TikTok ban decision handed down by the Supreme Court late last year is a clear violation of First Amendment precedent, but President Trump's refusal to enforce the ban is a constitutional crisis in the making. We brought Stanford Law professor, rising-star First Amendment scholar, and Moderated Content host Evelyn Douek on the […]
When someone disrespects you in a subtle way, with plausible deniability, it can do just as much damage as overt bigotry. So we're talking about microaggressions — what they are, and how science fiction explores them but also perpetuates them. And later in the episode, we talk to Stanford professor Evelyn Douek about what's next for content moderation on the internet.
From April 1, 2021: This week on Arbiters of Truth, the Lawfare Podcast's miniseries on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Issie Lapowsky, a senior reporter at the tech journalism publication Protocol. They discussed last week's hearing before the House Energy and Commerce Committee with the CEOs of Facebook, Google and Twitter—the first time the companies had been called to testify on the Hill since the Capitol riot, which focused public attention on tech platforms' content moderation policies around domestic extremism. The hearing produced some interesting takeaways, but also a lot of moments when the CEOs were awkwardly forced to answer complicated questions with a simple "yes" or "no." They also discussed Issie's reporting on how tech companies have struggled to figure out how to address far-right extremism in the United States as opposed to Islamist extremism. And they talked about Section 230 reform and what it's like reporting on the tech space.
From August 20, 2020: This week on Lawfare's Arbiters of Truth series on disinformation, Evelyn Douek and Quinta Jurecic spoke with Alex Stamos, the director of the Stanford Internet Observatory and former chief security officer of Yahoo and Facebook. Alex has appeared on the podcast before, but this time, they discussed a new coalition he helped set up called the Election Integrity Partnership—a coalition focused on detecting and mitigating attempts to limit voting or delegitimize election results. Disinformation and misinformation around the U.S. presidential election have already started popping up online, and they're only going to increase as November draws closer. The coalition aims to counter this in real time. So how will it actually work? They also asked Alex for his hot takes on TikTok—the popular video-sharing platform facing pressure over concerns about influence from the Chinese government.
Stanford's Evelyn Douek and Alex Stamos are joined by the Stanford Internet Observatory's Shelby Grossman to discuss SIO's just-released report on the Strengths and Weaknesses of the Online Child Safety Ecosystem. Read the report here. SIO is also calling for presentation proposals for its annual Trust and Safety Research Conference. Proposals are due April 30. Details are here: https://io.stanford.edu/conference
Stanford's Evelyn Douek and Alex Stamos are joined by University of Washington professor Kate Starbird to discuss research on election rumors. Kate Starbird is an associate professor at the University of Washington in the Department of Human Centered Design & Engineering, where she is also a co-founder of the Center for an Informed Public. - University of Washington
House Judiciary Committee Kate Starbird interview transcript
House Judiciary Committee Alex Stamos interview transcript
Sports Corner: Noted American sports expert Evelyn Douek discusses the NCAA women's basketball championship in this slam-dunk segment. Dawn Staley's South Carolina Gamecocks defeated superstar Caitlin Clark's Iowa Hawkeyes 87-75 on Sunday in what is expected to be the most-watched women's basketball game of all time, with an average ticket price hovering around $500. - Jill Martin/ CNN, Alexa Philippou/ ESPN
Stanford's Evelyn Douek is joined by Professor Genevieve Lakier of the University of Chicago Law School to discuss the Supreme Court oral arguments in Murthy v. Missouri. For one of their previous conversations on this topic, listen to this episode from September last year on the Fifth Circuit's decision in the case. They also discuss Stanford's amicus brief in the case, and the Stanford Internet Observatory's blog post summarizing factual errors that have pervaded the litigation.
Johanna speaks with Evelyn Douek, assistant professor at Stanford Law School (and former corporate lawyer in Australia), about the public and private regulation of speech online. The pair discusses:
the private and public regulation of speech online
why 'everything is content moderation' and what it might mean to take a systems-thinking approach
the importance of platform transparency
the history, significance, and politics of Section 230
institutional competence and the role of courts
the state of trust and safety
the value of a functioning parliament and the differences between Australia and the U.S. in making tech policy
At 21:11, Johanna refers to the Foreign Interference Committee. For clarification, this reference was made in relation to the Big Tech Inquiry of the Senate Economics References Committee, chaired by Senator Bragg (colloquially referred to as the social media foreign interference committee). This is not to be confused with the Select Committee on Foreign Interference through Social Media, chaired by Senator Paterson.
Relevant Links:
Evelyn Douek: https://www.evelyndouek.com/
Moderated Content podcast: https://law.stanford.edu/directory/evelyn-douek/moderated-content/
"Content Moderation as Systems Thinking," by Evelyn Douek, Harvard Law Review: https://harvardlawreview.org/print/vol-136/content-moderation-as-systems-thinking/
Casey Newton's Platformer newsletter: https://www.platformer.news/
The Washington Post's technology newsletter: https://www.washingtonpost.com/politics/the-202-newsletters/the-technology-202/
Rest of World: https://restofworld.org/
Follow:
Evelyn Douek on Twitter: @evelyndouek
Stanford Law School on Twitter: @StanfordLaw
Stanford Law School on LinkedIn: Stanford Law School (SLS)
Stanford's Evelyn Douek and Riana Pfefferkorn weigh in on the latest online trust and safety news and developments:
Update on last week's segment on Law Enforcement Data Requests:
California passed a law last year that seeks to block warrants requesting information about abortions from tech companies. - Andrea Vittorio/ Bloomberg Law
California lawmakers are looking at ways to stop dragnet reverse warrants and keyword search warrants. - Tonya Riley/ CyberScoop
The FTC Takes on Twitter:
The Federal Trade Commission is probing whether Twitter still has the staff and budget to comply with a 2011 consent decree for privacy and data protection standards and reporting. - Ryan Tracy/ The Wall Street Journal, Kate Conger, Ryan Mac, David McCabe/ The New York Times, Brian Fung/ CNN
House Republicans created an outrage fest about FTC investigations into Twitter's compliance with its consent decree. - Jared Gans/ The Hill, Emily Brooks, Rebecca Klar/ The Hill
Not to say "we told you so," but this FTC action was predicted in an episode last year, which still provides a good primer on Twitter's data security problems with the FTC. - Evelyn Douek, Whitney Merrill, Riana Pfefferkorn/ Stanford Law
House Republicans passed an anti-jawboning bill, H.R. 140, the Protecting Speech from Government Interference Act. Of course, it does not apply to Congress, and it faces long odds in the Senate. - Brian Fung/ CNN
Sens. Mark Warner (D-VA) and John Thune (R-SD) introduced the RESTRICT Act, which would give the Secretary of Commerce authority to ban technology products from companies with ties to foreign adversaries, including TikTok. - Brian Fung/ CNN, Brendan Bordelon, Gavin Bade/ Politico
Under SB 152, a bill that passed the Utah State Legislature and is soon expected to be signed into law, users can lose access to social media accounts for refusing to verify their age, and parental consent is required for children under 18 to create social media accounts. - Kim Bojórquez, Erin Alberty/ Axios
Twitter announced new enterprise packages for access to collect tweets through its API, with the lowest tier priced at more than $500,000 per year. - Chris Stokel-Walker/ Wired
More: Academics currently receive free access. Now, most if not all academics will be priced out of even the lowest tier of data access.
Should governments regulate how Facebook moderates speech? Can you sanction an automated smart contract that's used for international money laundering? Was it a coincidence that every social media platform banned Donald Trump at the same time? In the first part of our 4-part miniseries looking at trust online, we welcome evelyn douek, host of the… (71: Do We Trust the Internet? with evelyn douek and Primavera de Filippi, Trust episode 1)
From March 18, 2021: On this episode of Arbiters of Truth, the Lawfare Podcast's miniseries on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Daphne Keller, the director of the Program on Platform Regulation at Stanford's Cyber Policy Center and an expert on Section 230 of the Communications Decency Act, the statute that shields internet platforms from civil liability for third-party content on their websites. The statute has been criticized by both Democrats and Republicans, and both President Trump and President Biden separately called for its repeal. So what should we expect in terms of potential revision of 230 during the current Congress? What does Daphne think about the various proposals on the table? And how is it that so many proposals to reform 230 would be foiled by that pesky First Amendment?
As Justice Kagan has asked, "Every other industry has to internalize the costs of its conduct. Why is it that the tech industry gets a pass?" Yet she and the other eight Supreme Court justices seemed wary this week as they heard oral arguments in two cases that could upend the Section 230 immunity that social media companies enjoy, Gonzalez v. Google and Twitter v. Taamneh. Today, we hear from three experts: Stanford Law professor Evelyn Douek, National Constitution Center President and CEO Jeffrey Rosen, and UC Berkeley computer science professor Hany Farid. Up for discussion — what's at stake in these two cases, which way the wind seems to be blowing and, of course, will killing Section 230 kill the internet? Questions? Comments? Email us at on@voxmedia.com or find us on Twitter @karaswisher and @nayeema
Over time, the First Amendment has meant lots of different things to a lot of different people. In this episode, with University of Chicago law professor Genevieve Lakier by her side, host Evelyn Douek travels back to the time when modern free speech doctrine first started to emerge. Together they consider the values that have influenced how America thinks about free speech, and how these values came to shape the way American law approached regulating the internet back when very few people even knew what the internet was. We hear from someone who was there, Sen. Ron Wyden, now one of the most famous names in internet regulation, about Section 230, one of the most (in)famous online speech regulations out there. But the politics of online speech regulation have been changing, and things are really starting to get weird.
Thanks to the ruling in Knight v. Trump, then-President Trump could no longer block critics on social media. Hooray! But the ruling was only the start of the story, and new questions quickly arose. How would it affect other government officials? What might it mean for the development of the law more generally? Could the ruling be used in ways that the Knight Institute team didn't expect and doesn't agree with? In this episode, host Evelyn Douek is joined by Harvard Law professor Noah Feldman and former Twitter head of integrity Yoel Roth. Together they explore the ramifications of Knight v. Trump and ask: Did the case establish much-needed guardrails around free speech online, or is it starting us down a slippery slope that could fundamentally change how the First Amendment applies to social media platforms?
The harp sound effect used in this episode comes from SPANAC on Free Sounds Library, used under a Creative Commons Attribution 4.0 International license.
What is Twitter (or any social media platform) as a matter of First Amendment law? In the first of five episodes, host Evelyn Douek begins to crack open this question, starting with perhaps the most famous Twitter handle of all — @realdonaldtrump. As president, Trump used his account to hire and fire government officials, butt heads with North Korea, and block his critics, a practice that one group of lawyers started to question. Guests Jameel Jaffer and Katie Fallow — executive director and senior counsel at the Knight First Amendment Institute, respectively — discuss the Knight Institute's landmark case, Knight v. Trump, establishing that the First Amendment bars public officials from blocking critics from their social media accounts. They are joined by some of the plaintiffs from that lawsuit — comedy writer Nick Jack Pappas, chocolatier and political consultant Holly Figueroa O'Reilly, and sociologist Philip Cohen — who recount their experiences of being blocked (and then unblocked) by Trump. This episode contains strong language.
Air Date 1/3/2022
Today, we take a look at the emerging implementation of facial recognition technology in public and commercial spaces, along with the tracking and "amplifagandizing" capabilities of TikTok.
Be part of the show! Leave us a message or text at 202-999-3991 or email Jay@BestOfTheLeft.com
Transcript
BestOfTheLeft.com/Support (Get AD FREE Shows and Bonus Content)
Join our Discord community!
SHOW NOTES
Ch. 1: Twitter, Facial Recognition and the First Amendment - The Lawfare Podcast - Air Date 4-15-21
This week on Arbiters of Truth, the Lawfare Podcast's miniseries on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Jameel Jaffer and Ramya Krishnan of the Knight First Amendment Institute.
Ch. 2: Addressing the TikTok Threat Part 1 - Your Undivided Attention - Air Date 9-8-22
TikTok, the flagship app of the Chinese company ByteDance, recently surpassed Google and Facebook as the most popular site on the internet in 2021 and is expected to reach more than 1.8 billion users by the end of 2022.
Ch. 3: Privacy, Your Face and the Rise of Facial Recognition - Life Matters, ABC Radio National - https://www.abc.net.au/radionational/programs/lifematters/privacy,-your-face-and-the-rise-of-facial-recognition/13946990
Several large retailers, including Kmart and Bunnings, already use facial recognition technology in their stores – collecting biometric data that is as unique as a fingerprint – but are customers aware of how their facial data is being captured and used?
Ch. 4: NJ Legal Rights & NYPD's Facial Recognition Technology - The Brian Lehrer Show - Air Date 9-30-22
Alexander Shalom, senior supervising attorney and director of Supreme Court advocacy at the ACLU-NJ, talks about the implications of a case in Hudson County, NJ, where a suspect was identified using the NYPD's facial recognition technology.
Ch. 5: The problem with banning TikTok - Vox - Air Date 8-29-20
TikTok's in trouble. But so is the internet as we know it.
Ch. 6: Are You Feeding a Powerful Facial Recognition Algorithm? - NOVA PBS Official - Air Date 4-23-21
Facial recognition technology has great potential to help law enforcement identify suspects. But collecting and storing data from online photos has raised concern among critics.
Ch. 7: Addressing the TikTok Threat Part 2 - Your Undivided Attention - Air Date 9-8-22
Ch. 8: Why Facial Recognition Technology Is So Dangerous - Second Thought - Air Date 7-3-20
Government crackdowns, hyper-personalized ads, real-time location tracking of citizens. Mass surveillance is a growing threat in the modern world. In this episode, we'll take a look at why it's so dangerous.
MEMBERS-ONLY BONUS CLIP(S)
Ch. 9: The Real Danger Of ChatGPT - Nerdwriter1 - Air Date 12-30-22
The Nerdwriter is a series of video essays about art, culture, politics, philosophy and more.
Ch. 10: Hustle / Grind Alpha Bro vs. Random ChatGPT Guy - Andrew Rousso - Air Date 12-13-22
Inside everyone, there are two wolves, but inside me... there are three.
FINAL COMMENTS
Ch. 11: Final comments on the new ad system for the show
MUSIC (Blue Dot Sessions):
Opening Theme: Loving Acoustic Instrumental by John Douglas Orr
Voicemail Music: Low Key Lost Feeling Electro by Alex Stinnent
Activism Music: This Fickle World by Theo Bard (https://theobard.bandcamp.com/track/this-fickle-world)
Closing Music: Upbeat Laid Back Indie Rock by Alex Stinnent
Produced by Jay! Tomlinson
Visit us at BestOfTheLeft.com
Listen Anywhere! BestOfTheLeft.com/Listen
Follow at Twitter.com/BestOfTheLeft
Like at Facebook.com/BestOfTheLeft
Contact me directly at Jay@BestOfTheLeft.com
From March 11, 2021: On this episode of Arbiters of Truth, the Lawfare Podcast's miniseries on disinformation and misinformation, Evelyn Douek and Quinta Jurecic spoke with Genevieve Lakier, an assistant professor at the University of Chicago Law School and a First Amendment expert. It's basically impossible to have a conversation about content moderation without someone crying "First Amendment!" at some point. But the cultural conception of the First Amendment doesn't always match the legal one. Evelyn and Quinta spoke with Genevieve about what First Amendment doctrine actually says, how its history might be quite different from what you think, and what the dynamism of the doctrine over time—and the current composition of the Supreme Court—might suggest about the First Amendment's possible futures for grappling with the internet.
Due to the Veterans Day holiday, our team is taking a break and bringing you a Lawfare Archive episode that we think you'll find timely given some events from the last few weeks. From April 2, 2020: On this episode of the Arbiters of Truth series on disinformation, Evelyn Douek and Quinta Jurecic spoke with Nate Persily, the James B. McClatchy Professor of Law at Stanford Law School. Persily is also a member of the Kofi Annan Commission on Democracy and Elections in the Digital Age, which recently released a report on election integrity and the internet, for which Nate provided a framing paper. Alongside his work on internet governance, Nate is also an expert on election law and administration. They spoke about the commission report and the challenges the internet may pose for democracy, to what extent the pandemic has flipped that on its head, and, of course, the 2020 presidential election.
From April 8, 2021: If you're listening to this podcast, the odds are that you've heard a lot about QAnon recently—and you might even have read some alarming reporting about how belief in the conspiracy theory is on the rise. But is it really? This week on Arbiters of Truth, the Lawfare Podcast's miniseries on our online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Joseph Uscinski, an associate professor of political science at the University of Miami who studies conspiracy theories. He explained why conspiracy theories in America aren't actually at a new apex, what kinds of people are drawn to ideas like QAnon, and what role—if any—social media platforms like Facebook and Twitter should have in limiting the spread of conspiracy theories.
From September 3, 2020: This week on Lawfare's Arbiters of Truth series on disinformation, Evelyn Douek and Quinta Jurecic spoke with Alissa Starzak, the head of public policy at Cloudflare—a company that provides key components of the infrastructure that helps websites stay online. They talked about two high-profile incidents in which Cloudflare decided to pull its services from websites publishing or hosting extremist, violent content. In August 2017, after the white nationalist rally in Charlottesville, Cloudflare's CEO Matthew Prince announced that he would no longer be providing service to the neo-Nazi website the Daily Stormer. Two years later, Cloudflare also pulled service from the forum 8chan after the forum was linked to a string of violent attacks. They talked about what Cloudflare actually does and why blocking a website from using its services has such a big effect. They also discussed how Cloudflare—which isn't a social media platform like Facebook or Twitter—thinks about its role in deciding what content should and shouldn't stay up.
From September 24, 2020: Evelyn Douek and Quinta Jurecic spoke to Nina Jankowicz, a disinformation fellow at the Wilson Center, about her new book: "How to Lose the Information War: Russia, Fake News, and the Future of Conflict." The book chronicles Nina's journey around Europe, tracing how information operations spearheaded by Russia have played out in countries in the former Soviet bloc, from Georgia to the Czech Republic. What do these case studies reveal about disinformation and how best to counter it—and how many of these lessons can be extrapolated to the United States? How should we understand the role of locals who get swept up in information operations, like the Americans who attended rallies in 2016 that were organized by a Russian troll farm? And what is an information war, anyway?
You've likely heard that Elon Musk wanted to buy Twitter… and that he is now trying to get out of buying Twitter… and that at first he wanted to defeat the bots on Twitter… but now he's apparently surprised that there are lots of bots on Twitter. It's a spectacle made for the headlines, but it's also, at its core, a regular old corporate law dispute. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek spoke with Adriana Robertson, the Donald N. Pritzker Professor of Business Law at the University of Chicago Law School, to talk about the legal issues behind the headlines. What is the Delaware Court of Chancery in which Musk and Twitter are going to face off? Will it care at all about the bots? And how do corporate lawyers think and talk about this differently from how it gets talked about in most of the public conversation about it?
When the Supreme Court handed down its opinion in Dobbs v. Jackson Women's Health Organization, overturning Roe v. Wade, the impact of the decision on the internet may not have been front of mind for most people thinking through the implications. But in the weeks after the Court's decision, it's become clear that the post-Dobbs legal landscape around abortion implicates many questions around not only data and digital privacy, but also online speech. One piece of model state legislation, for example, would criminalize "hosting or maintaining a website, or providing internet service, that encourages or facilitates efforts to obtain an illegal abortion." This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Evan Greer, the director of the digital rights organization Fight for the Future. She recently wrote an article in Wired with Lia Holland arguing that "Section 230 is a Last Line of Defense for Abortion Speech Online." They talked about what role Section 230's protections have to play when it comes to liability for speech about abortion and what content moderation looks like in a post-Dobbs world.
Since the beginning of the pandemic, we've talked a lot on this show about how falsehoods about the coronavirus are spread and generated. For this episode, Evelyn Douek and Quinta Jurecic spoke with two emergency medicine physicians who have seen the practical effects of those falsehoods while treating patients over the last two years. Nick Sawyer and Taylor Nichols are two of the cofounders of the organization No License for Disinformation, a group that advocates for medical authorities to take disciplinary action against doctors spreading misinformation and disinformation about COVID-19. They argue that state medical boards, which grant physicians the licenses that authorize them to practice medicine, could play a more aggressive role in curbing falsehoods. How many doctors have been disciplined, and why do Nick and Taylor believe that state medical boards have fallen down on the job? What are the possibilities for more aggressive action—and how does the First Amendment limit those possibilities? And how much good can the threat of discipline do in curbing medical misinformation, anyway?
Algorithms! We hear a lot about them. They drive social media platforms and, according to popular understanding, are responsible for a great deal of what's wrong with the internet today—and maybe the downfall of democracy itself. But … what exactly are algorithms? And, given they're not going away, what should they be designed to do? Evelyn Douek and Quinta Jurecic spoke with Jonathan Stray, a senior scientist at the Berkeley Center for Human-Compatible AI and someone who has thought a lot about what we mean when we say the word "algorithm"—and also when we discuss things like "engagement" and "amplification." He helped them pin down a more precise understanding of what those terms mean and why that precision is so important in crafting good technology policy. They also talked about what role social media algorithms do and don't play in stoking political polarization, and how they might be designed to decrease polarization instead. If you're interested, you can read the Senate testimony by Dean Eckles on algorithms that Jonathan mentions during the show. We also mentioned this article by Daniel Kreiss on polarization.
The House committee investigating the Jan. 6 insurrection is midway through a blockbuster series of hearings exploring Donald Trump's efforts to overturn the 2020 election and disrupt the peaceful transfer of power. Central to those efforts, of course, was the Big Lie—the false notion that Trump was cheated out of victory in 2020. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Kate Starbird, an associate professor of Human Centered Design & Engineering at the University of Washington—and repeat Arbiters of Truth guest. Kate has come on the show before to talk about misinformation and Jan. 6, and she and a team of coauthors just released a comprehensive analysis of tweets spreading misinformation around the 2020 election. So she's the perfect person with whom to discuss the Jan. 6 committee hearings and misinformation. What does Kate's research show about how election falsehoods spread, and who spread them? How has, and hasn't, the Jan. 6 committee incorporated the role of misinformation into the story it's telling about the insurrection? And is there any chance the committee can break through and get the truth to the people who most need to hear it?
If you've been watching the hearings convened by the House select committee on Jan. 6, you've seen a great deal about how the Trump campaign generated and spread falsehoods about supposed election fraud in 2020. As the committee has argued, those falsehoods were crucial in generating the political energy that culminated in the explosion of the January 6 insurrection. What shape did those lies take, and how did social media platforms attempt to deal with them at the time? Today, we're bringing you an episode of our Arbiters of Truth series on the online information ecosystem. In fact, we're rebroadcasting an episode we recorded in November 2020 about disinformation and the 2020 election. In late November 2020, after Joe Biden cemented his victory as the next president but while the Trump campaign was still pushing its claims of election fraud online and in court, Evelyn Douek and Quinta Jurecic spoke with Alex Stamos, the director of the Stanford Internet Observatory. Their conversation then was a great overview of the state of election security and the difficulty of countering false claims around the integrity of the vote. It's worth a listen today as the Jan. 6 committee reminds us what the political and media environment was like in the aftermath of the election and how the Trump campaign committed to election lies that still echo all too loudly. And though it's a year and a half later, the problems we're discussing here certainly haven't gone away.
If you loaded up the internet or turned on the television somewhere in the United States over the last two months, it's been impossible to avoid news coverage of the defamation trial of actors Johnny Depp and Amber Heard, who sued each other over a dispute relating to Heard's allegations of domestic abuse by Depp. In early June, a Virginia jury found that both had defamed the other. The litigation has received a great deal of coverage for what it might say about the fate of the Me Too movement—but the flood of falsehoods online around the trial raises questions about how useful defamation law can really be in countering lies. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with RonNell Andersen Jones, the Lee E. Teitelbaum Professor of Law at the University of Utah College of Law and an expert on the First Amendment and the interaction between the press and the courts. Along with Lyrissa Lidsky, she's written about defamation law, disinformation, and the Depp-Heard litigation. They talked about why some commentators think defamation could be a useful route to counter falsehoods, why RonNell thinks the celebrity litigation undercuts that argument, and the few cases in which claims of libel or slander really could have an impact in limiting the spread of lies.
On May 31, by a five-four vote, the Supreme Court blocked a Texas law from going into effect that would have sharply limited how social media companies could moderate their platforms and required companies to abide by various transparency requirements. We've covered the law on this show before—we recorded an episode right after the U.S. Court of Appeals for the Fifth Circuit allowed Texas to implement the law, in the same ruling that the Supreme Court just vacated. But there's enough interesting stuff in the Supreme Court's order—and in Justice Samuel Alito's dissent—that we thought it was worth another bite at the apple. So this week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic invited Genevieve Lakier, professor of law at the University of Chicago and Evelyn's colleague at the Knight First Amendment Institute, to walk us through just what happened. What exactly did the Supreme Court do? Why does Justice Alito seem to think that the Texas law has a decent chance of surviving a First Amendment challenge? And what does this suggest about the possible futures of the extremely unsettled landscape of First Amendment law?
From October 29, 2020: On this episode of Lawfare's Arbiters of Truth series on disinformation, Evelyn Douek and Quinta Jurecic spoke with Casey Newton, veteran Silicon Valley editor for The Verge who recently went independent to start a newsletter on Substack called Platformer. Few people have followed the stories of platforms and content moderation in recent years as closely and carefully as Casey, so Evelyn and Quinta asked him about what's changed in the last four years—especially in the lead-up to the election. They also spoke about the challenges of reporting on the tech industry and whether the increased willingness of platforms to moderate content means that the name of this podcast series will have to change.
As transparency reporting about content moderation enforcement has become standard across the platform industry, there have been growing questions about the reliability and accuracy of the reports the platforms are producing. With all reporting being entirely voluntary and the content moderation industry in general being very opaque, it's hard to know how much to trust the figures that companies report in their quarterly or biannual enforcement reports. As a result, there have been growing calls for independent audits of these figures, and last month, Meta released its first-ever independent audit of its content moderation reporting systems. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek sat down with someone who actually knows something about auditing: Colleen Honigsberg, an associate professor of law at Stanford Law School, whose research is focused on the empirical study of corporate and securities law. They talked about how auditors work, the promises and pitfalls of auditing in other contexts and what that might teach us about auditing in the content moderation context, and whether this is going to be a useful regulatory tool.
On May 14, a shooter attacked a supermarket in a historically Black neighborhood of Buffalo, New York, killing ten people and wounding three. The streaming platform Twitch quickly disabled the livestream the shooter had published of the attack—but video of the violence, and copies of the white supremacist manifesto released by the attacker online, continue to circulate on the internet. How should we evaluate the response of social media platforms to the tragedy in Buffalo? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Brian Fishman, who formerly worked at Facebook, now Meta, as the policy director for counterterrorism and dangerous organizations. Brian helped lead Facebook's response to the 2019 Christchurch shooting, another act of far-right violence livestreamed online. He walked us through how platforms respond to crises like these, why it's so difficult to remove material like the Buffalo video and manifesto from the internet, and what it would look like for platforms to do better.
On May 12, the U.S. Court of Appeals for the Fifth Circuit allowed an aggressive new Texas law regulating social media to go into effect. The law, known as HB20, seeks to restrict large social media platforms from taking down content on the basis of viewpoint—effectively restricting companies from engaging in a great deal of the content moderation that they currently perform. It also imposes a range of transparency and due process requirements on platforms with respect to their content moderation. A group of technology companies challenging the law have filed an emergency application to the Supreme Court seeking to put HB20 back on hold while they continue to litigate the law's constitutionality under the First Amendment. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Alex Abdo, litigation director at the Knight First Amendment Institute, and Scott Wilkens, senior staff attorney at Knight. The Institute, where Evelyn is a senior research fellow, filed an amicus brief in the Fifth Circuit, taking a middle ground between Texas—which argues that the First Amendment poses no bar to HB20—and the plaintiffs—who argue that the First Amendment prohibits this regulation and many other types of social media regulation besides. So what does the Texas law actually do? Where does the litigation stand—and what will the impact of the Fifth Circuit's ruling be? And how does the Knight First Amendment Institute interpret, well, the First Amendment?
Internet blackouts are on the rise. Since 2016, governments around the world have fully or partially shut down access to the internet almost 1,000 times, according to a tally by the human rights organization Access Now. As the power of the internet grows, this tactic has only become more common as a means of political repression. Why is this, and how, exactly, does a government go about turning off the internet? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke on this topic with Peter Guest, the enterprise editor for the publication Rest of World, which covers technology outside the regions usually described as the West. He's just published a new project with Rest of World diving deep into internet shutdowns—and the three dug into the mechanics of internet blackouts, why they're increasing, and their wide-reaching effects.
While the U.S. Congress has been holding hearing after hearing with tech executives, full of yelling and short on progress, Europe has been quietly working away on some major tech regulations. Last month, it reached agreement on the content moderation piece of this package: the Digital Services Act. It's sweeping in scope and likely to have effects far beyond Europe. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek sat down with Daphne Keller, the director of the Program on Platform Regulation at the Stanford Cyber Policy Center, to get the rundown. What exactly is in the act? What does she like about it, and what doesn't she? And how will the internet look different once it comes into force?
This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek spoke to Charlotte Willner, who has been working in content moderation longer than just about anyone. Charlotte is now the executive director of the Trust and Safety Professionals Association, an organization that brings together the professionals who write and enforce the rules for what's fair game and what's not on online platforms. Before that, she worked in Trust and Safety at Pinterest, and before that she built the very first safety operations team at Facebook. Evelyn asked Charlotte what it was like trying to build a content moderation system from the ground up, what has changed since those early days (spoiler: a lot!) and—of course—whether she had any advice for Twitter's new owner, given all her experience helping keep platforms safe.
This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with a reporter who has carved out a unique beat writing about not just technology but the creativity and peculiarities of the people who use it—Taylor Lorenz, a columnist at the Washington Post covering technology and online culture. Her recent writing includes reporting on "algospeak"—that is, how algorithmic amplification changes how people talk online—and coverage of the viral Twitter account Libs of TikTok, which promotes social media posts of LGBTQ people for right-wing mockery. They talked about the quirks of a culture shaped in conversation with algorithms, the porous border between internet culture and political life in the United States, and what it means to take the influence of social media seriously, for good and for ill.
The internet is increasingly emerging as a source for identification and documentation of war crimes, as the Russian invasion of Ukraine has devastatingly proven yet again. But how does an image of a possible war crime go from social media to before a tribunal in a potential war crimes prosecution? On a recent episode of Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Nick Waters, the lead on Justice and Accountability at Bellingcat, about how open-source investigators go about documenting evidence of atrocity. This week on the show, Evelyn and Quinta interviewed Alexa Koenig, the executive director of the Human Rights Center at the University of California, Berkeley, and an expert on using digital evidence for justice and accountability. They talked about how international tribunals have adapted to using new forms of evidence derived from the internet, how social media platforms have helped—and hindered—collection of this kind of evidence, and the work Alexa has done to create a playbook for investigators downloading and collecting material documenting atrocities. Because of the nature of the conversation, this discussion contains some descriptions of violence that might be upsetting for some listeners.
We're taking a look back at one of the stranger stories about social media platforms and the role of the press in the last presidential election. In the weeks before the 2020 election, the New York Post published an "October Surprise": a set of stories on the business and personal life of Hunter Biden, the son of Democratic presidential candidate Joe Biden, based on emails contained on a mysterious laptop. A great deal was questionable about the Post's reporting, including to what extent the emails in question were real and how the tabloid had obtained them in the first place. The mainstream press was far more circumspect in reporting out the story—and meanwhile, Twitter and Facebook sharply restricted circulation of the Post's stories on their platforms. It's a year and a half later. And the Washington Post just published a lengthy report verifying the authenticity of some of the emails on the mysterious laptop—though a lot still remains unclear about the incident. In light of this news, how should we understand Facebook and Twitter's actions in 2020? Washington Post technology reporter Will Oremus weighed in on this question in his own reflection for the paper. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic invited him on the show to discuss the story. Did the social media platforms go too far in limiting access to the New York Post's reporting? How did the mainstream press deal with the incident? What have we learned from the failures of how the press and social media responded to information operations around the 2016 election, and what can we learn from how they behaved differently in 2020?
This week on Arbiters of Truth, our series on the online information environment, we're turning our attention to the United Kingdom, where the government has just introduced into Parliament a broad proposal for regulating the internet: the Online Safety Bill. The U.K. government has proclaimed that the Bill represents new "world-first online safety laws" and includes "tougher and quicker criminal sanctions for tech bosses." So … what would it actually do? To answer this question, Evelyn Douek and Quinta Jurecic spoke with Ellen Judson, a senior researcher at the Centre for the Analysis of Social Media at Demos, a U.K. think tank. Ellen has been closely tracking the legislation as it has developed. And she helped walk us through the tangled system of regulations created by the bill. What new obligations does the Online Safety Bill create, and which companies would those obligations apply to? Why is the answer to so many questions "yet to be defined"—a phrase we kept saying again and again throughout the show—and how much of the legislation is just punting the really difficult questions for another day? What happens now that the bill has been formally introduced in Parliament?
Lawmakers around the world want to do something about social media, and in particular content moderation. But what if the interventions they are developing are based on a flawed conceptual framework about how content moderation works, or how it should work? This week I had a chance to talk to one of the smartest legal minds on questions related to content moderation to explore some fresh thinking on the subject: evelyn douek (https://www.evelyndouek.com/), a doctoral candidate at Harvard Law School and senior research fellow at the Knight First Amendment Institute at Columbia University. evelyn is the author of "Content Moderation as Administration" (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4005326), forthcoming in the Harvard Law Review, a new paper that serves as the basis for our discussion.
From March 4, 2021: On this episode of Arbiters of Truth, the Lawfare Podcast's miniseries on disinformation and misinformation, Evelyn Douek and Quinta Jurecic spoke with Emily Bell, the founding director of the Tow Center for Digital Journalism at Columbia Journalism School. Emily testified before Congress last week about the role of legacy media, and cable news in particular, in spreading disinformation, but she's also one of the keenest observers of the online news ecosystem and knows a lot about it from her days as director of digital content for The Guardian. They talked about the relationship between online and offline media in spreading disinformation, the role different institutions need to play in fixing what's broken, and whether all the talk about "fighting misinformation" is a bit of a red herring.
Over the last few weeks, we've talked a lot about the war in Ukraine on this series—how the Russian, Ukrainian and American governments are leveraging information as part of the conflict; how tech platforms are navigating the flood of information coming out of Ukraine and the crackdown from the Kremlin; and how open-source investigators are documenting the war. This week on Arbiters of Truth, our series on the online information environment, we're going to talk about getting information into Russia during a period of rapidly increasing repression by the Russian government. Evelyn Douek and Quinta Jurecic spoke with Thomas Kent, a former president of the U.S. government-funded media organization Radio Free Europe/Radio Liberty, who now teaches at Columbia University. He recently wrote an essay published by the Center for European Policy Analysis on "How to Reach Russian Ears," suggesting creative ways that reporters, civil society and even the U.S. government might approach communicating the truth about the war in Ukraine to Russians. This was a thoughtful and nuanced conversation about a tricky topic—whether, and how, democracies should think about leveraging information as a tool against repressive governments, and how to distinguish journalism from such strategic efforts.
Open-source investigations—sometimes referred to as OSINT, or open-source intelligence—have been crucial to public understanding of the Russian invasion of Ukraine. An enormous number of researchers have devoted their time to sifting through social media posts, satellite images, and even Google Maps to track what's happening in Ukraine and debunk false claims about the conflict. This week on Arbiters of Truth, our series on the online information ecosystem, we devoted the show to understanding how open-source investigations work and why they're important. Evelyn Douek and Quinta Jurecic spoke to Nick Waters, the lead on Justice and Accountability at Bellingcat, one of the most prominent groups devoted to conducting these types of investigations. They talked about the crucial role played by open-source investigators in documenting the conflict in Syria—well before the war in Ukraine—and how the field has developed since its origins in the Arab Spring and the start of the Syrian Civil War. And Nick walked us through the mechanics of how open-source investigations actually happen, and how social media platforms have helped—and hindered—that work.
As Russia's brutal war in Ukraine continues, tech platforms like Facebook and Twitter have been key geopolitical players in the conflict. The Kremlin has banned those platforms and others as part of a sharp clampdown on freedoms within Russia. Meanwhile, these companies must decide what to do with state-funded Russian propaganda outlets like RT and Sputnik that have accounts on their platforms—and how best to moderate the flood of information, some of it gruesome or untrue, that's appearing as users share material about the war. This week on Arbiters of Truth, our podcast series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Alex Stamos, director of the Stanford Internet Observatory. They discussed how various platforms, from Twitter to TikTok and Telegram, are moderating the content coming out of Russia and Ukraine right now; the costs and benefits of Western companies pulling operations out of Russia during a period of increasing crackdown; and how the events of the last few weeks might shape our thinking about the nature and power of information operations.