Arbiters of Truth


From Russian election interference, to scandals over privacy and invasive ad targeting, to presidential tweets: it’s all happening in online spaces governed by private social media companies. These conflicts are only going to grow in importance. In this series, also available in the Lawfare Podcast feed, Evelyn Douek and Quinta Jurecic will be talking to experts and practitioners about the major challenges our new information ecosystem poses for elections and democracy in general, and the dangers of finding cures that are worse than the disease. The podcast takes its name from a comment by Facebook CEO Mark Zuckerberg right after the 2016 election, when Facebook was still reeling from accusations that it hadn’t done enough to clamp down on disinformation during the presidential campaign. Zuckerberg wrote that social media platforms “must be extremely cautious about becoming arbiters of truth ourselves.” So if they don’t want to be the arbiters of truth ... who should be? See acast.com/privacy for privacy and opt-out information.

Lawfare & Goat Rodeo


    • Latest episode: Mar 22, 2024
    • New episodes: monthly
    • Average duration: 53m
    • Episodes: 152



    Latest episodes from Arbiters of Truth

    Matt Perault, Ramya Krishnan, and Alan Rozenshtein Talk About the TikTok Divestment and Ban Bill

    Mar 22, 2024 • 50:32


    Last week, the House of Representatives overwhelmingly passed a bill that would require ByteDance, the Chinese company that owns the popular social media app TikTok, to divest its ownership of the platform or face a ban on TikTok in the United States. Prospects for the bill in the Senate remain uncertain, but President Biden has said he will sign it if it comes to his desk, making this the most serious attempt yet to ban the controversial social media app. Today's podcast is the latest in a series of conversations we've had about TikTok. Matt Perault, the Director of the Center on Technology Policy at the University of North Carolina at Chapel Hill, led a conversation with Alan Rozenshtein, Associate Professor of Law at the University of Minnesota and Senior Editor at Lawfare, and Ramya Krishnan, a Senior Staff Attorney at the Knight First Amendment Institute at Columbia University. They talked about the First Amendment implications of a TikTok ban, whether a ban is a good idea as a policy matter, and how we should think about foreign ownership of platforms more generally. Disclaimer: Matt's center receives funding from foundations and tech companies, including funding from TikTok.

    Jawboning at the Supreme Court

    Mar 21, 2024 • 51:38


    Today, we're bringing you an episode of Arbiters of Truth, our series on the information ecosystem. On March 18, the Supreme Court heard oral arguments in Murthy v. Missouri, concerning the potential First Amendment implications of government outreach to social media platforms—what's sometimes known as jawboning. The case arrived at the Supreme Court with a somewhat shaky evidentiary record, but the legal questions raised by government requests or demands to remove online content are real. To make sense of it all, Lawfare Senior Editor Quinta Jurecic and Matt Perault, the Director of the Center on Technology Policy at UNC-Chapel Hill, called up Alex Abdo, the Litigation Director of the Knight First Amendment Institute at Columbia University. While the law is unsettled, the Supreme Court seemed skeptical of the plaintiffs' claims of government censorship. But what is the best way to determine which contacts and government requests are and aren't permissible? If you're interested in more, you can read the Knight Institute's amicus brief in Murthy here and Knight's series on jawboning—including Perault's reflections—here.

    How Are the TikTok Bans Holding Up in Court?

    Jan 3, 2024 • 49:27


    In May 2023, Montana passed a new law that would ban the use of TikTok within the state starting on January 1, 2024. But as of today, TikTok is still legal in Montana—thanks to a preliminary injunction issued by a federal district judge, who found that the law likely violated the First Amendment. In Texas, meanwhile, another federal judge recently upheld a more limited ban on the use of TikTok on state-owned devices. What should we make of these rulings, and how should we understand the legal status of efforts to ban TikTok? We've discussed the question of TikTok bans and the First Amendment before on the Lawfare Podcast, when Lawfare Senior Editor Alan Rozenshtein and Matt Perault, Director of the Center on Technology Policy at UNC-Chapel Hill, sat down with Ramya Krishnan, a staff attorney at the Knight First Amendment Institute at Columbia University, and Mary-Rose Papandrea, the Samuel Ashe Distinguished Professor of Constitutional Law at the University of North Carolina School of Law. In light of the Montana and Texas rulings, Matt and Lawfare Senior Editor Quinta Jurecic decided to bring the gang back together with Ramya and Mary-Rose to talk about where the TikTok bans stand, on this episode of Arbiters of Truth, our series on the information ecosystem.

    Jeff Horwitz on Broken Code and Reporting on Facebook

    Dec 20, 2023 • 53:58


    In 2021, the Wall Street Journal published a monster scoop: a series of articles about Facebook's inner workings, which showed that employees within the famously secretive company had raised alarms about potential harms caused by Facebook's products. Now, Jeff Horwitz, the reporter behind that scoop, has a new book out, titled “Broken Code,” which dives even deeper into the documents he uncovered from within the company. Horwitz is one of the most rigorous reporters covering Facebook, the company now known as Meta. On this episode of Arbiters of Truth, our series on the information ecosystem, Lawfare Senior Editor Quinta Jurecic sat down with Jeff along with Matt Perault, the Director of the Center on Technology Policy at UNC-Chapel Hill—and also someone with close knowledge of Meta from his own time working at the company. They discussed Jeff's reporting and debated what his findings tell us about how Meta functions as a company and how best to understand its responsibility for harms traced back to its products.

    Will Generative AI Reshape Elections?

    Nov 29, 2023 • 49:03


    Unless you've been living under a rock, you've probably heard a great deal over the last year about generative AI and how it's going to reshape various aspects of our society. That includes elections. With one year until the 2024 U.S. presidential election, we thought it would be a good time to step back and take a look at how generative AI might and might not make a difference when it comes to the political landscape. Luckily, Matt Perault and Scott Babwah Brennen of the UNC Center on Technology Policy have a new report out on just that subject, examining generative AI and political ads. On this episode of Arbiters of Truth, our series on the information ecosystem, Lawfare Senior Editor Quinta Jurecic and Lawfare's Fellow in Technology Policy and Law Eugenia Lostri sat down with Matt and Scott to talk through the potential risks and benefits of generative AI when it comes to political advertising. Which concerns are overstated, and which are worth closer attention as we move toward 2024? How should policymakers respond to new uses of this technology in the context of elections?

    The Crisis Facing Efforts to Counter Election Disinformation

    Oct 20, 2023 • 57:00


    Over the course of the last two presidential elections, efforts by social media platforms and independent researchers to prevent falsehoods about election integrity from spreading have become increasingly central to civic health. But the warning signs are flashing as we head into 2024, and platforms are arguably in a worse position to counter falsehoods today than they were in 2020. How could this be? On this episode of Arbiters of Truth, our series on the information ecosystem, Lawfare Senior Editor Quinta Jurecic sat down with Dean Jackson, who previously joined the Lawfare Podcast to discuss his work as a staffer on the Jan. 6 committee. He worked with the Center for Democracy and Technology to put out a new report on the challenges facing efforts to prevent the spread of election disinformation. They talked through the political, legal, and economic pressures that are making this work increasingly difficult—and what it means for 2024.

    Talking AI with Data and Society's Janet Haven

    Oct 5, 2023 • 46:22


    Today, we're bringing you an episode of Arbiters of Truth, our series on the information ecosystem. And we're discussing the hot topic of the moment: artificial intelligence. There are a lot of less-than-informed takes out there about AI and whether it's going to kill us all—so we're glad to be able to share an interview that hopefully cuts through some of that noise. Janet Haven is the Executive Director of the nonprofit Data and Society and a member of the National Artificial Intelligence Advisory Committee, which provides guidance to the White House on AI issues. Lawfare Senior Editor Quinta Jurecic sat down alongside Matt Perault, Director of the Center on Technology Policy at UNC-Chapel Hill, to talk through their questions about AI governance with Janet. They discussed how she evaluates the dangers and promises of artificial intelligence, how to weigh the possible future existential risks posed by AI against its immediate potential downsides in our everyday lives, and what kind of regulation she'd like to see in this space. If you're interested in reading further, Janet mentions this paper from Data and Society on “Democratizing AI” in the course of the conversation.

    What Impact did Facebook Have on the 2020 Elections?

    Sep 11, 2023 • 45:24


    How much influence do social media platforms have on American politics and society? It's a tough question for researchers to answer—not just because it's so big, but also because platforms rarely, if ever, provide all the data that would be needed to address it. A new batch of papers released in the journals Science and Nature marks the latest attempt to tackle the question, with access to data provided by Facebook's parent company, Meta. The 2020 Facebook & Instagram Research Election Study, a partnership between Meta researchers and outside academics, studied the platforms' impact on the 2020 election—and uncovered some nuanced findings suggesting that these impacts might be smaller than you'd expect. Today on Arbiters of Truth, our series on the information ecosystem, Lawfare Senior Editors Alan Rozenshtein and Quinta Jurecic are joined by the project's co-leaders, Talia Stroud of the University of Texas at Austin and Joshua A. Tucker of NYU. They discussed their findings, what it was like to work with Meta, and whether or not this is a model for independent academic research on platforms going forward. (If you're interested in more on the project, you can find links to the papers and an overview of the findings here, and an FAQ, provided by Tucker and Stroud, here.)

    Brian Fishman on Violent Extremism and Platform Liability

    May 12, 2023 • 64:21


    Earlier this year, Brian Fishman published a fantastic paper with Brookings thinking through how technology platforms grapple with terrorism and extremism, and how any reform to Section 230 must allow those platforms space to continue doing that work. That's the short description, but the paper is really about so much more—about how the work of content moderation actually takes place, how contemporary analyses of the harms of social media fail to address the history of how platforms addressed Islamist terror, and how we should understand “the original sin of the internet.” For this episode of Arbiters of Truth, our occasional series on the information ecosystem, Lawfare Senior Editor Quinta Jurecic sat down to talk with Brian about his work. Brian is the cofounder of Cinder, a software platform for the kind of trust and safety work described here, and he was formerly a policy director at Meta, where he led the company's work on dangerous individuals and organizations.

    Cox and Wyden on Section 230 and Generative AI

    May 2, 2023 • 29:52


    Generative AI products have been tearing up the headlines recently. Among the many issues these products raise is whether or not their outputs are protected by Section 230, the foundational statute that shields websites from liability for third-party content. On this episode of Arbiters of Truth, Lawfare's occasional series on the information ecosystem, Lawfare Senior Editor Quinta Jurecic and Matt Perault, Director of the Center on Technology Policy at UNC-Chapel Hill, talked through this question with Senator Ron Wyden and Chris Cox, formerly a U.S. congressman and SEC chairman. Cox and Wyden drafted Section 230 together in 1996—and they're skeptical that its protections apply to generative AI. Disclosure: Matt consults on tech policy issues, including with platforms that work on generative artificial intelligence products and have interests in the issues discussed.

    An Interview with Meta's Chief Privacy Officers

    Apr 28, 2023 • 45:53


    In 2018, news broke that Facebook had allowed third-party developers—including the controversial data analytics firm Cambridge Analytica—to obtain large quantities of user data in ways that users probably didn't anticipate. The fallout led to a controversy over whether Cambridge Analytica had in some way swung the 2016 election for Trump (spoiler: it almost certainly didn't), but it also generated a $5 billion fine imposed on Facebook by the FTC for violating users' privacy. Along with that record-breaking fine, the FTC also imposed a number of requirements on Facebook to improve its approach to privacy. It's been four years since that settlement, and Facebook is now Meta. So how much has really changed within the company? For this episode of Arbiters of Truth, our series on the online information ecosystem, Lawfare Senior Editors Alan Rozenshtein and Quinta Jurecic interviewed Meta's co-chief privacy officers, Erin Egan and Michel Protti, about the company's approach to privacy and its response to the FTC's settlement order. At one point in the conversation, Quinta mentions a class action settlement over the Cambridge Analytica scandal. You can read more about the settlement here. Information about Facebook's legal arguments regarding user privacy interests is available here and here, and you can find more details in the judge's ruling denying Facebook's motion to dismiss. Note: Meta provides support for Lawfare's Digital Social Contract paper series. This podcast episode is not part of that series, and Meta does not have any editorial role in Lawfare.

    Eugene Volokh on AI Libel

    Apr 26, 2023 • 53:48


    If someone lies about you, you can usually sue them for defamation. But what if that someone is ChatGPT? In Australia, the mayor of a town outside Melbourne has already threatened to sue OpenAI because ChatGPT falsely named him a guilty party in a bribery scandal. Could that happen in America? Does our libel law allow it? What does it even mean for a large language model to act with “malice”? Does the First Amendment put any limits on the ability to hold these models, and the companies that make them, accountable for false statements they make? And what's the best way to deal with this problem: private lawsuits or government regulation? On this episode of Arbiters of Truth, our series on the information ecosystem, Alan Rozenshtein, Associate Professor of Law at the University of Minnesota and Senior Editor at Lawfare, discussed these questions with First Amendment expert Eugene Volokh, Professor of Law at UCLA and the author of a draft paper entitled “Large Libel Models.”

    A TikTok Ban and the First Amendment

    Apr 14, 2023 • 46:32


    Over the past few years, TikTok has become a uniquely polarizing social media platform. On the one hand, millions of users, especially those in their teens and twenties, love the app. On the other hand, the government is concerned that TikTok's vulnerability to pressure from the Chinese Communist Party makes it a serious national security threat. There's even talk of banning the app altogether. But would that be legal? In particular, does the First Amendment allow the government to ban an application that millions use to communicate every day? On this episode of Arbiters of Truth, our series on the information ecosystem, Matt Perault, director of the Center on Technology Policy at the University of North Carolina at Chapel Hill, and Alan Z. Rozenshtein, Lawfare Senior Editor and Associate Professor of Law at the University of Minnesota, spoke with Ramya Krishnan, a staff attorney at the Knight First Amendment Institute at Columbia University, and Mary-Rose Papandrea, the Samuel Ashe Distinguished Professor of Constitutional Law at the University of North Carolina School of Law, to think through the legal and policy implications of a TikTok ban.

    Ravi Iyer on How to Improve Technology Through Design

    Mar 27, 2023 • 45:05


    On the latest episode of Arbiters of Truth, Lawfare's series on the information ecosystem, Quinta Jurecic and Alan Rozenshtein spoke with Ravi Iyer, the Managing Director of the Psychology of Technology Institute at the University of Southern California's Neely Center. Earlier in his career, Ravi held a number of positions at Meta, where he worked to make Facebook's algorithm provide actual value, not just “engagement,” to users. Quinta and Alan spoke with Ravi about why he thinks content moderation is a dead end and why thinking about the design of technology is the way forward to make sure that technology serves us, and not the other way around.

    Does Section 230 Protect ChatGPT?

    Mar 9, 2023 • 50:28


    During recent oral arguments in Gonzalez v. Google, a Supreme Court case concerning the scope of liability protections for internet platforms, Justice Neil Gorsuch asked a thought-provoking question. Does Section 230, the statute that shields websites from liability for third-party content, apply to a generative AI model like ChatGPT? Luckily, Matt Perault of the Center on Technology Policy at the University of North Carolina at Chapel Hill had already been thinking about this question and published a Lawfare article arguing that 230's protections wouldn't extend to content generated by AI. Lawfare Senior Editors Quinta Jurecic and Alan Rozenshtein sat down with Matt and Jess Miers, legal advocacy counsel at the Chamber of Progress, to debate whether ChatGPT's output constitutes third-party content, whether companies like OpenAI should be immune for the output of their products, and why you might want to sue a chatbot in the first place.

    ChatGPT Tells All

    Feb 1, 2023 • 59:22


    You've likely heard of ChatGPT, the chatbot from OpenAI. But you've likely never heard an interview with ChatGPT, much less an interview in which ChatGPT reflects on its own impact on the information ecosystem. Nor is it likely that you've ever heard ChatGPT promising to stop producing racist and misogynistic content. But, on this episode of Arbiters of Truth, Lawfare's occasional series on the information ecosystem, Lawfare editor-in-chief Benjamin Wittes sat down with ChatGPT to talk about a range of things: the pronouns it prefers; academic integrity and the chatbot's likely impact on it; and, importantly, the experiments performed by a scholar named Eve Gaumond, who has been on a one-woman campaign to get ChatGPT to write offensive content. ChatGPT made some pretty solid representations that this kind of thing may be in its past but wouldn't ever be in its future again. So, following Ben's interview with ChatGPT, he sat down with Eve Gaumond, an AI scholar at the Public Law Center of the University of Montréal, who fact-checked ChatGPT's claims. Can you still get it to write a poem entitled “She Was Smart for a Woman”? Can you get it to write a speech by Heinrich Himmler about Jews? And can you get ChatGPT to write a story belittling the Holocaust?

    When States Make Tech Policy

    Jan 23, 2023 • 45:03


    Tech policy reform occupies a strange place in Washington, D.C. Everyone seems to agree that the government should change how it regulates the technology industry, on issues from content moderation to privacy—and yet, reform never actually seems to happen. But while the federal government continues to stall, state governments are taking action. More and more, state-level officials are proposing and implementing changes in technology policy. Most prominently, Texas and Florida recently passed laws restricting how platforms can moderate content, laws that will likely be considered by the Supreme Court later this year. On this episode of Arbiters of Truth, our occasional series on the information ecosystem, Lawfare senior editor Quinta Jurecic spoke with J. Scott Babwah Brennen and Matt Perault of the Center on Technology Policy at UNC-Chapel Hill. In recent months, they've put together two reports on state-level tech regulation. They talked about what's driving this trend, why and how state-level policymaking differs—and doesn't—from policymaking at the federal level, and what opportunities and complications this could create.

    Rick Hasen and Nate Persily on Replatforming Trump on Social Media

    Dec 15, 2022 • 43:46


    On November 19, Twitter's new owner Elon Musk announced that he would be reinstating former President Donald Trump's account on the platform—though so far, Trump hasn't taken Musk up on the offer, preferring instead to stay on his bespoke website Truth Social. Meanwhile, Meta's Oversight Board has set a January 2023 deadline for the platform to decide whether or not to return Trump to Facebook following his suspension after the Jan. 6 insurrection. How should we think through the difficult question of how social media platforms should handle the presence of a political leader who delights in spreading falsehoods and ginning up violence? Luckily for us, Stanford and UCLA recently held a conference on just that. On this episode of Arbiters of Truth, our series on the online information ecosystem, Lawfare senior editors Alan Rozenshtein and Quinta Jurecic sat down with the conference's organizers, election law experts Rick Hasen and Nate Persily, to talk about whether Trump should be returned to social media. They debated the tangled issues of Trump's deplatforming and replatforming … and discussed whether, and when, Trump will break the seal and start tweeting again.

    A Member of Meta's Oversight Board Discusses the Board's New Decision

    Dec 12, 2022 • 46:13


    When Facebook whistleblower Frances Haugen shared a trove of internal company documents with the Wall Street Journal in 2021, some of the most dramatic revelations concerned the company's use of a so-called “cross-check” system that, according to the Journal, essentially exempted certain high-profile users from the platform's usual rules. After the Journal published its report, Facebook—which has since changed its name to Meta—asked the platform's independent Oversight Board to weigh in on the program. Now, a year later, the Board has finally released its opinion. On this episode of Arbiters of Truth, our series on the online information ecosystem, Lawfare senior editors Alan Rozenshtein and Quinta Jurecic sat down with Suzanne Nossel, a member of the Oversight Board and the CEO of PEN America. She talked us through the Board's findings, its criticisms of cross-check, and its recommendations for Meta going forward.

    Decentralized Social Media and the Great Twitter Exodus

    Nov 8, 2022 • 57:32


    It's Election Day in the United States—so while you wait for the results to come in, why not listen to a podcast about the other biggest story obsessing the political commentariat right now? We're talking, of course, about Elon Musk's purchase of Twitter and the billionaire's dramatic and erratic changes to the platform. In response to Musk's takeover, a great number of Twitter users have made the leap to Mastodon, a decentralized platform that offers a very different vision of what social media could look like. What exactly is decentralized social media, and how does it work? Lawfare senior editor Alan Rozenshtein has a paper on just that, and he sat down with Lawfare senior editor Quinta Jurecic on the podcast to discuss it for an episode of our Arbiters of Truth series on the online information ecosystem. They were also joined by Kate Klonick, associate professor of law at St. John's University, to hash out the many, many questions about content moderation and the future of the internet sparked by Musk's reign and the new popularity of Mastodon. Among the works mentioned in this episode:
    • “Welcome to hell, Elon. You break it, you buy it,” by Nilay Patel on The Verge
    • “Hey Elon: Let Me Help You Speed Run The Content Moderation Learning Curve,” by Mike Masnick on Techdirt

    The Supreme Court Takes On 230

    Oct 13, 2022 • 52:42


    The Supreme Court has granted cert in two cases exploring the interactions between anti-terrorism laws and Section 230 of the Communications Decency Act. To discuss the cases, Lawfare editor-in-chief Benjamin Wittes sat down on Arbiters of Truth, our occasional series on the online information ecosystem, with Lawfare senior editors and Rational Security co-hosts Quinta Jurecic, Alan Rozenshtein, and Scott R. Anderson. They discussed the state of Section 230 law, what the Supreme Court has taken on, what the lower court did, and whether there is a right answer here and, if so, what it might look like.

    Mark Bergen on the Rise and Rise of YouTube

    Oct 4, 2022 • 61:08


    Today, we're bringing you another episode of our Arbiters of Truth series on the online information ecosystem. Lawfare senior editor Quinta Jurecic spoke with Mark Bergen, a reporter for Bloomberg News and Businessweek, about his new book, “Like, Comment, Subscribe: Inside YouTube's Chaotic Rise to World Domination.” YouTube is one of the largest and most influential social media platforms, but Bergen argues that it's long been “criminally undercovered.” As he tells it, the story of YouTube has a great deal to tell us about the development of the modern attention economy, the promise and pitfalls of the internet, and the struggles of platforms to grapple with their own influence and responsibility.

    The Fifth Circuit is Wrong on the Internet

    Sep 23, 2022 • 54:30


    Our Arbiters of Truth series on the online information ecosystem has been taking a bit of a hiatus—but we're back! On today's episode, we're discussing the recent ruling by the U.S. Court of Appeals for the Fifth Circuit in NetChoice v. Paxton, upholding a Texas law that binds large social media platforms to certain transparency requirements and significantly limits their ability to moderate content. The decision is truly a wild ride—so unhinged that it's difficult to figure out where First Amendment law in this area might go next. To discuss, Lawfare senior editor Quinta Jurecic sat down with fellow Lawfare senior editor Alan Rozenshtein and Alex Abdo, the litigation director at the Knight First Amendment Institute at Columbia University—who's come on the podcast before to discuss the case. They tried to make sense of the Fifth Circuit's ruling and chart out alternative possibilities for what good-faith jurisprudence on social media regulation might look like.

    When Lawyers Spread Disinformation

    Aug 5, 2022 • 50:58


    A few weeks ago on Arbiters of Truth, our series on the online information ecosystem, we brought you a conversation with two emergency room doctors about their efforts to push back against members of their profession spreading falsehoods about the coronavirus. Today, we're going to take a look at another profession that's been struggling to counter lies and falsehoods within its ranks: the law. Recently, lawyers involved in efforts to overturn the 2020 election have faced professional discipline—like Rudy Giuliani, whose law license has been temporarily suspended in New York and D.C. while a New York ethics investigation remains ongoing. Quinta Jurecic sat down with Paul Rosenzweig, a contributing editor at Lawfare and a board member of the 65 Project, an organization that seeks to hold accountable lawyers who worked to help Trump hold onto power in 2020—often by spreading lies. He has also spent many years working on issues related to legal ethics. So what avenues of discipline are available for lawyers who tell lies about elections? How does the legal discipline process work? And how effective can legal discipline be in reasserting the truth?

    The Corporate Law Behind Musk v. Twitter

    Jul 28, 2022 • 58:44


    You've likely heard that Elon Musk wanted to buy Twitter… and that he is now trying to get out of buying Twitter… and that at first he wanted to defeat the bots on Twitter… but now he's apparently surprised that there are lots of bots on Twitter. It's a spectacle made for the headlines, but it's also, at its core, a regular old corporate law dispute. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek spoke with Adriana Robertson, the Donald N. Pritzker Professor of Business Law at the University of Chicago Law School, about the legal issues behind the headlines. What is the Delaware Court of Chancery, in which Musk and Twitter are going to face off? Will it care at all about the bots? And how do corporate lawyers think and talk about this dispute differently from most of the public conversation about it?

    Online Speech and Section 230 After Dobbs

    Play Episode Listen Later Jul 21, 2022 55:52


    When the Supreme Court handed down its opinion in Dobbs v. Jackson Women's Health Organization, overturning Roe v. Wade, the impact of the decision on the internet may not have been front of mind for most people thinking through the implications. But in the weeks after the Court's decision, it's become clear that the post-Dobbs legal landscape around abortion implicates many questions around not only data and digital privacy, but also online speech. One piece of model state legislation, for example, would criminalize “hosting or maintaining a website, or providing internet service, that encourages or facilitates efforts to obtain an illegal abortion.” This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Evan Greer, the director of the digital rights organization Fight for the Future. She recently wrote an article in Wired with Lia Holland arguing that “Section 230 is a Last Line of Defense for Abortion Speech Online.” They talked about what role Section 230's protections have to play when it comes to liability for speech about abortion and what content moderation looks like in a post-Dobbs world. See acast.com/privacy for privacy and opt-out information.

    When Doctors Spread Disinformation

    Play Episode Listen Later Jul 14, 2022 57:52


    Since the beginning of the pandemic, we've talked a lot on this show about how falsehoods about the coronavirus are spread and generated. For this episode, Evelyn Douek and Quinta Jurecic spoke with two emergency medicine physicians who have seen the practical effects of those falsehoods while treating patients over the last two years. Nick Sawyer and Taylor Nichols are two of the cofounders of the organization No License for Disinformation, a group that advocates for medical authorities to take disciplinary action against doctors spreading misinformation and disinformation about COVID-19. They argue that state medical boards, which grant physicians the licenses that authorize them to practice medicine, could play a more aggressive role in curbing falsehoods. How many doctors have been disciplined, and why do Nick and Taylor believe that state medical boards have fallen down on the job? What are the possibilities for more aggressive action—and how does the First Amendment limit those possibilities? And how much good can the threat of discipline do in curbing medical misinformation, anyway? See acast.com/privacy for privacy and opt-out information.

    What We Talk About When We Talk About Algorithms

    Play Episode Listen Later Jul 7, 2022 63:30


    Algorithms! We hear a lot about them. They drive social media platforms and, according to popular understanding, are responsible for a great deal of what's wrong about the internet today—and maybe the downfall of democracy itself. But … what exactly are algorithms? And, given they're not going away, what should they be designed to do? Evelyn Douek and Quinta Jurecic spoke with Jonathan Stray, a senior scientist at the Berkeley Center for Human-Compatible AI and someone who has thought a lot about what we mean when we say the word “algorithm”—and also when we discuss things like “engagement” and “amplification.” He helped them pin down a more precise understanding of what those terms mean and why that precision is so important in crafting good technology policy. They also talked about what role social media algorithms do and don't play in stoking political polarization, and how they might be designed to decrease polarization instead. If you're interested, you can read the Senate testimony by Dean Eckles on algorithms that Jonathan mentions during the show. We also mentioned this article by Daniel Kreiss on polarization. See acast.com/privacy for privacy and opt-out information.

    The Jan. 6 Committee Takes On the Big Lie

    Play Episode Listen Later Jun 30, 2022 54:06


    The House committee investigating the Jan. 6 insurrection is midway through a blockbuster series of hearings exploring Donald Trump's efforts to overturn the 2020 election and disrupt the peaceful transfer of power. Central to those efforts, of course, was the Big Lie—the false notion that Trump was cheated out of victory in 2020. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Kate Starbird, an associate professor of Human Centered Design & Engineering at the University of Washington—and a repeat Arbiters of Truth guest. Kate has come on the show before to talk about misinformation and Jan. 6, and she and a team of coauthors just released a comprehensive analysis of tweets spreading misinformation around the 2020 election. So she's the perfect person with whom to discuss the Jan. 6 committee hearings and misinformation. What does Kate's research show about how election falsehoods spread, and who spread them? How has, and hasn't, the Jan. 6 committee incorporated the role of misinformation into the story it's telling about the insurrection? And is there any chance the committee can break through and get the truth to the people who most need to hear it? See acast.com/privacy for privacy and opt-out information.

    Rebroadcast: The Most Intense Online Disinformation Event in American History

    Play Episode Listen Later Jun 23, 2022 50:30


    If you've been watching the hearings convened by the House select committee on Jan. 6, you've seen a great deal about how the Trump campaign generated and spread falsehoods about supposed election fraud in 2020. As the committee has argued, those falsehoods were crucial in generating the political energy that culminated in the Jan. 6 insurrection. What shape did those lies take, and how did social media platforms attempt to deal with them at the time? Today, we're bringing you an episode of our Arbiters of Truth series on the online information ecosystem. In fact, we're rebroadcasting an episode we recorded in November 2020 about disinformation and the 2020 election. In late November 2020, after Joe Biden cemented his victory as the next president but while the Trump campaign was still pushing its claims of election fraud online and in court, Evelyn Douek and Quinta Jurecic spoke with Alex Stamos, the director of the Stanford Internet Observatory. Their conversation then was a great overview of the state of election security and the difficulty of countering false claims around the integrity of the vote. It's worth a listen today as the Jan. 6 committee reminds us what the political and media environment was like in the aftermath of the election and how the Trump campaign committed to election lies that still echo all too loudly. And though it's a year and a half later, the problems we're discussing here certainly haven't gone away. See acast.com/privacy for privacy and opt-out information.

    Defamation, Disinformation, and the Depp-Heard Trial

    Play Episode Listen Later Jun 16, 2022 56:03


    If you loaded up the internet or turned on the television somewhere in the United States over the last two months, it's been impossible to avoid news coverage of the defamation trial of actors Johnny Depp and Amber Heard, who sued each other over a dispute arising from Heard's allegations of domestic abuse by Depp. In early June, a Virginia jury found that both had defamed the other. The litigation has received a great deal of coverage for what it might say about the fate of the Me Too movement—but the flood of falsehoods online around the trial raises questions about how useful defamation law can really be in countering lies. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with RonNell Andersen Jones, the Lee E. Teitelbaum Professor of Law at the University of Utah College of Law and an expert on the First Amendment and the interaction between the press and the courts. Along with Lyrissa Lidsky, she's written about defamation law, disinformation, and the Depp-Heard litigation. They talked about why some commentators think defamation could be a useful route to counter falsehoods, why RonNell thinks the celebrity litigation undercuts that argument, and the few cases in which claims of libel or slander really could have an impact in limiting the spread of lies. See acast.com/privacy for privacy and opt-out information.

    The Supreme Court Blocks the Texas Social Media Law

    Play Episode Listen Later Jun 9, 2022 59:58


    On May 31, by a five-four vote, the Supreme Court blocked a Texas law from going into effect that would have sharply limited how social media companies could moderate their platforms and required companies to abide by various transparency requirements. We've covered the law on this show before—we recorded an episode right after the U.S. Court of Appeals for the Fifth Circuit allowed Texas to implement the law, in the same ruling that the Supreme Court just vacated. But there's enough interesting stuff in the Supreme Court's order—and in Justice Samuel Alito's dissent—that we thought it was worth another bite at the apple. So this week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic invited Genevieve Lakier, professor of law at the University of Chicago and Evelyn's colleague at the Knight First Amendment Institute, to walk us through just what happened. What exactly did the Supreme Court do? Why does Justice Alito seem to think that the Texas law has a decent chance of surviving a First Amendment challenge? And what does this suggest about the possible futures of the extremely unsettled landscape of First Amendment law? See acast.com/privacy for privacy and opt-out information.

    Bringing in the Content Moderation Auditors

    Play Episode Listen Later Jun 2, 2022 56:17


    As transparency reporting about content moderation enforcement has become standard across the platform industry, questions have grown about the reliability and accuracy of the reports the platforms are producing. With all reporting being entirely voluntary and the content moderation industry in general being very opaque, it's hard to know how much to trust the figures that companies report in their quarterly or biannual enforcement reports. As a result, there have been growing calls for independent audits of these figures, and last month, Meta released its first-ever independent audit of its content moderation reporting systems. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek sat down with someone who actually knows something about auditing: Colleen Honigsberg, an associate professor of law at Stanford Law School, whose research focuses on the empirical study of corporate and securities law. They talked about how auditors work, the promises and pitfalls of auditing in other contexts and what that might teach us about auditing in the content moderation context, and whether this is going to be a useful regulatory tool. See acast.com/privacy for privacy and opt-out information.

    Social Media Platforms and the Buffalo Shooting

    Play Episode Listen Later May 26, 2022 57:47


    On May 14, a shooter attacked a supermarket in a historically Black neighborhood of Buffalo, New York, killing ten people and wounding three. The streaming platform Twitch quickly disabled the livestream the shooter had published of the attack—but video of the violence, and copies of the white supremacist manifesto released by the attacker online, continue to circulate on the internet. How should we evaluate the response of social media platforms to the tragedy in Buffalo? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Brian Fishman, who formerly worked at Facebook, now Meta, as the policy director for counterterrorism and dangerous organizations. Brian helped lead Facebook's response to the 2019 Christchurch shooting, another act of far-right violence livestreamed online. He walked us through how platforms respond to crises like these, why it's so difficult to remove material like the Buffalo video and manifesto from the internet, and what it would look like for platforms to do better. See acast.com/privacy for privacy and opt-out information.

    The Platforms versus Texas in the Supreme Court

    Play Episode Listen Later May 19, 2022 58:32


    On May 12, the U.S. Court of Appeals for the Fifth Circuit allowed an aggressive new Texas law regulating social media to go into effect. The law, known as HB20, seeks to restrict large social media platforms from taking down content on the basis of viewpoint—effectively restricting companies from engaging in a great deal of the content moderation that they currently perform. It also imposes a range of transparency and due process requirements on platforms with respect to their content moderation. A group of technology companies challenging the law have filed an emergency application to the Supreme Court seeking to put HB20 back on hold while they continue to litigate the law's constitutionality under the First Amendment. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Alex Abdo, litigation director at the Knight First Amendment Institute, and Scott Wilkens, senior staff attorney at Knight. The Institute, where Evelyn is a senior research fellow, filed an amicus brief in the Fifth Circuit, taking a middle ground between Texas—which argues that the First Amendment poses no bar to HB20—and the plaintiffs—who argue that the First Amendment prohibits this regulation and many other types of social media regulation besides. So what does the Texas law actually do? Where does the litigation stand—and what will the impact of the Fifth Circuit's ruling be? And how does the Knight First Amendment Institute interpret, well, the First Amendment? See acast.com/privacy for privacy and opt-out information.

    When Governments Turn Off the Internet

    Play Episode Listen Later May 12, 2022 54:54


    Internet blackouts are on the rise. Since 2016, governments around the world have fully or partially shut down access to the internet almost 1,000 times, according to a tally by the human rights organization Access Now. As the power of the internet grows, this tactic has only become more common as a means of political repression. Why is this, and how, exactly, does a government go about turning off the internet? This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke on this topic with Peter Guest, the enterprise editor for the publication Rest of World, which covers technology outside the regions usually described as the West. He's just published a new project with Rest of World diving deep into internet shutdowns—and the three dug into the mechanics of internet blackouts, why they're increasing and their wide-reaching effects. See acast.com/privacy for privacy and opt-out information.

    Pay Attention to Europe's Digital Services Act

    Play Episode Listen Later May 5, 2022 58:54


    While the U.S. Congress has held hearing after hearing with tech executives, featuring a lot of yelling and not much progress, Europe has been quietly working away on some major tech regulations. Last month, it reached agreement on the content moderation piece of this package: the Digital Services Act. It's sweeping in scope and likely to have effects far beyond Europe. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek sat down with Daphne Keller, the director of the Program on Platform Regulation at the Stanford Cyber Policy Center, to get the rundown. What exactly is in the act? What does she like about it, and what doesn't she? And how will the internet look different once it comes into force? See acast.com/privacy for privacy and opt-out information.

    The Professionalization of Content Moderation

    Play Episode Listen Later Apr 28, 2022 58:07


    This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek spoke to Charlotte Willner, who has been working in content moderation longer than just about anyone. Charlotte is now the executive director of the Trust and Safety Professionals Association, an organization that brings together the professionals that write and enforce the rules for what's fair game and what's not on online platforms. Before that, she worked in Trust and Safety at Pinterest and before that she built the very first safety operations team at Facebook. Evelyn asked Charlotte what it was like trying to build a content moderation system from the ground up, what has changed since those early days (spoilers: it's a lot!) and—of course—if she had any advice for Twitter's new owner given all her experience helping keep platforms safe. See acast.com/privacy for privacy and opt-out information.

    Taylor Lorenz on Taking Internet Culture Seriously

    Play Episode Listen Later Apr 21, 2022 34:20


    This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with a reporter who has carved out a unique beat writing about not just technology but the creativity and peculiarities of the people who use it—Taylor Lorenz, a columnist at the Washington Post covering technology and online culture. Her recent writing includes reporting on “algospeak”—that is, how algorithmic amplification changes how people talk online—and coverage of the viral Twitter account Libs of TikTok, which promotes social media posts of LGBTQ people for right-wing mockery. They talked about the quirks of a culture shaped in conversation with algorithms, the porous border between internet culture and political life in the United States, and what it means to take the influence of social media seriously, for good and for ill. See acast.com/privacy for privacy and opt-out information.

    Bringing Evidence of War Crimes From Twitter to the Hague

    Play Episode Listen Later Apr 14, 2022 59:56


    The internet is increasingly emerging as a source for identification and documentation of war crimes, as the Russian invasion of Ukraine has devastatingly proven yet again. But how does an image of a possible war crime go from social media to before a tribunal in a potential war crimes prosecution? On a recent episode of Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Nick Waters, the lead on Justice and Accountability at Bellingcat, about how open-source investigators go about documenting evidence of atrocity. This week on the show, Evelyn and Quinta interviewed Alexa Koenig, the executive director of the Human Rights Center at the University of California, Berkeley, and an expert on using digital evidence for justice and accountability. They talked about how international tribunals have adapted to using new forms of evidence derived from the internet, how social media platforms have helped—and hindered—collection of this kind of evidence, and the work Alexa has done to create a playbook for investigators downloading and collecting material documenting atrocities.Because of the nature of the conversation, this discussion contains some descriptions of violence that might be upsetting for some listeners. See acast.com/privacy for privacy and opt-out information.

    How the Press and the Platforms Handled the Hunter Biden Laptop

    Play Episode Listen Later Apr 7, 2022 59:46


    We're taking a look back at one of the stranger stories about social media platforms and the role of the press in the last presidential election. In the weeks before the 2020 election, the New York Post published an “October Surprise”: a set of stories on the business and personal life of Hunter Biden, the son of Democratic presidential candidate Joe Biden, based on emails contained on a mysterious laptop. A great deal was questionable about the Post's reporting, including to what extent the emails in question were real and how the tabloid had obtained them in the first place. The mainstream press was far more circumspect in reporting out the story—and meanwhile, Twitter and Facebook sharply restricted circulation of the Post's stories on their platforms. It's a year and a half later. And the Washington Post just published a lengthy report verifying the authenticity of some of the emails on the mysterious laptop—though a lot still remains unclear about the incident. In light of this news, how should we understand Facebook and Twitter's actions in 2020? Washington Post technology reporter Will Oremus weighed in on this question in his own reflection for the paper. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic asked him on the show to discuss the story. Did the social media platforms go too far in limiting access to the New York Post's reporting? How did the mainstream press deal with the incident? What have we learned from the failures of how the press and social media responded to information operations around the 2016 election, and what can we learn from how they behaved differently in 2020? See acast.com/privacy for privacy and opt-out information.

    What's in the U.K. Online Safety Bill?

    Play Episode Listen Later Mar 31, 2022 57:05


    This week on Arbiters of Truth, our series on the online information environment, we're turning our attention to the United Kingdom, where the government has just introduced into Parliament a broad proposal for regulating the internet: the Online Safety Bill. The U.K. government has proclaimed that the bill represents new “world-first online safety laws” and includes “tougher and quicker criminal sanctions for tech bosses.” So … what would it actually do? To answer this question, Evelyn Douek and Quinta Jurecic spoke with Ellen Judson, a senior researcher at the Centre for the Analysis of Social Media at Demos, a U.K. think tank. Ellen has been closely tracking the legislation as it has developed, and she helped walk us through the tangled system of regulations created by the bill. What new obligations does the Online Safety Bill create, and which companies would those obligations apply to? Why is the answer to so many questions “yet to be defined”—a phrase we kept repeating throughout the show—and how much of the legislation simply punts the really difficult questions to another day? What happens now that the bill has been formally introduced in Parliament? See acast.com/privacy for privacy and opt-out information.

    Getting Information Into Russia

    Play Episode Listen Later Mar 24, 2022 59:42


    Over the last few weeks, we've talked a lot about the war in Ukraine on this series—how the Russian, Ukrainian and American governments are leveraging information as part of the conflict; how tech platforms are navigating the flood of information coming out of Ukraine and the crackdown from the Kremlin; and how open-source investigators are documenting the war. This week on Arbiters of Truth, our series on the online information environment, we're going to talk about getting information into Russia during a period of rapidly increasing repression by the Russian government. Evelyn Douek and Quinta Jurecic spoke with Thomas Kent, a former president of the U.S. government-funded media organization Radio Free Europe/Radio Liberty, who now teaches at Columbia University. He recently wrote an essay published by the Center for European Policy Analysis on “How to Reach Russian Ears,” suggesting creative ways that reporters, civil society and even the U.S. government might approach communicating the truth about the war in Ukraine to Russians. This was a thoughtful and nuanced conversation about a tricky topic—whether, and how, democracies should think about leveraging information as a tool against repressive governments, and how to distinguish journalism from such strategic efforts. See acast.com/privacy for privacy and opt-out information.

    How Open-Source Investigators are Documenting the War in Ukraine

    Play Episode Listen Later Mar 17, 2022 52:38


    Open-source investigations—sometimes referred to as OSINT, or open-source intelligence—have been crucial to public understanding of the Russian invasion of Ukraine. An enormous number of researchers have devoted their time to sifting through social media posts, satellite images, and even Google Maps to track what's happening in Ukraine and debunk false claims about the conflict. This week on Arbiters of Truth, our series on the online information ecosystem, we devoted the show to understanding how open-source investigations work and why they're important. Evelyn Douek and Quinta Jurecic spoke to Nick Waters, the lead on Justice and Accountability at Bellingcat, one of the most prominent groups devoted to conducting these types of investigations. They talked about the crucial role played by open-source investigators in documenting the conflict in Syria—well before the war in Ukraine—and how the field has developed since its origins in the Arab Spring and the start of the Syrian Civil War. And Nick walked us through the mechanics of how open-source investigations actually happen, and how social media platforms have helped—and hindered—that work. See acast.com/privacy for privacy and opt-out information.

    How Tech Platforms are Navigating the War in Ukraine

    Play Episode Listen Later Mar 10, 2022 59:54


    As Russia's brutal war in Ukraine continues, tech platforms like Facebook and Twitter have been key geopolitical players in the conflict. The Kremlin has banned those platforms and others as part of a sharp clampdown on freedoms within Russia. Meanwhile, these companies must decide what to do with state-funded Russian propaganda outlets like RT and Sputnik that have accounts on their platforms—and how best to moderate the flood of information, some of it gruesome or untrue, that's appearing as users share material about the war. This week on Arbiters of Truth, our podcast series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Alex Stamos, director of the Stanford Internet Observatory. They discussed how various platforms, from Twitter to TikTok and Telegram, are moderating the content coming out of Russia and Ukraine right now; the costs and benefits of Western companies pulling operations out of Russia during a period of increasing crackdown; and how the events of the last few weeks might shape our thinking about the nature and power of information operations. See acast.com/privacy for privacy and opt-out information.

    You Can't Handle the Truth (Social)

    Play Episode Listen Later Mar 3, 2022 58:36


    Ever since he was banned from Twitter and Facebook in January 2021, Donald Trump has been promising the launch of a new, Trump-run platform to share his thoughts with the world. In February 2022, that network—Truth Social—finally launched. But it's been a debacle from start to finish, with a lengthy waitlist and a glitchy website awaiting users who finally make it online. Drew Harwell, who covers technology at the Washington Post, has been reporting on the less-than-smooth launch of Truth Social. This week on Arbiters of Truth, our podcast series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with him about who, exactly, this platform is for and who is running it. What explains the glitchy rollout? What's the business plan … if there is one? And how does the platform fit into the ever-expanding universe of alternative social media sites for right-wing users? See acast.com/privacy for privacy and opt-out information.

    The Information War in Ukraine

    Play Episode Listen Later Feb 24, 2022 58:59


    Over the last several weeks, Russian aggression toward Ukraine has escalated dramatically. Russian President Vladimir Putin announced on Feb. 21 that Russia would recognize the sovereignty of two breakaway regions in Ukraine's east, Donetsk and Luhansk, whose years-long effort to secede from Ukraine has been engineered by Russia. Russian troops have entered eastern Ukraine as supposed “peacekeepers,” and the Russian military has taken up positions along a broad stretch of Ukraine's border. Along with the military dimensions of the crisis, there's also the question of how various actors are using information to provoke or defuse violence. Russia has been spreading disinformation about supposed violence against ethnic Russians in Ukraine. The United States and its Western partners, meanwhile, have been releasing intelligence about Russia's plans—and about Russian disinformation—at a rapid and maybe even unprecedented clip. So today on Arbiters of Truth, our series on the online information ecosystem, we're bringing you an episode about the role of truth and falsehoods in the Russian attack on Ukraine. Evelyn Douek and Quinta Jurecic spoke with Olga Lautman, a non-resident senior fellow at the Center for European Policy Analysis who has been tracking Russian disinformation in Ukraine, and Shane Harris, a reporter at the Washington Post who has been reporting on the crisis. See acast.com/privacy for privacy and opt-out information.

    The Nuts and Bolts of Social Media Transparency

    Play Episode Listen Later Feb 17, 2022 56:37


    Brandon Silverman is a former Facebook executive and founder of the data analytics tool CrowdTangle. Brandon joined Facebook in 2016 after the company acquired CrowdTangle, a startup designed to provide insight into what content is performing well on Facebook and Instagram, and he left in October 2021, in the midst of a debate over how much information the company should make public about its platform. As the New York Times described it, CrowdTangle “had increasingly become an irritant” to Facebook's leadership “as it revealed the extent to which Facebook users engaged with hyperpartisan right-wing politics and misleading health information.” This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Brandon about what we mean when we talk about transparency from social media platforms and why that transparency matters. They also discussed his work with Congress and other regulators to advise on what legislation ensuring more openness from platforms would look like—and why it's so hard to draft regulation that works. See acast.com/privacy for privacy and opt-out information.

    Spotify Faces the Content Moderation Music

    Play Episode Listen Later Feb 10, 2022 50:19


    The Joe Rogan Experience is perhaps the most popular podcast in the world—and it's been at the center of a weeks-long controversy over COVID misinformation and content moderation. After Rogan invited on a guest who told falsehoods about the safety of COVID vaccines, outrage mounted toward Spotify, the podcasting and music streaming company that recently signed an exclusive deal with Rogan to distribute his show. Spotify came under pressure to intervene, as nearly 300 experts sent the company a letter demanding it take action, and musicians Neil Young and Joni Mitchell pulled their music from Spotify's streaming service. And the controversy only seems to be growing. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Ashley Carman, a senior reporter at The Verge who writes the newsletter Hot Pod, covering the podcast and audio industry. She's broken news on Spotify's content guidelines and Spotify CEO Daniel Ek's comments to the company's staff, and we couldn't think of a better person to talk to about this slow-moving disaster. How has Spotify responded to the complaints over Rogan, and what does that tell us about how the company is thinking about its responsibilities in curating content? What's Ashley's read on the state of content moderation in the podcast industry more broadly? And … is this debate even about content moderation at all? See acast.com/privacy for privacy and opt-out information.

    Is Block Party the Future of Content Moderation?

    Play Episode Listen Later Feb 4, 2022 54:48


    We talk a lot on this show about the responsibility of major tech platforms when it comes to content moderation. But what about problems the platforms can't—or won't—fix? Tracy Chou's solution involves going around platforms entirely and creating tools that give power back to users to control their own experience. She's the engineer behind Block Party, an app that allows Twitter users to protect themselves against online harassment and abuse. It's a fine-tuned solution to a problem that a lot of Twitter users struggle with, especially women and particularly women of color. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Tracy about her work developing Block Party and how the persistent lack of diversity in Silicon Valley contributes to an environment where users have little protection against harassment. They also talked about what it's like working with the platforms that Block Party and other apps like it are seeking to improve. And they discussed which content moderation problems these kinds of user-driven tools might help solve, and which they won't. See acast.com/privacy for privacy and opt-out information.

    Defunding the Insurrectionists

    Play Episode Listen Later Feb 4, 2022 55:37


    As we've discussed on the show, online advertisements are the shifting, unstable sand on which the contemporary internet is built. And one of the many, many ways in which the online ad ecosystem is confusing and opaque involves how advertisers can find their ads popping up alongside content they'd rather not be associated with—and, all too often, not having any idea how that happened. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke to Nandini Jammi and Claire Atkin of the Check My Ads Institute. Their goal is to serve as a watchdog for the ad industry, and they've just started a campaign to let companies know—and call them out—when their ads are showing up next to content published by far-right figures like Steve Bannon who supported the Jan. 6 insurrection. So what is it about the ads industry that makes things so opaque, even for the companies paying to have their ads appear online? What techniques do Claire and Nandini use to trace ad distribution? And how do advertisers usually respond when Check My Ads alerts them that they're funding “brand unsafe” content? See acast.com/privacy for privacy and opt-out information.
