Podcasts about the Integrity Institute

  • 27 podcasts
  • 56 episodes
  • 47m average episode duration
  • 1 new episode per month
  • Latest episode: Apr 11, 2025

Latest podcast episodes about the Integrity Institute

Wellington Mornings with Nick Mills
Friday Faceoff: Mark Sainsbury and Bryce Edwards talk Treaty Principles Bill, Andrew Little and The Chase NZ

Apr 11, 2025 · 33:04 · Transcription available


David Seymour's Treaty Principles Bill is dead and buried - but was it worth having the conversation as a country? Also, former Labour leader Andrew Little says he is considering running to be Wellington's next mayor. Would he be right for the job? And TVNZ has confirmed production of a four-episode New Zealand version of The Chase. Who should host it? To answer those questions, Integrity Institute director Dr Bryce Edwards and broadcaster Mark Sainsbury joined Nick Mills for Friday Faceoff.

RNZ: Nights
What's behind our distrust of the government and media?

Mar 31, 2025 · 13:16


Bryce Edwards writes The Integrity Institute on Substack and is director of the Democracy Project. He joins Emile Donovan.

Trust in Tech: an Integrity Institute Member Podcast
The Future of Trust & Safety: Navigating Challenges in a Shifting Industry

Feb 12, 2025 · 46:40


In this episode, Alice and Integrity Institute co-founder Jeff Allen discuss the ever-evolving T&S landscape. With recent policy changes at Meta and increasing scrutiny on content moderation, they explore T&S's future, the critical role of integrity professionals, and the business case for safer online spaces. How do trust & safety teams continue to create value in an environment of growing political and company pressure? Listen in for a thoughtful conversation on the commitment to integrity work through turbulent times.
Further reading: The ethical and practical flaws in Meta's policy overhaul, by Alice Hunsberger

In Reality
Battered But Still Hopeful, The Guardians of a Civil Internet with Integrity Institute's Jeff Allen

Dec 19, 2024 · 44:35


Welcome to In Reality, the podcast on truth, disinformation and the media. I'm your host Eric Schurenberg, a former journalist and media exec, now the founder of the Alliance for Trust in Media. At the front lines of the battle for truth in the information ecosystem are the social media platforms' trust and safety teams: the data-science professionals who make sure that social media content conforms to the platforms' standards. It's a finger-in-the-dike kind of task, because of both the volume of content (34 million videos uploaded to TikTok every day, for one example) and the judgment needed to distinguish merely obnoxious content from the truly harmful. And lately, the whole idea has run into significant headwinds: some political, from Republicans who say that trust and safety is just a code word for censorship, and some economic, from platform leaders who have been cutting back their trust and safety teams as cost centers and generally more trouble than they're worth. Today's guest, Jeff Allen, is very much part of this world. Jeff is a former trust and safety executive at Meta, now the founder of the Integrity Institute, which is both a community for trust and safety professionals and an advocacy group for a kinder, gentler social internet. Jeff and I discuss what trust and safety professionals really think about free speech; why Instagram search tends to harm young people and Google's does not; why Mark Zuckerberg doesn't like trust and safety, in Zuck's own words; and where those hoping for an internet that does better at fostering human well-being might find reason for optimism.
Free episode transcripts: www.in-reality.fm
Produced by Tom Platts at Sound Sapien (soundsapien.com)
Alliance for Trust in Media: alliancefortrust.com

Impossible Tradeoffs with Katie Harbath
Debrief on the India and EU Elections Online

Jun 20, 2024 · 38:32


I'll admit, I wasn't expecting to do my first Summer Spectacular episode quite this quickly. However, when Saurabh Shukla with NewsMobile pinged me on WhatsApp to offer to talk about what had happened in the Indian election, I jumped at the chance. I then saw former Integrity Institute resident fellow Alexis Crews post about her work in the EU, and I knew I had to include her as well. In our conversation, we discuss the role of social media and digital platforms in the elections, the use of AI for misinformation and disinformation, the impact of WhatsApp as a messaging app, and the use of influencers in campaigns. We also talk about the lessons learned from the EU elections and the recommendations for tech companies in mitigating disinformation. I hope you enjoy! Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe

Trust in Tech: an Integrity Institute Member Podcast
Workplace ethics and activism with Nadah Feteih

Apr 5, 2024 · 42:53


Many of us working at tech companies are having to make moral and ethical decisions about where we work, what we work on, and what we speak up about. In this episode, we have a conversation with Nadah Feteih about how tech workers (specifically folks working on integrity and trust & safety teams) can speak up about ethical issues at their workplace. We discuss activism from within the industry, compelled identity labor, balancing speaking up and staying silent, thinking ethically in tech, and the limitations and harms of technology.
Takeaways:
• Balancing speaking up and staying silent can be difficult for tech workers, as some topics may be divisive or risky to address.
• Compelled identity labor is a challenge faced by underrepresented and marginalized tech workers, who may feel pressure to speak on behalf of their communities.
• Thinking ethically in tech is crucial, and there is a growing need for resources and education on tech ethics.
• Tech employees have the power to take a stand and advocate for change within their companies.
• Engaging on social issues in the workplace requires a balance between different approaches, including staying within the system and speaking up from the outside.
• Listening to moderators and incorporating local perspectives is crucial for creating inclusive and equitable tech platforms.
Disclaimer: The views stated in this episode are not affiliated with any organization and only represent the views of the individuals.
Mentioned in this episode:
• Breaking the Silence: Marginalized Tech Workers' Experiences and Community Solidarity
• Black in Moderation
• Tech Worker Handbook
• No Tech For Apartheid
• Tech Workers Coalition
Credits: Today's episode was produced, edited, and hosted by Alice Hunsberger. You can reach me and Talha Baig, the other half of the Trust in Tech team, at podcast@integrityinstitute.org. Our music is by Zhao Shen. Special thanks to all the staff at the Integrity Institute.

Heather du Plessis-Allan Drive
Grant Nelson: Integrity Institute spokesperson urges Prime Minister Luxon to 'do the right thing' and investigate businesses over Covid wage subsidies

Apr 2, 2024 · 3:07


A pair of millionaire philanthropists are urging the Prime Minister to honour a commitment and chase businesses up for Covid wage subsidy money. Grant and Marilyn Nelson suspect companies have wrongly held on to millions of dollars, and hope the Government can encourage these companies to pay that money back. Grant Nelson says businesses should be treated like beneficiaries and be made to pay back money they owe.

Best of Business
Grant Nelson: Integrity Institute spokesperson urges Prime Minister Luxon to 'do the right thing' and investigate businesses over Covid wage subsidies

Apr 2, 2024 · 3:16


A pair of millionaire philanthropists are urging the Prime Minister to honour a commitment and chase businesses up for Covid wage subsidy money. Grant and Marilyn Nelson suspect companies have wrongly held on to millions of dollars, and hope the Government can encourage these companies to pay that money back. Grant Nelson says businesses should be treated like beneficiaries and be made to pay back money they owe.

Impossible Tradeoffs with Katie Harbath
Who Voters Trust for Election Information in 2024

Mar 21, 2024 · 38:22


Don't forget you can now also watch these conversations on YouTube! This week, we are diving deep into elections, and specifically where people go to get information on the election. Rachel Orey is the Bipartisan Policy Center's Senior Associate Director, where they are responsible for the organization's election administration policy development, state and federal advocacy efforts, and the BPC Task Force on Elections. Their research focuses on evidence-based and data-driven reforms that meaningfully improve our elections ecosystem. As many of you know, I was a fellow on Rachel's team for nearly three years, and one of my last acts as both a BPC and Integrity Institute fellow was to help get this survey off the ground. We did a similar one in 2022 as well.
Some of the findings include: Most Americans have confidence in the 2024 presidential election. They are more confident that votes in their community and state will be counted accurately than votes across the country.
• A majority of respondents (69%) are confident their votes will be counted accurately in the 2024 election. This includes majorities of Republicans (60% very or somewhat confident), Independents (59%), and Democrats (85%).
• Across all groups, Americans are most confident about an accurate count of votes in their community (74%). Just 64% are confident in an accurate count across the country.
• This difference is most pronounced among Republicans. Only 50% of Republicans express confidence that votes will be counted accurately at the national level compared with 66% at the local level, a gap of 16 percentage points.
• The confidence gap between local and national counting is an opportunity for voter education about how the counting and certification process works at all levels of our election system. While election officials may be doing a good job building confidence in their community, this gap shows the need for national and state media outlets, candidates, and political elites to help voters understand the robust processes and security measures that are present in every state.
Rachel digs into that and more in this week's podcast. Here's the link to the security and integrity protections that make American elections strong, resilient, and trustworthy in every jurisdiction. Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe

Impossible Tradeoffs with Katie Harbath
Democracy Works: Helping voters navigate the first elections in the generative AI era

Mar 7, 2024 · 34:08


We're back! I didn't intend to take two full months off from the podcast, but as many of you know, I started a new job in January as the Chief Global Affairs Officer of Duco Experts, a technology consulting firm. It has been overwhelming, in a good way, but it took me a bit to get started again with the podcast. I've got some exciting guests lined up. I figure we'll do this season through the end of May, and then I'll re-evaluate for the rest of the year. To kick things off, I'm excited to have Luis Lozada, the CEO of Democracy Works. You may not have heard of Democracy Works, but you likely have encountered their work. They do the painstaking work of gathering all the information about where, when, and how to vote from the thousands of election officials across the country to put it in a readable format that companies like Google, TikTok, and Anthropic currently use. I started working with them when I was at Facebook, and we used them to power many of our U.S.-based Election Day reminders. I was invited to join the board while I was at Facebook and have now been a board member for five years. With the explosion of AI, Democracy Works is now helping companies think through the next generation of people getting election information. Luis and I cover that and more in our conversation. Enjoy!
PS: If you are looking for the poll by the Bipartisan Policy Center, Integrity Institute, and States United that we reference, you can find it here.
PPS: We're now on video, too! With Season 2, I've launched an Impossible Tradeoffs YouTube channel if you'd like to watch our conversation rather than listen. Get full access to Anchor Change with Katie Harbath at anchorchange.substack.com/subscribe

Trust in Tech: an Integrity Institute Member Podcast
Careers in T&S: Job search special (you ask, we answer)

Feb 13, 2024 · 31:36


You asked, we answered! It's a rough time out there in the tech industry, as so many people in Trust & Safety are job searching or thinking about their career and what it all means. In this episode, Alice Hunsberger shares her recent job search experience and gives advice on job searching and career development in the Trust and Safety industry.
Listener questions answered include:
• How do I figure out what to do next in my career?
• What helps a resume or cover letter stand out?
• What are good interviewing tips?
• What advice do leaders wish they had when they were first starting out?
• Do T&S leaders really believe we will have an internet free of harm (or at least with drastically reduced harm)?
Resources and links mentioned in this episode:
• Personal Safety for Integrity workers
• Hiring and growing trust & safety teams at small companies
• Katie Harbath's career advice posts
• Alice Links
Disclaimer: The views stated in this episode are not affiliated with any organization and only represent the views of the individuals.
Today's episode was produced, edited, and hosted by Alice Hunsberger. You can reach me and Talha Baig, the other half of the Trust in Tech team, at podcast@integrityinstitute.org. Our music is by Zhao Shen. Special thanks to all the staff at the Integrity Institute.

Trust in Tech: an Integrity Institute Member Podcast
Building the Wikipedia of Integrity w/ Grady Ward

Feb 6, 2024 · 55:46


Integrity workers are missing a shared resource where they can easily point to a taxonomy of harms and specific interventions to mitigate those harms. Enter Grady Ward, a visiting fellow of the Integrity Institute, who discusses how he is creating a Wikipedia for and by integrity workers. In typical Trust in Tech fashion, we also discuss the tensions and synergies between integrity and privacy, and if you stick around to the end, you can hear some musings on the interplay of art and nature.
Links:
• The Wikipedia of Trust and Safety
• Grady's personal website
Disclaimer: The views stated in this episode are not affiliated with any organization and only represent the views of the individuals.
Today's episode was produced, edited, and hosted by Talha Baig. You can reach me and Alice Hunsberger, the other half of the Trust in Tech team, at podcast@integrityinstitute.org. Our music is by Zhao Shen. Special thanks to all the staff at the Integrity Institute.

Trust in Tech: an Integrity Institute Member Podcast
Personal Safety for Integrity workers

Jan 20, 2024 · 34:10


Listen to this episode to learn how to stay safe as an Integrity worker.
Links:
• Tall Poppy (available through employers only at the moment)
• DeleteMe
• PEN America Online Harassment Field Manual
• Assessing Online Threats
• Want a security starter pack? | Surveillance Self-Defense
• Yoel Roth on being targeted: Trump Attacked Me. Then Musk Did. It Wasn't an Accident.
• Crash Override Network: What To Do If Your Employee Is Being Targeted By Online Abuse
Practical tips if you're a manager:
• Train your team on what credible threats look like.
• Make sure you have a plan in place for dealing with threats to your office or employees.
• Allow pseudonyms; don't require public photos.
• Invest in services that can help your employees scrub public data.
Practical tips for individuals:
• Keep your personal social media private/friends-only.
• Use different photos on LinkedIn than on your personal social media.
• Consider hiding your location online, not using your full name, etc.
Credits: Today's episode was produced, edited, and hosted by Alice Hunsberger. You can reach me and Talha Baig, the other half of the Trust in Tech team, at podcast@integrityinstitute.org. Our music is by Zhao Shen. Special thanks to all the staff at the Integrity Institute.

New Books Network
Paul Gowder, "The Networked Leviathan: For Democratic Platforms" (Cambridge UP, 2023)

Jan 17, 2024 · 66:40


Governments and consumers expect internet platform companies to regulate their users to prevent fraud, stop misinformation, and avoid violence. Yet, so far, they've failed to do so. The inability of platforms like Facebook, Google, and Amazon to govern their users has led to stolen elections, refused vaccines, counterfeit N95s in a pandemic, and even genocide. Such failures stem from these companies' inability to manage the complexity of their userbases, products, and their own incentives under the eyes of internal and external constituencies.  In The Networked Leviathan: For Democratic Platforms (Cambridge UP, 2023), Paul Gowder argues that countries should adapt the institutional tools developed in political science for platform governance to democratize major platforms. Democratic institutions allow knowledgeable actors to freely share and apply their understanding of the problems they face while leaders more readily recruit third parties to help manage their decision-making capacity.  This book is also available open access on Cambridge Core. Paul Gowder is Professor of Law and Associate Dean of Research and Intellectual Life at Northwestern University's Pritzker School of Law and a Founding Fellow of the Integrity Institute. He is the author of The Rule of Law in the Real World and The Rule of Law in the United States: An Unfinished Project of Black Liberation. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/new-books-network

New Books in Political Science
Paul Gowder, "The Networked Leviathan: For Democratic Platforms" (Cambridge UP, 2023)

Jan 17, 2024 · 66:40


Governments and consumers expect internet platform companies to regulate their users to prevent fraud, stop misinformation, and avoid violence. Yet, so far, they've failed to do so. The inability of platforms like Facebook, Google, and Amazon to govern their users has led to stolen elections, refused vaccines, counterfeit N95s in a pandemic, and even genocide. Such failures stem from these companies' inability to manage the complexity of their userbases, products, and their own incentives under the eyes of internal and external constituencies.  In The Networked Leviathan: For Democratic Platforms (Cambridge UP, 2023), Paul Gowder argues that countries should adapt the institutional tools developed in political science for platform governance to democratize major platforms. Democratic institutions allow knowledgeable actors to freely share and apply their understanding of the problems they face while leaders more readily recruit third parties to help manage their decision-making capacity.  This book is also available open access on Cambridge Core. Paul Gowder is Professor of Law and Associate Dean of Research and Intellectual Life at Northwestern University's Pritzker School of Law and a Founding Fellow of the Integrity Institute. He is the author of The Rule of Law in the Real World and The Rule of Law in the United States: An Unfinished Project of Black Liberation. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/political-science

New Books in Critical Theory
Paul Gowder, "The Networked Leviathan: For Democratic Platforms" (Cambridge UP, 2023)

Jan 17, 2024 · 66:40


Governments and consumers expect internet platform companies to regulate their users to prevent fraud, stop misinformation, and avoid violence. Yet, so far, they've failed to do so. The inability of platforms like Facebook, Google, and Amazon to govern their users has led to stolen elections, refused vaccines, counterfeit N95s in a pandemic, and even genocide. Such failures stem from these companies' inability to manage the complexity of their userbases, products, and their own incentives under the eyes of internal and external constituencies.  In The Networked Leviathan: For Democratic Platforms (Cambridge UP, 2023), Paul Gowder argues that countries should adapt the institutional tools developed in political science for platform governance to democratize major platforms. Democratic institutions allow knowledgeable actors to freely share and apply their understanding of the problems they face while leaders more readily recruit third parties to help manage their decision-making capacity.  This book is also available open access on Cambridge Core. Paul Gowder is Professor of Law and Associate Dean of Research and Intellectual Life at Northwestern University's Pritzker School of Law and a Founding Fellow of the Integrity Institute. He is the author of The Rule of Law in the Real World and The Rule of Law in the United States: An Unfinished Project of Black Liberation. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/critical-theory

New Books in Public Policy
Paul Gowder, "The Networked Leviathan: For Democratic Platforms" (Cambridge UP, 2023)

Jan 17, 2024 · 66:40


Governments and consumers expect internet platform companies to regulate their users to prevent fraud, stop misinformation, and avoid violence. Yet, so far, they've failed to do so. The inability of platforms like Facebook, Google, and Amazon to govern their users has led to stolen elections, refused vaccines, counterfeit N95s in a pandemic, and even genocide. Such failures stem from these companies' inability to manage the complexity of their userbases, products, and their own incentives under the eyes of internal and external constituencies.  In The Networked Leviathan: For Democratic Platforms (Cambridge UP, 2023), Paul Gowder argues that countries should adapt the institutional tools developed in political science for platform governance to democratize major platforms. Democratic institutions allow knowledgeable actors to freely share and apply their understanding of the problems they face while leaders more readily recruit third parties to help manage their decision-making capacity.  This book is also available open access on Cambridge Core. Paul Gowder is Professor of Law and Associate Dean of Research and Intellectual Life at Northwestern University's Pritzker School of Law and a Founding Fellow of the Integrity Institute. He is the author of The Rule of Law in the Real World and The Rule of Law in the United States: An Unfinished Project of Black Liberation. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/public-policy

New Books in Politics
Paul Gowder, "The Networked Leviathan: For Democratic Platforms" (Cambridge UP, 2023)

Jan 17, 2024 · 66:40


Governments and consumers expect internet platform companies to regulate their users to prevent fraud, stop misinformation, and avoid violence. Yet, so far, they've failed to do so. The inability of platforms like Facebook, Google, and Amazon to govern their users has led to stolen elections, refused vaccines, counterfeit N95s in a pandemic, and even genocide. Such failures stem from these companies' inability to manage the complexity of their userbases, products, and their own incentives under the eyes of internal and external constituencies.  In The Networked Leviathan: For Democratic Platforms (Cambridge UP, 2023), Paul Gowder argues that countries should adapt the institutional tools developed in political science for platform governance to democratize major platforms. Democratic institutions allow knowledgeable actors to freely share and apply their understanding of the problems they face while leaders more readily recruit third parties to help manage their decision-making capacity.  This book is also available open access on Cambridge Core. Paul Gowder is Professor of Law and Associate Dean of Research and Intellectual Life at Northwestern University's Pritzker School of Law and a Founding Fellow of the Integrity Institute. He is the author of The Rule of Law in the Real World and The Rule of Law in the United States: An Unfinished Project of Black Liberation. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/politics-and-polemics

New Books in Communications
Paul Gowder, "The Networked Leviathan: For Democratic Platforms" (Cambridge UP, 2023)

Jan 17, 2024 · 66:40


Governments and consumers expect internet platform companies to regulate their users to prevent fraud, stop misinformation, and avoid violence. Yet, so far, they've failed to do so. The inability of platforms like Facebook, Google, and Amazon to govern their users has led to stolen elections, refused vaccines, counterfeit N95s in a pandemic, and even genocide. Such failures stem from these companies' inability to manage the complexity of their userbases, products, and their own incentives under the eyes of internal and external constituencies.  In The Networked Leviathan: For Democratic Platforms (Cambridge UP, 2023), Paul Gowder argues that countries should adapt the institutional tools developed in political science for platform governance to democratize major platforms. Democratic institutions allow knowledgeable actors to freely share and apply their understanding of the problems they face while leaders more readily recruit third parties to help manage their decision-making capacity.  This book is also available open access on Cambridge Core. Paul Gowder is Professor of Law and Associate Dean of Research and Intellectual Life at Northwestern University's Pritzker School of Law and a Founding Fellow of the Integrity Institute. He is the author of The Rule of Law in the Real World and The Rule of Law in the United States: An Unfinished Project of Black Liberation. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/communications

New Books in Science, Technology, and Society
Paul Gowder, "The Networked Leviathan: For Democratic Platforms" (Cambridge UP, 2023)

Jan 17, 2024 · 66:40


Governments and consumers expect internet platform companies to regulate their users to prevent fraud, stop misinformation, and avoid violence. Yet, so far, they've failed to do so. The inability of platforms like Facebook, Google, and Amazon to govern their users has led to stolen elections, refused vaccines, counterfeit N95s in a pandemic, and even genocide. Such failures stem from these companies' inability to manage the complexity of their userbases, products, and their own incentives under the eyes of internal and external constituencies.  In The Networked Leviathan: For Democratic Platforms (Cambridge UP, 2023), Paul Gowder argues that countries should adapt the institutional tools developed in political science for platform governance to democratize major platforms. Democratic institutions allow knowledgeable actors to freely share and apply their understanding of the problems they face while leaders more readily recruit third parties to help manage their decision-making capacity.  This book is also available open access on Cambridge Core. Paul Gowder is Professor of Law and Associate Dean of Research and Intellectual Life at Northwestern University's Pritzker School of Law and a Founding Fellow of the Integrity Institute. He is the author of The Rule of Law in the Real World and The Rule of Law in the United States: An Unfinished Project of Black Liberation. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/science-technology-and-society

New Books in Law
Paul Gowder, "The Networked Leviathan: For Democratic Platforms" (Cambridge UP, 2023)

Jan 17, 2024 · 66:40


Governments and consumers expect internet platform companies to regulate their users to prevent fraud, stop misinformation, and avoid violence. Yet, so far, they've failed to do so. The inability of platforms like Facebook, Google, and Amazon to govern their users has led to stolen elections, refused vaccines, counterfeit N95s in a pandemic, and even genocide. Such failures stem from these companies' inability to manage the complexity of their userbases, products, and their own incentives under the eyes of internal and external constituencies.  In The Networked Leviathan: For Democratic Platforms (Cambridge UP, 2023), Paul Gowder argues that countries should adapt the institutional tools developed in political science for platform governance to democratize major platforms. Democratic institutions allow knowledgeable actors to freely share and apply their understanding of the problems they face while leaders more readily recruit third parties to help manage their decision-making capacity.  This book is also available open access on Cambridge Core. Paul Gowder is Professor of Law and Associate Dean of Research and Intellectual Life at Northwestern University's Pritzker School of Law and a Founding Fellow of the Integrity Institute. He is the author of The Rule of Law in the Real World and The Rule of Law in the United States: An Unfinished Project of Black Liberation. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/law

New Books in Technology
Paul Gowder, "The Networked Leviathan: For Democratic Platforms" (Cambridge UP, 2023)

Jan 17, 2024 · 66:40


Governments and consumers expect internet platform companies to regulate their users to prevent fraud, stop misinformation, and avoid violence. Yet, so far, they've failed to do so. The inability of platforms like Facebook, Google, and Amazon to govern their users has led to stolen elections, refused vaccines, counterfeit N95s in a pandemic, and even genocide. Such failures stem from these companies' inability to manage the complexity of their userbases, products, and their own incentives under the eyes of internal and external constituencies.  In The Networked Leviathan: For Democratic Platforms (Cambridge UP, 2023), Paul Gowder argues that countries should adapt the institutional tools developed in political science for platform governance to democratize major platforms. Democratic institutions allow knowledgeable actors to freely share and apply their understanding of the problems they face while leaders more readily recruit third parties to help manage their decision-making capacity.  This book is also available open access on Cambridge Core. Paul Gowder is Professor of Law and Associate Dean of Research and Intellectual Life at Northwestern University's Pritzker School of Law and a Founding Fellow of the Integrity Institute. He is the author of The Rule of Law in the Real World and The Rule of Law in the United States: An Unfinished Project of Black Liberation. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/technology

Exchanges: A Cambridge UP Podcast
Paul Gowder, "The Networked Leviathan: For Democratic Platforms" (Cambridge UP, 2023)

Jan 17, 2024 · 66:40


Governments and consumers expect internet platform companies to regulate their users to prevent fraud, stop misinformation, and avoid violence. Yet, so far, they've failed to do so. The inability of platforms like Facebook, Google, and Amazon to govern their users has led to stolen elections, refused vaccines, counterfeit N95s in a pandemic, and even genocide. Such failures stem from these companies' inability to manage the complexity of their userbases, products, and their own incentives under the eyes of internal and external constituencies.  In The Networked Leviathan: For Democratic Platforms (Cambridge UP, 2023), Paul Gowder argues that countries should adapt the institutional tools developed in political science for platform governance to democratize major platforms. Democratic institutions allow knowledgeable actors to freely share and apply their understanding of the problems they face while leaders more readily recruit third parties to help manage their decision-making capacity.  This book is also available open access on Cambridge Core. Paul Gowder is Professor of Law and Associate Dean of Research and Intellectual Life at Northwestern University's Pritzker School of Law and a Founding Fellow of the Integrity Institute. He is the author of The Rule of Law in the Real World and The Rule of Law in the United States: An Unfinished Project of Black Liberation.

Trust in Tech: an Integrity Institute Member Podcast
How to build a Movement w/ David Jay

Dec 15, 2023 · 46:23


It seems every day we are pulled in different directions on social media, yet what we are feeling seldom resonates. Enter David Jay, a master at building movements, including leading movement-building for the Center for Humane Technology. In this episode, we learn precisely how to build a movement, and why communities are perpetually underfunded.
David Jay is an advisor to the Integrity Institute and played a pivotal role in the early days of the Institute. He is currently the founder of Relationality Labs, which hopes to make the impact of relational organizing visible so that organizers can be resourced for the strategic value that they create. In the past, he has had a diverse range of experiences, including founding asexuality.org and serving as chief mobilization officer for the Center for Humane Technology.
Here are some of the questions we answer on today's show:
1. How do you create, scale, and align relationships to create a movement?
2. How do you structure stories so they resonate?
3. How do you keep your nose on the edge of new movements?
4. How do you identify leaders for the future?
5. Why is David Jay excited by the Integrity Institute and the future of integrity workers?
6. Why don't community-based initiatives get funded at the same rate as non-community-based initiatives?
Check out David Jay's Relationality Lab!
Disclaimer: The views in this episode only represent the views of the people involved in the recording of the episode. They do not represent any other entity's views.

Trust in Tech: an Integrity Institute Member Podcast
The Ultimate Guide to Election Integrity Part II

Dec 10, 2023 · 48:03


Elections matter, and history has demonstrated that online platforms will find themselves grappling with these challenges whether they want to or not. The two key questions facing online platforms now, as they stare down the tsunami of global elections heading their way, are: Have they initiated an internal elections integrity program? And if so, how do they ensure the best possible preparation to safeguard democracies globally?
The Integrity Institute launched an elections integrity best practices guide on "Defining and Achieving Success in Elections Integrity." This latest guide extends the first and provides companies, large or small, established or new on the block, concrete details as they fully implement an elections integrity program. Today on the podcast, we talk to four contributors about this guide: Glenn Ellingson, Diane Chang, Swapneel Mehta, and Eric Davis.
Also check out our first episode on elections!

Trust in Tech: an Integrity Institute Member Podcast
How to Find Your Place in Trust & Safety: A Story of Career Pivoting

Sep 27, 2023 · 18:43


Alice Hunsberger talks to Heather Grunkemeier (former Program Owner of Trust & Safety at Rover, and current owner of consultancy firm Twinkle LLC) about how Heather finally broke into the field of Trust & Safety after years of trying, what it was actually like for her, and what her advice is for other people in the midst of career pivots. We also touch on mental health, identity, self-worth, and how working in Trust & Safety has unique challenges (and rewards). If you liked our Burnout episode, you may enjoy this one too. (And if you haven't listened to it yet or read our Burnout resource guide, please check it out.)
Credits: This episode of Trust in Tech was hosted, edited, and produced by Alice Hunsberger. Music by Zhao Shen. Special thanks to the staff and members of the Integrity Institute for their continued support.

The Sunday Show
Paul Gowder on The Networked Leviathan

Sep 3, 2023 · 55:51


One of the problems we come back to again and again on the Tech Policy Press podcast is the problem of how to govern social media platforms. Today's guest is Paul Gowder, Professor of Law and Associate Dean of Research and Intellectual Life at Northwestern University's Pritzker School of Law and a founding fellow of the Integrity Institute. Gowder is the author of The Networked Leviathan: For Democratic Platforms, a book that he says takes an institutional political science approach to the problem of tech platform governance, arguing “that the goals of effective governance capacity development and of global justice” can come together, and that we can build “worldwide direct democratic institutions to exercise public authority over the operations of the big platforms.”

Trust in Tech: an Integrity Institute Member Podcast
Pig Butchering: Not Your Grandma's Romance Scam

Jul 11, 2023 · 53:34


Assaf Kipnis has spent the last decade fighting e-crime and scams. Today, he's on the podcast with fellow Integrity Institute member Alice Hunsberger to tell us about Pig Butchering Scams and Coordinated Inauthentic Behavior, and how they are more sophisticated scams than you might think. If you take away one thing from this, it's this: don't follow investing advice from random people you meet online!
Show links:
• Pig Butchering Scam Victim Journey and Analysis
• The Anatomy of a Pig Butchering Scam
• Fraudology Podcast with Karisse Hendrick
• Pig Butchering Scams Are Evolving Fast | WIRED
• Example of educational guide for users: Grindr Scam awareness guide
• What's the deal with all those weird wrong-number texts?
• I've been getting tons of 'wrong number' spam texts, and I don't hate it? - The Verge
• Facebook shuts down 'the BL'
• Removing Coordinated Inauthentic Behavior From Georgia, Vietnam and the USA
• Former Fox News Executive Divides Americans Using Russian Tactics
• Meta October 2020 Inauthentic Behavior Report

Trust in Tech: an Integrity Institute Member Podcast
Happy Pride! Let's talk about protecting the LGBTQ+ community online

Jun 15, 2023 · 41:26


What can companies do to support the LGBTQ+ community during this pride season, beyond slapping a rainbow logo on everything? Integrity Institute members Alex Leavitt and Alice Hunsberger discuss the state of LGBTQ+ safety online and off, how the queer community is unique and faces disproportionate risks, and what are some concrete actions that platforms should be taking.
Show links:
• Human Rights Campaign declares LGBTQ state of emergency in the US
• Social Media Safety Index
• Digital Civility Index & Our Challenge | Microsoft Online Safety
• Best Practices for Gender-Inclusive Content Moderation — Grindr Blog
• Tinder - travel alert
• Assessing and Mitigating Risk for the Global Grindr Community
• Strengthening our policies to promote safety, security, and well-being on TikTok
• Meta's LGBTQ+ Safety center
• Data collection for queer minorities

Politics Is Everything
Neverending Cat and Mouse: Are Online Companies Prepared for 2024 Elections? ft. Katie Harbath

May 22, 2023 · 30:18


With some 65 elections across 54 countries slated for 2024, how can social media and other online companies prepare? Katie Harbath, Chief Executive of Anchor Change and a fellow at the Bipartisan Policy Center, shares how companies, individuals, and government entities can support election integrity, increase transparency around artificial intelligence, and combat mis- and malinformation. Previously, Katie was a public policy director at Facebook (now Meta) where, over the course of ten years, she was credited with building out and leading a 30-person global team responsible for managing elections. Prior to Facebook, Katie held senior strategic digital roles at the Republican National Committee, the National Republican Senatorial Committee, DCI Group, and multiple campaigns.
Links in this episode:
• Integrity Institute, Elections integrity best practices
• A Brief History of Tech and Elections
• EU Regulatory Framework on AI
• EU Digital Services Act

Tech Policy Grind
AI Integrity with Talha Baig [S4E09]

May 19, 2023 · 40:42


Welcome back! This week, Reema sits down with Talha Baig, co-host of the Trust in Tech podcast and former AI Integrity engineer at Meta. Reema and Talha chat about developments in generative AI and how they affect trust and safety online, and get into the weeds on how AI is applied in the integrity space, and the implications of AI for the integrity space itself. Talha also describes how integrity engineers differ in function from other technical teams, and their relationship with the legal and policy world. They ponder the question: does trust and safety need a regulator? Then, they dig into the Trust in Tech podcast and what Talha is up to at the Integrity Institute.
Resources mentioned in the episode:
• Trust in Tech Podcast
• Ranking by Engagement: Tom Cunningham
• How Streaming Hurt Hollywood Writers: The Daily
• Mushtaq Khan on using institutional economics to predict effective government reforms: 80,000 Hours Podcast
• Happy City: Transforming Our Lives Through Urban Design by Charles Montgomery
• The Price of Peace: Money, Democracy, and the Life of John Maynard Keynes by Zachary D. Carter
Check out the Foundry on Instagram, Twitter, or LinkedIn and subscribe to our newsletter! If you'd like to support the show, donate to the Foundry here or reach out to us at foundrypodcasts@ilpfoundry.us. Thanks for listening, and stay tuned for our next episode. And get ready for Foundry Trivia on June 12 in Washington D.C.!
DISCLAIMER: Reema engages with the Foundry voluntarily and in her personal capacity. The views and opinions expressed on air are not reflective of the organizations Reema is affiliated with.

Trust in Tech: an Integrity Institute Member Podcast
The Ultimate Guide to Election Integrity! with Katie Harbath and Glenn Ellingson

May 17, 2023 · 85:29


It might be May 2023, but it's never too early to start worrying about elections! 2024 is slated to be the biggest year of elections in platform history. In this episode, Katie Harbath and Glenn Ellingson join the show to prepare you for the storm of elections coming in 2024.
You may recognize Katie as the inaugural guest of Trust in Tech. Katie is an Integrity Institute Fellow and a global leader at the intersection of elections, democracy, and technology. She is Chief Executive of Anchor Change, where she helps clients think through tech policy issues. Before that, she worked at Meta for 10 years, where she built and led a 30-person team managing elections. Glenn is an Integrity Institute member who was previously an engineering manager for Meta's civic integrity team and, before that, Head of Product Engineering for Hustle, a company which helped progressive political organizations and other nonprofit and for-profit groups forge personal relationships at scale.
Glenn and Katie led the development of the Elections Best Practices deck the Integrity Institute just shared on its website, which we discuss in the episode. We also answer some of the following questions:
• How do you prioritize different elections across the world?
• What principles should you adhere to when working on election integrity?
• What are the challenges of dealing with political harassment?
• How do you map out the landscape of election integrity work?
• What was Cambridge Analytica, and did the scandal actually make platforms less transparent?
• And how can your company learn best practices and responsibly deal with elections?
Links:
• Election Integrity best practices deck
• Anchor Change
• A Brief History of Tech and Elections: A 26-Year Journey
• Demystifying the Cambridge Analytica Scandal Five Years Later
Disclaimer: The views in this episode only represent the views of the people involved in the recording of the episode. They do not represent Meta's or any other entity's views.

Trust in Tech: an Integrity Institute Member Podcast
GPT4: Eldritch abomination or intern? A discussion with OpenAI

May 4, 2023 · 78:15


OpenAI, creators of ChatGPT, join the show! In November 2022, ChatGPT upended the tech (and larger) world with a chatbot that passes not only the Turing test, but the bar exam. In this episode, we talk with Dave Willner and Todor Markov, integrity professionals at OpenAI, about how they make large language models safer for all. Dave Willner is the Head of Trust and Safety at OpenAI. He previously was Head of Community Policy at both Airbnb and Facebook, where he built the teams that wrote the community guidelines and oversaw the internal policies to enforce them. Todor Markov is a deep learning researcher at OpenAI. He builds content moderation tools for ChatGPT and GPT4. He graduated from Stanford with a Master's in Statistics and a Bachelor's in Symbolic Systems. Alice Hunsberger hosts the episode. She is the VP of Customer Experience at Grindr, where she leads customer support, insights, and trust and safety. Previously, she worked at OKCupid as Director & Global Head of Customer Experience. Sahar Massachi is a visiting host today. He is the co-founder and Executive Director of the Integrity Institute. A past fellow of the Berkman Klein Center, Sahar is currently an advisory committee member for the Louis D. Brandeis Legacy Fund for Social Justice, a StartingBloc fellow, and a Roddenbery Fellow.
They discuss what content moderation looks like for ChatGPT, why T&S stands for Tradeoffs and Sadness, and how integrity workers can help OpenAI. They also chat about the red-teaming process for GPT4, overlaps between platform integrity and AI integrity, their favorite GPT jailbreaks, and how moderating GPTs is basically like teaching an Eldritch Abomination.
Disclaimer: The views in this episode only represent the views of the people involved in the recording of the episode. They do not represent Meta's or any other entity's views.

The Technically Human Podcast
Instituting Integrity: The rise of the integrity worker collective

Apr 28, 2023 · 66:59


Today I'm sitting down with Talha Baig to talk about an organization that is new to me: the Integrity Institute. On the show, I've spent a lot of time talking about what I see as a new workforce emerging in the tech sector, of people working in jobs to try to understand, assess, and mitigate some of the harms caused by technologies. That's why I was excited to learn about the Integrity Institute, a cohort of engineers, product managers, researchers, analysts, data scientists, operations specialists, policy experts, and more, who are coming together to leverage their combined experience and their understanding of the systemic causes of problems on the social internet to help mitigate these problems. They want to bring this experience and expertise directly to the people theorizing, building, and governing the social internet. So I wanted to talk to Talha, who hosts the Trust in Tech podcast out of the institute, about the concept, the function, and the future of integrity work.
Talha Baig is an expert on using machine learning to address platform integrity issues. He has spent three years as a machine learning engineer reducing human, drug, and weapons trafficking on Facebook Marketplace. He has insider knowledge of how platforms use AI for both good and bad, and shares his thoughts on his new podcast Trust in Tech, where he has in-depth conversations about the social internet with other platform integrity workers. They discuss the intersections between the internet, society, culture, and philosophy, with the goal of helping individuals, societies, and democracies to thrive.

Together Digital Power Lounge
Reclaim the Fairytale, Stepping out of idealism and into Individualism | Katie Harbath, CEO Anchor Change | Power Lounge S2E7

Mar 22, 2023 · 62:55


As women, we've long been sold the story of idealism over individualism. Tech superstar Katie Harbath wants to help you change that.
THIS WEEK'S TOPIC: From fairytales to fables, young girls are told time and time again that they are to be wooed and rescued. But at 42, a tech democracy advocate is choosing to live her life on her terms and celebrating the life she has built for herself in ways more women should. Take a listen to be inspired and learn how.
THIS WEEK'S GUEST: Katie Harbath is a global leader at the intersection of elections, democracy, and technology. As the chief executive of Anchor Change, she helps clients think through tech policy issues. She is a senior advisor for technology and democracy at the International Republican Institute, a fellow at the Bipartisan Policy Center and the Integrity Institute, and a nonresident fellow at the Atlantic Council. Previously, Katie spent 10 years at Facebook. As a director of public policy, she built and led global teams that managed elections and helped government and political figures use the social network to connect with their constituents. Before Facebook, Katie held senior digital roles at the Republican National Committee, the National Republican Senatorial Committee, and the DCI Group, as well as multiple campaigns for office. She is a board member at the National Conference on Citizenship, Democracy Works, and the Center for Journalism Ethics at the University of Wisconsin-Madison.
Follow Katie on LinkedIn: https://www.linkedin.com/in/harbath/
Learn more about Katie's company: https://www.anchorchange.com/
Sign up for Katie's weekly newsletters: https://anchorchange.substack.com/
Sponsored by COhatch: COhatch is a new kind of shared work, social, and family space built on community. Members get access to workspace, amenities like rock walls and sports simulators, and more to live a fully integrated life that balances work, family, well-being, community, and giving back. COhatch has 31 locations open or under construction nationwide throughout Ohio, Indiana, Florida, Pennsylvania, North Carolina, Georgia, and Tennessee. Visit www.cohatch.com for more information.
Support the show

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 12: Deepfakes, Biases and AI Hegemony with Claire Boine

Mar 18, 2023 · 66:18


Deepfakes have gained steam on video platforms including TikTok and Reels. For example, we hear Obama, Trump, and Biden ranking their favorite rappers and even playing Dungeons & Dragons. Does this technology have potentially harmful effects?
This episode features Claire Boine, an expert in AI law, in conversation with Integrity Institute member Talha Baig, a machine learning (ML) engineer. Claire is a PhD candidate in AI Law at the University of Ottawa, and a Research Associate at the Artificial and Natural Intelligence Toulouse Institute and in the Accountable AI in a Global Context Research Chair at UOttawa. Claire also runs a nonprofit organization whose goal is to help senior professionals motivated by evidence and reason transition into high-impact fields, including AI.
We discuss how deepfakes present an asymmetrical power dynamic and some mitigations we can put in place, including data trusts: a collective arrangement that puts data back in the hands of users. We also ponder the use of simulacra to replace dead actors and discuss whether we can resurrect dead philosophers by the use of deep learning. Towards the end of the episode, we surmise how chatbots develop bias, and even discuss if AI is sentient and whether that matters.
Disclaimer: The views in this episode only represent the views of the people involved in the recording of the episode. They do not represent Meta's or any other entity's views.
Links:
• Sabelo Mhlambi: From Rationality to Relationality: Ubuntu as an Ethical and Human Rights Framework for Artificial Intelligence Governance [link]
• Kevin Roose: Bing's A.I. Chat: 'I Want to Be Alive.

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 11: The Impact of Social Media on the Past and Present: History, Hate, and Techno-imperialism

Mar 9, 2023 · 60:05


This episode features Jason Steinhauer, author of "History Disrupted: How Social Media and the World Wide Web Have Changed the Past", and Integrity Institute member Theodora Skeadas, a public policy professional with 10 years of experience at the intersection of technology, society, and safety. Theo has worked at Twitter and Booz Allen Hamilton, and is currently president of Harvard W3D: Women in Defense, Diplomacy, and Development.
In recent years, social media has been a breeding ground for disinformation, hate speech, and the spread of harmful ideologies. Jason argues that social media has birthed a new genre of historical communication that he calls "e-history," a user-centric, instantly gratifying version of history that often avoids the true complexity of the past. Theo retorts that social media and Wikipedia are non-gate-kept institutions that have allowed for the democratization of history, so both the winners and the losers write the past.
Disclaimer: The views in this episode only represent the views of the people involved in the recording of the episode. They do not represent Meta's or any other entity's views.
Links:
• Jason's book: History, Disrupted: How Social Media and the World Wide Web Have Changed the Past
• Jason's substack: History Club
• Harvard's W3D: Women in Defense, Diplomacy and Development newsletter: Threo
• All Tech is Human: website
Credits: Produced by Talha Baig. Music by Zhao Shen. Special thanks to Rachel, Sean, Cass and Sahar for their continued support.

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 10: Counter-terrorism on Tech Platforms w/ GIFCT Director of Technology Tom Thorley

Trust in Tech: an Integrity Institute Member Podcast

Play Episode Listen Later Mar 2, 2023 61:32


Welcome to the Trust in Tech podcast, a project by the Integrity Institute, a community-driven think tank that advances the theory and practice of protecting the social internet, powered by our community of integrity professionals.

In this episode, Institute member Talha Baig is in conversation with Tom Thorley of the Global Internet Forum to Counter Terrorism (GIFCT). The Forum was established to foster technical collaboration among member companies, advance relevant research, and share knowledge with smaller platforms. Tom Thorley is the Director of Technology at GIFCT, where he delivers cross-platform technical solutions for GIFCT members. He worked for over a decade at the British government's signals intelligence agency, GCHQ, specializing in issues at the nexus of technology and human behavior. A passionate advocate for responsible technology, Tom is a board member of the SF Bay Area Internet Society Chapter, a mentor with All Tech Is Human and Coding It Forward, and a volunteer with America On Tech and DataKind.

Tom and Talha discuss the historical context behind the founding of GIFCT, the difficulties of cross-platform content moderation, and fighting terrorism over encrypted networks while maintaining human rights.

As a reminder, the views stated in this episode are not affiliated with any organization and only represent the views of the individuals. We hope you enjoy the show.

Credits:
If you enjoyed today's conversation, please share this episode with your friends so we can continue making episodes like this.
Today's episode was produced by Talha Baig
Music by Zhao Shen
Special thanks to Sahar, Cass, Rachel, and Sean for their continued support

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 9: Positioning Generative AI to Empower Artists

Trust in Tech: an Integrity Institute Member Podcast

Play Episode Listen Later Feb 22, 2023 0:46


Welcome to the Trust in Tech podcast, a project by the Integrity Institute, a community-driven think tank that advances the theory and practice of protecting the social internet, powered by our community of integrity professionals.

In this episode, Institute co-founder Jeff Allen and Institute member Derek Slater discuss the Creative Commons statement in favor of generative AI. Derek is a founding partner at Proteus Strategies and, among his various hats, was formerly Google's Global Director of Information Policy. As context: on February 6, 2023, Creative Commons came out with a statement in favor of generative AI, claiming, "Just as people learn from past works, generative AI is trained on previous works, analyzing past materials in order to extract underlying ideas and other information in order to build new works."

Jeff and Derek reflect on this statement, discussing how past platforms have failed and succeeded at working with creators and musing on what the future of work could look like.

As a reminder, the views stated in this episode are not affiliated with any organization and only represent the views of the individuals. We hope you enjoy the show.

Credits:
If you enjoyed today's conversation, please share this episode with your friends so we can continue making episodes like this.
Today's episode was produced by Talha Baig
Music by Zhao Shen
Special thanks to Sahar, Cass, Rachel, and Sean for their continued support

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 8: Hiring and growing trust & safety teams at small companies

Trust in Tech: an Integrity Institute Member Podcast

Play Episode Listen Later Feb 15, 2023 0:35


Welcome to the Trust in Tech podcast, a project by the Integrity Institute, a community-driven think tank that advances the theory and practice of protecting the social internet, powered by our community of integrity professionals.

In this episode, two Trust & Safety leaders discuss what it's really like to build teams at small companies. We discuss the pros and cons of working at a small company, what hiring managers look for, how small teams are structured, and career growth opportunities.

Alice Hunsberger, VP of CX at Grindr, interviews Colleen Mearn, who currently leads Trust & Safety at Clubhouse. Previously, Colleen was the Global Vertical Lead for Harmful and Dangerous policies at YouTube. In both of these roles, she has loved figuring out how to scale global policies and build high-performing teams.

Timestamps:
0:30 - Intro / Colleen's background
2:30 - Tech policy jobs
5:26 - Downsides of Big Tech
6:30 - Collaborating cross-functionally, working with product teams
9:45 - Building teams at small companies
12:30 - Types of people who succeed at small companies
16:00 - Career growth
17:00 - Growing a team, which roles to prioritize
20:45 - The hiring process at small companies
23:15 - What hiring managers at small companies look for
24:45 - Cover letter controversy
27:20 - Pivoting to Trust and Safety mid-career vs. starting as a content moderator
34:30 - Outro

Trust in Tech is hosted by Alice Hunsberger and produced by Talha Baig.
Edited by Alice Hunsberger.
Music by Zhao Shen.
Special thanks to Sahar Massachi, Cass Marketos, Rachel Fagen, and Sean Wang.

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 7: XCheck — Policing the Elite of Facebook Users

Trust in Tech: an Integrity Institute Member Podcast

Play Episode Listen Later Jan 30, 2023 39:39


Welcome to the Trust in Tech podcast, a project by the Integrity Institute, a community-driven think tank that advances the theory and practice of protecting the social internet, powered by our community of integrity professionals.

In this episode, Integrity Institute member Lauren Wagner and Institute fellow Karan Lala discuss Meta's cross-check program and the Oversight Board's policy advisory opinion. They cover how Meta treats its most influential and important users, the history and technical details of the cross-check program, the public response to its leak, what the Oversight Board found with respect to Meta's scaled content moderation, and what the company could do to address its gaps going forward.

Lauren Wagner is a venture capitalist and a fellow at the Berggruen Institute researching trust and safety. She previously worked at Meta, where she developed product strategy to tackle misinformation at scale and built privacy-protected data-sharing products. Karan Lala is currently a J.D. candidate at the University of Chicago Law School working at the intersection of policy and technology. He was a software engineer on Facebook's Civic Integrity team, where he led efforts to detect and enforce against abusive assets and sensitive entities in the civic space.

Timestamps:
0:00: Intro
1:36: Overview of the XCheck program
7:53: Data-sharing with the Oversight Board
11:01: XCheck around the world
12:59: The Oversight Board's findings
19:25: Public response to the leak
22:40: Recommendations and fixes
34:02: What should the future of XCheck look like?

Credits:
Trust in Tech is hosted by Alice Hunsberger and produced by Talha Baig.
Edited by Alice Hunsberger.
Music by Zhao Shen.
Special thanks to Sahar Massachi, Cass Marketos, Rachel Fagen, and Sean Wang.

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 6: Reconciling Capitalism & Community

Trust in Tech: an Integrity Institute Member Podcast

Play Episode Listen Later Jan 25, 2023 55:01


Welcome to the Trust in Tech podcast, a project by the Integrity Institute, a community-driven think tank that advances the theory and practice of protecting the social internet, powered by our community of integrity professionals.

In this sixth episode, Integrity Institute member Alice Hunsberger and Community Advisor Cassandra Marketos discuss digital spaces and community building. They discuss how to live in a world where community is not the default; whether being anonymous in online spaces is a good thing; and how product design and perception can influence the legitimacy of a product's content and community.

Cassandra "Cass" Marketos has a varied background and a diverse range of skills. She started out as a product manager for the music label Insound, then became the first employee at Kickstarter, where she worked on everything related to editorial and community. After her time there, Cass was deputy director of digital outbound during the Obama administration. She now serves as Community Advisor on the Integrity Institute staff, making our community at the Integrity Institute feel like home. Cass has launched several nonprofits, including Dollar a Day, and now builds her local community with compost.

Timestamps:
0:00: Intro
0:50: What is community
2:30: Business and Community
11:00: Being Idealistic and Realistic
12:50: Is Discord the future?
19:50: Anonymity in Online Spaces
25:00: Universal ToS is Impossible
31:20: Social Media as Road Rage
34:10: Building Community in Real and Online Life
46:00: Urban Dictionary and Product Design Legitimizing Content
51:20: Having a Community Advocate on Your Team

Credits:
Trust in Tech is hosted by Alice Hunsberger and produced by Talha Baig.
Edited by Alice Hunsberger.
Music by Zhao Shen.
Special thanks to Sahar Massachi, Cass Marketos, Rachel Fagen, and Sean Wang.

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 5: Keeping the Metaverse Safe

Trust in Tech: an Integrity Institute Member Podcast

Play Episode Listen Later Jan 18, 2023 45:59


In this fifth episode, Integrity Institute members Talha Baig and Lizzy Donahue talk about integrity in the metaverse. The conversation ranges from defining what the metaverse is to discussing whether it should even exist! We also discuss other fun topics, such as integrity issues with augmented reality and dating in the metaverse.

Lizzy is an experienced integrity professional who worked at Meta for seven years, where she pioneered machine learning to proactively detect suicidal intent, worked on integrity at Oculus Rift Home, and kept us safe on Horizon Worlds. On top of that, Lizzy was a "Global Social Benefit" fellow at SCU, where she won the top prize at her senior design conference for building a tool to help social enterprises train employees and customers. She is now working as a Trust and Safety engineer at Clubhouse.

Talha steps in for Alice Hunsberger as host. Talha worked at Meta for the past three years as an ML engineer on Marketplace Integrity and is currently acting as producer for this podcast.

Disclaimer: The views in this episode only represent the views of the people involved in the recording of the episode. They do not represent Meta's or any other entity's views.

Timestamps:
0:00 Intro
1:45 What is the metaverse
3:50 Integrity in the metaverse
6:15 Privacy in the metaverse
9:50 Should children be allowed in the metaverse
14:30 Overwatch
18:50 Body language in the metaverse
24:50 Self-governance in the metaverse
27:45 Decentralized recording
29:45 Is the metaverse good for society?
38:10 Dating in the metaverse
40:50 Integrity for Augmented Reality
44:55 Credits

Trust in Tech is hosted by Alice Hunsberger and produced by Talha Baig.
Edited by Alice Hunsberger.
Music by Zhao Shen.
Special thanks to Sahar Massachi and Cassandra Marketos for their continued support, and to all the members of the Integrity Institute.

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 4: Preventing and Reacting to Burnout

Trust in Tech: an Integrity Institute Member Podcast

Play Episode Listen Later Dec 9, 2022 71:00


Trust in Tech is hosted by Alice Hunsberger and produced by Talha Baig.
Edited by Alice Hunsberger.
Music by Zhao Shen.
Special thanks to Sahar Massachi and Cassandra Marketos for their continued support, and to all the members of the Integrity Institute.

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 3: Founding Episode with Sahar Massachi & Jeff Allen

Trust in Tech: an Integrity Institute Member Podcast

Play Episode Listen Later Nov 29, 2022 42:48


In this third episode, Integrity Institute member Alice Hunsberger talks with Institute cofounders Sahar Massachi and Jeff Allen about the issues around integrity in tech: why the Integrity Institute was founded, how to define integrity work, and why integrity teams are the true long-term growth teams of tech companies. We take a bit of a deep dive into hate speech, covering several reasons why it's important to remove it and the dreaded death spiral that can happen when platforms don't invest in integrity properly. We also discuss why building social media companies is an ethical endeavor, and the work the Integrity Institute has done to establish a code of ethics and a Hippocratic oath for integrity workers. Finally, we touch on Jeff and Sahar's thoughts on safety regulation for the industry, the importance of initial members in defining a group's norms, the benefits of growing slowly, and why integrity workers are heroes.

Trust in Tech is hosted by Alice Hunsberger and produced by Talha Baig.
Edited by Alice Hunsberger.
Music by Zhao Shen.
Special thanks to Sahar Massachi and Cassandra Marketos for their continued support, and to all the members of the Integrity Institute.

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 2: Global Threat Analysis with Zara Perumal

Trust in Tech: an Integrity Institute Member Podcast

Play Episode Listen Later Nov 14, 2022 28:24


Integrity Institute members Alice Hunsberger and Zara Perumal talk about mis- and disinformation: how to recognize it and how to contextualize it, both individually and at scale.

Trust in Tech is hosted by Alice Hunsberger and produced by Talha Baig.
Edited by Alice Hunsberger.
Music by Zhao Shen.
Special thanks to Sahar Massachi and Cassandra Marketos for their continued support, and to all the members of the Integrity Institute.

Trust in Tech: an Integrity Institute Member Podcast
Trust in Tech, Episode 1: Elections with Katie Harbath

Trust in Tech: an Integrity Institute Member Podcast

Play Episode Listen Later Nov 8, 2022 29:48


Welcome to the Trust in Tech podcast, a project by the Integrity Institute, a community-driven think tank that advances the theory and practice of protecting the social internet, powered by our community of integrity professionals.

In this first episode, Integrity Institute members Alice Hunsberger and Katie Harbath talk on the eve of the US midterms about the issues surrounding civic integrity and elections online; what life after working in tech looks like; and how scared Katie thinks we should be about the 2024 elections.

Links:
Integrity Institute's Elections Integrity Program
Bipartisan Policy Center: New Survey Data on Who Americans Look To For Election Information (Nov 2, 2022)
Katie Harbath's newsletter: https://anchorchange.substack.com/

The Sunday Show
Trust and Safety Comes of Age?

The Sunday Show

Play Episode Listen Later Sep 25, 2022 50:01


As content moderation and other trust and safety issues have been, to put it mildly, at the fore of tech concerns over the last few years, it's interesting to take a step back and look at the various conferences, professional organizations, and research communities that have emerged to address this broad and challenging set of subjects. To get a sense of where trust and safety is as a field at this moment in time, Tech Policy Press spoke to three individuals involved in it, each coming from a different perspective:

Shelby Grossman, a research scholar at the Stanford Internet Observatory and a leader in the community of academic researchers studying trust and safety issues as co-editor of the recently launched Journal of Online Trust and Safety (https://tsjournal.org/index.php/jots);

David Sullivan, the leader of an industry-funded consortium focused on developing best practices for the field, the Digital Trust and Safety Partnership (https://dtspartnership.org/); and

Jeff Allen, co-founder and chief research officer of an independent membership organization of trust and safety professionals, the Integrity Institute (https://integrityinstitute.org/).

IRL - Online Life Is Real Life
The Truth is Out There

IRL - Online Life Is Real Life

Play Episode Listen Later Aug 29, 2022 21:54


Murky political groups are exploiting social media systems to spread disinformation. With important elections taking place around the world this year, who is pushing back? We meet grassroots groups in Africa and beyond who are using AI to tackle disinformation in languages and countries underserved by big tech companies.

Justin Arenstein is the founder of Code for Africa, an organization that works with newsrooms across 21 countries to fact-check, track, and combat the global disinformation industry.

Tarunima Prabhakar builds tools and datasets to respond to online misinformation in India as co-founder of the open-source technology community Tattle.

Sahar Massachi was a data engineer at Facebook and now leads the Integrity Institute, a new network for people who work on integrity teams at social media companies.

Raashi Saxena in India was the global project coordinator of Hatebase, a crowdsourced repository of online hate speech in 98 languages, run by the Sentinel Project.

IRL is an original podcast from Mozilla, the non-profit behind Firefox. In Season 6, host Bridget Todd shares stories of people who make AI more trustworthy in real life. This season doubles as Mozilla's 2022 Internet Health Report. Go to the report for show notes, transcripts, and more.

Talking Headways: A Streetsblog Podcast
Episode 369: Treating Social Media Like a City

Talking Headways: A Streetsblog Podcast

Play Episode Listen Later Feb 10, 2022 61:21


This week we're joined by Sahar Massachi of the Integrity Institute. Sahar discusses his piece in MIT Technology Review connecting cities and social media platforms, and how we should be monitoring and managing them properly. We chat about the similarities between managing social media's bad actors and urban problems like black-box highway modeling, speed management, and city building.

Follow us on Twitter @theoverheadwire
Support the show on Patreon! http://Patreon.com/theoverheadwire

Arbiters of Truth
What Is Integrity in Social Media?

Arbiters of Truth

Play Episode Listen Later Feb 4, 2022 55:57


There's been a lot of news recently about Facebook, and a lot of that news has focused on the frustration of employees assigned to the platform's civic integrity team or other corners of the company focused on ensuring user trust and safety. If you read reporting on the documents leaked by Facebook whistleblower Frances Haugen, you'll see again and again how these Facebook employees raised concerns about the platform and proposed solutions, only to be shot down by executives.

That's why it's an interesting time to talk to two former Facebook employees who both worked on the platform's civic integrity team. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Sahar Massachi and Jeff Allen, who recently unveiled a new project, the Integrity Institute, aimed at building better social media. The goal is to bring the expertise of current and former tech employees to inform the ongoing discussion around whether and how to regulate big social media platforms. They dug into the details of what they feel the Institute can add to the conversation, the nitty-gritty of some of the proposals around transparency and algorithms that the Institute has already set out, and what the mood is among people who work in platform integrity right now.

See acast.com/privacy for privacy and opt-out information.

Radio Free New York
Can you fix social media by targeting behavior instead of speech?

Radio Free New York

Play Episode Listen Later Jan 6, 2022 48:35


Sahar Massachi, Co-Founder and Executive Director of the Integrity Institute, talks about the emerging field of integrity professionals in social media. Can clear rules and best practices in social media save our platforms from abusive users? Can we target destructive behaviors instead of policing speech? These discussions and more.

https://integrityinstitute.org/
https://static1.squarespace.com/static/614cbb3258c5c87026497577/t/617834d31bcf2c5ac4c07494/1635267795944/Metrics+and+Transparency+-+Summary+%28EXTERNAL%29.pdf
https://www.technologyreview.com/2021/12/20/1042709/how-to-save-social-media-treat-it-like-a-city/

--- Send in a voice message: https://anchor.fm/afreesolution/message Support this podcast: https://anchor.fm/afreesolution/support

Radio Free New York
BONUS: Sahar from Integrity Institute - Talking to Government and HOW we change social

Radio Free New York

Play Episode Listen Later Jan 6, 2022 22:33


Sahar and Kevin continue the conversation after the show to discuss social media hearings and potential policy around changing social media.

--- Send in a voice message: https://anchor.fm/afreesolution/message Support this podcast: https://anchor.fm/afreesolution/support

The Sunday Show
Reducing Harm on Social Media

The Sunday Show

Play Episode Listen Later Jan 2, 2022 66:34


This first episode of 2022 features a discussion hosted by the Center for Social Media and Politics at NYU (CSMaP) that gathered academic, policy, and tech experts to discuss ideas about how to make social media a safer and more civil place. The panel was expertly moderated by Jane Lytvynenko, a Senior Research Fellow at the Technology and Social Change Project at Harvard Kennedy School's Shorenstein Center on Media, Politics and Public Policy, and included:

Niousha Roshani, Deputy Director of the Content Policy & Society Lab at Stanford University's Program on Democracy and the Internet;
Rebekah Tromble, Director of the Institute for Data, Democracy & Politics at George Washington University;
Joshua A. Tucker, Co-Director of CSMaP and a Professor of Politics at NYU; and
Sahar Massachi, Co-Founder and Executive Director of the Integrity Institute.

Thanks to Erik Opsal at CSMaP for his assistance providing the audio, and to Zeve Sanderson, Founding Executive Director at CSMaP, whose voice you will hear in the closing.

The Sunday Show
Platform Integrity, Platform Democracy

The Sunday Show

Play Episode Listen Later Nov 21, 2021 82:10


Today, we've got two separate but related conversations about social media and how it intersects with democracy and society. In the first segment, we're going to hear from Jeff Allen and Sahar Massachi, two former Facebook employees who are the founders of the Integrity Institute, a new nonprofit organization. They believe one solution to the problems on social media is a community of integrity professionals, with experience across a variety of platforms, who can come together to address problems and share best practices.

Then, we're going to look under the hood of some fresh ideas about how to democratize policymaking on social media platforms from Aviv Ovadya, a fellow at the Harvard Kennedy School's Belfer Center for Science and International Affairs and author of a proposal titled "Towards Platform Democracy: Policymaking Beyond Corporate CEOs and Partisan Pressure." To help evaluate Aviv's ideas, I'm also joined by Joe Bak-Coleman, a postdoctoral fellow at the University of Washington Center for an Informed Public, and Renée DiResta, technical research manager at the Stanford Internet Observatory.

The Lawfare Podcast
What Is Integrity in Social Media?

The Lawfare Podcast

Play Episode Listen Later Nov 4, 2021 55:35


There's been a lot of news recently about Facebook, and a lot of that news has focused on the frustration of employees assigned to the platform's civic integrity team or other corners of the company focused on ensuring user trust and safety. If you read reporting on the documents leaked by Facebook whistleblower Frances Haugen, you'll see again and again how these Facebook employees raised concerns about the platform and proposed solutions, only to be shot down by executives.

That's why it's an interesting time to talk to two former Facebook employees who both worked on the platform's civic integrity team. This week on Arbiters of Truth, our series on the online information ecosystem, Evelyn Douek and Quinta Jurecic spoke with Sahar Massachi and Jeff Allen, who recently unveiled a new project, the Integrity Institute, aimed at building better social media. The goal is to bring the expertise of current and former tech employees to inform the ongoing discussion around whether and how to regulate big social media platforms. They dug into the details of what they feel the Institute can add to the conversation, the nitty-gritty of some of the proposals around transparency and algorithms that the Institute has already set out, and what the mood is among people who work in platform integrity right now.

Support this show http://supporter.acast.com/lawfare. See acast.com/privacy for privacy and opt-out information.