Podcast appearances and mentions of Sarah T. Roberts

  • 33 podcasts
  • 39 episodes
  • 45m average duration
  • 1 new episode monthly
  • Latest: Mar 12, 2025



Latest podcast episodes about Sarah T. Roberts

Books & Writers · The Creative Process
The Hidden Humans Behind Artificial Intelligence & the Sociopathology of Elon Musk

Mar 12, 2025 · 59:27

In this episode of the Speaking Out of Place podcast, Professor David Palumbo-Liu talks with Sarah T. Roberts about the hidden humans behind artificial intelligence, which relies on executives and business managers to direct AI to promote their brands, and on low-level, outsourced, and poorly paid content moderators who slog through masses of images, words, and data before they are fed into the machine. They discuss the cultural, sociological, financial, and political aspects of AI, and end by taking on Elon Musk and the DOGE project as an emblem of how Silicon Valley executives have embraced a brand of tech rapture that disdains and destroys democracy and attacks the idea that people can take care of each other, independent of sociopathic libertarianism.

Sarah T. Roberts, Ph.D., is a full professor at UCLA (Gender Studies, Information Studies, Labor Studies), specializing in Internet and social media policy, infrastructure, politics and culture, and the intersection of media, technology, and society. She is the faculty director and co-founder of the UCLA Center for Critical Internet Inquiry (C2i2), co-director of the Minderoo Initiative on Technology & Power, and a research associate of the Oxford Internet Institute. Informed by feminist Science and Technology Studies perspectives, Roberts is keenly interested in the way power, geopolitics, and economics play out on and via the internet, reproducing, reifying, and exacerbating global inequities and social injustice.

www.palumbo-liu.com
https://speakingoutofplace.com
Bluesky: @palumboliu.bsky.social
Instagram: @speaking_out_of_place
Photo of Elon Musk: Debbie Rowe, Creative Commons Attribution-Share Alike 3.0 Unported

Social Justice & Activism · The Creative Process
The Hidden Humans Behind Artificial Intelligence & the Sociopathology of Elon Musk

Mar 12, 2025 · 59:27 (same episode and description as above)

Education · The Creative Process
The Hidden Humans Behind Artificial Intelligence & the Sociopathology of Elon Musk

Mar 12, 2025 · 59:27 (same episode and description as above)

Tech, Innovation & Society - The Creative Process
The Hidden Humans Behind Artificial Intelligence & the Sociopathology of Elon Musk

Mar 12, 2025 · 59:27 (same episode and description as above)

Speaking Out of Place
The Hidden Humans Behind Artificial Intelligence, and the Sociopathology of Elon Musk: A Conversation with Sarah T. Roberts

Mar 9, 2025 · 59:22

Today on Speaking Out of Place I talk with Sarah T. Roberts about the hidden humans behind artificial intelligence, which relies on executives and business managers to direct AI to promote their brands, and on low-level, outsourced, and poorly paid content moderators who slog through masses of images, words, and data before they are fed into the machine. We talk about the cultural, sociological, financial, and political aspects of AI. We end by taking on Elon Musk and the DOGE project as an emblem of how Silicon Valley executives have embraced a brand of tech rapture that disdains and destroys democracy and attacks the idea that people can take care of each other, independent of sociopathic libertarianism.

Sarah T. Roberts, Ph.D., is a full professor at UCLA (Gender Studies, Information Studies, Labor Studies), specializing in Internet and social media policy, infrastructure, politics and culture, and the intersection of media, technology, and society. She is the faculty director and co-founder of the UCLA Center for Critical Internet Inquiry (C2i2), co-director of the Minderoo Initiative on Technology & Power, and a research associate of the Oxford Internet Institute. Informed by feminist Science and Technology Studies perspectives, Roberts is keenly interested in the way power, geopolitics, and economics play out on and via the internet, reproducing, reifying, and exacerbating global inequities and social injustice.

University of Minnesota Press
Cyberlibertarianism and the fraught politics of the internet

Nov 12, 2024 · 56:47

In a timely challenge to the potent political role of digital technology, Cyberlibertarianism: The Right-Wing Politics of Digital Technology argues that right-wing ideology was built into both the technical and social construction of the digital world from the start. Leveraging more than a decade of research, David Golumbia, who passed away in 2023, traced how digital evangelism has driven a worldwide shift toward the political right, concealing inequality, xenophobia, dishonesty, and massive corporate concentrations of wealth and power beneath the idealistic presumption of digital technology as an inherent social good. George Justice wrote the foreword to Cyberlibertarianism and is joined in conversation with Frank Pasquale.

George Justice is professor of English literature and provost at the University of Tulsa. Frank Pasquale is professor of law at Cornell Tech and Cornell Law School. David Golumbia (1963–2023) was associate professor of English at Virginia Commonwealth University and author of Cyberlibertarianism: The Right-Wing Politics of Digital Technology; The Politics of Bitcoin: Software as Right-Wing Extremism; and The Cultural Logic of Computation.

EPISODE REFERENCES:
  • Tim Wu
  • Lawrence Lessig
  • WikiLeaks
  • David E. Pozen, "Transparency's Ideological Drift": https://openyls.law.yale.edu/handle/20.500.13051/10354
  • Stefanos Geroulanos, Transparency in Postwar France
  • #CreateDontScrape
  • David Golumbia, "ChatGPT Should Not Exist" (article)
  • M. T. Anderson, Feed
  • Jonathan Crary, Scorched Earth

"If you want to understand the origins of our information hellscape with its vast new inequalities, corrupt information, algorithmic control, population-scale behavioral manipulation, and wholesale destruction of privacy, then begin here." —Shoshana Zuboff, author of The Age of Surveillance Capitalism

"Cyberlibertarianism is essential for understanding the contemporary moment and the recent past that got us here. It stands as a monumental magnum opus from a meticulous thinker and sharp social critic who is sorely missed." —Sarah T. Roberts, director, Center for Critical Internet Inquiry, UCLA

Cyberlibertarianism: The Right-Wing Politics of Digital Technology is available from University of Minnesota Press.

Airtalk
Updates from the Paris 2024 Olympic Games

Aug 8, 2024 · 17:25

The 2024 Paris Olympic Games are set to wrap up in the next several days. Whether watching on cable, streaming, or TikTok, viewers have been treated to a stunning display of athleticism from around the globe. From a controversial opening ceremony to the triumphant return of Snoop Dogg, the Paris games have been nothing short of pure entertainment. Today on AirTalk, we break down everything you need to know about the 2024 Paris Olympic Games as they come to a close. Joining us to discuss the latest from Paris are David Wharton, staff writer covering the Olympics for the Los Angeles Times, and Sarah T. Roberts, professor and director of UCLA's Center for Critical Internet Inquiry. We also want to hear from you: what have been your favorite moments of the Olympics?

What Works | Small Business Podcast
EP 449: The Most Undervalued Skill of the 21st-Century Economy

Oct 26, 2023 · 35:18

This is the penultimate episode of Strange New Work, a special series from What Works that explores the future of work through the lens of speculative fiction.

What's the most undervalued skill of the 21st-century economy? Moderation. I very well might be forgetting something, but with more of our lives and work showing up online every day, the way our feeds, data, and connections are moderated is critical to our daily lives. Moderation can be many things: it's how platforms are designed, how content is incentivized or disincentivized, and how communication between people is mediated. Some moderation is done structurally, some is done with code, but lots of moderation is done by real people all over the world. In this episode, I take a close look at the skill of moderation, its role in our evolving tech futures, and the politics that complicate this essential work.

Footnotes:
  • "Welcome to hell, Elon" by Nilay Patel on The Verge
  • "Why Elon's Twitter is in the Sh*tter with Nilay Patel" on Offline with Jon Favreau
  • Fall; Or, Dodge in Hell by Neal Stephenson
  • Work Without the Worker by Phil Jones
  • "Content Moderation is Terrible by Design," featuring Sarah T. Roberts, in Harvard Business Review
  • "Moderating Social Media" on The Agenda on YouTube
  • "How Microwork is the Solution to War" by Ben Irwin on Preemptive Love
  • "Reddit faces content quality concerns after its Great Mod Purge" by Scharon Harding
  • Rosie Sherry on tips for content moderation
  • "Neal Stephenson Explains His Vision for the Digital Afterlife" on PC Mag

Love What Works? Become a premium subscriber for just $7 per month. Your subscription helps make my work sustainable and gets you access to twice-monthly This is Not Advice episodes, quarterly workshops, and more. Click here to learn more and preview the premium benefits!

Haileywood
7. MySecurity

May 24, 2023 · 39:39

"If you open a hole on the internet," UCLA professor Sarah T. Roberts tells us, "it gets filled with sh*t." The tragic death of Megan Meier was a turning point for MySpace. As the first social media company to operate on a massive scale, MySpace and its users were forced to grapple with the consequences of that scale. In this episode, Joanne is joined by Thomas Kadri of the University of Georgia School of Law to discuss how our legal system was ill-equipped to deal with the social media era. UCLA professor and author Sarah T. Roberts chronicles the early days of content moderation. And Bridget Todd and Scott Zakarin are back to talk about bullying in the MySpace era.

Main Accounts: The Story of MySpace
7. MySecurity

May 24, 2023 · 39:39, transcription available (same episode and description as above)

Community Signal
Elon Musk's Quest to Make Twitter Worse

Nov 21, 2022 · 55:12

Elon Musk's presence has loomed over Twitter since he announced plans to purchase the platform. And for the few weeks that he's been in charge, many concerns have proven to be justified. Musk laid off 3,700 employees, and then 4,400 contractors. He is firing those who are critical of him. The verification process, perhaps one of Twitter's most trusted features, has been unraveled. He's offered severance to those who don't want to be part of "extremely hardcore" Twitter. Following the results of a Twitter poll, he reinstated the account of Donald Trump, who was suspended from the platform for his role in inciting the January 6th attacks.

So, what happens now? What of the many social movements that manifested on Twitter? While some movements and followings may see new manifestations on other platforms, not everything will be completely recreated. For example, as writer Jason Parham explains, "whatever the destination, Black Twitter will be increasingly difficult to recreate."

In this episode of Community Signal, Patrick speaks to three experts about the current state and future of Twitter: Sarah T. Roberts, associate professor in the Department of Information Studies at UCLA; trust and safety consultant Ralph Spencer; and Omar Wasow, assistant professor in UC Berkeley's Department of Political Science and co-founder of BlackPlanet. They dissect the realities facing the platform today, including content moderation, loss of institutional knowledge, and uncertainty about Twitter's infrastructure, but also emphasize the importance of Twitter as a social utility for news and more.

This episode also touches on:
  • The reality of moderating a platform like Twitter
  • What platforms actually mean when they say they're for "free speech"
  • How Musk tanked the value of verification on Twitter

Big Quotes

On the future of content moderation at Twitter (8:28): "There's no way possible with the cuts [Musk has] made that he's going to be able to do any type of content moderation. … [He] isn't going to have anybody who remotely begins to know how to do that [legal compliance and related work]." –Ralph Spencer

Sarah T. Roberts' moderation challenge for Elon Musk (11:19): "I want Elon Musk to spend one day as a frontline production content moderator, and then get back to this [Community Signal] crew about how that went. Let us know what you saw. Share with us how easy it was to stomach that. Were you able to keep up with the expected pace at Twitter? Could you … make good decisions over 90% of the time, over 1,000, 2,000 times a day? Could you do that all the while seeing animals being harmed, kids being beat on, [and] child sexual exploitation material?" –@ubiquity75

Bumper sticker wisdom doesn't make good policy (15:46): "Everything [Musk has said about free speech] has had the quality of good bumper stickers but is totally divorced from reality, and that doesn't bode well, obviously." –@owasow

The responsibility in leading a social media platform (19:41): "One thing that we are seeing in real-time [at Twitter] is what a danger there is in having one individual – especially a very privileged individual who does not live in the same social milieu as almost anyone else in the world – one very privileged individual's ability to be the arbiter of … these profoundly contested ideological notions of something like free speech, which again is continually misapplied in this realm." –@ubiquity75

Musk's peddling of conspiracy theories (20:29): "[Musk is] running around tweeting that story about Nancy Pelosi's husband, the false article about what happened between him and his attacker. What kind of example is that to set? … What it is to me is like this kid who has way too much money, and he found a new toy he wants to play with." –Ralph Spencer

Leading with humility (21:23): "[If you're running a site like Twitter,] you have to have a 'small d' democratic personality, which is to say you really have to be comfortable with a thousand voices flourishing, a lot of them being critical of you, and that's not something that you take personally." –@owasow

There are always limits on speech (23:50): "When you declare that your product, your site, your platform, your service is a free speech zone, there is always going to be a limit on that speech. … [CSAM] is the most extreme example that we can come up with, but that is content moderation. To remove that material, to disallow it, to enforce the law means that there is a limit on speech, and there ought to be in that case. If there's a limit on speech, it is by definition not a free speech site. Then we have to ask, well, what are the limits, and who do they serve?" –@ubiquity75

"Free speech" platforms are not a thing (25:25): "When I hear people invoke free speech on a for-profit social media site, not only does that not exist today, it never has existed, and it never will exist. Let's deal with what reality is actually giving us and talk about that instead of these fantasies that actually are pretty much not good for anyone involved." –@ubiquity75

The social weight and trust that verification brought to interactions on Twitter (32:52): "[Twitter] has outsized social impact, whether it's in the political arena, whether it's in social movements, whether it's in celebrity usage, all of these things have been true. In terms of political movements, the good, bad, the ugly. We saw an insurrection against the United States launched by the President of the United States on Twitter, so it's not all rosy, but the point is that Twitter had this outsized power, and part of that could be attributed … to this verification process that let a lot of high profile folks, prominent individuals, media organizations, other kinds of people in the zeitgeist or in the public eye, engage with a certain sense of security." –@ubiquity75

How does Twitter sustain its infrastructure amidst the mass layoffs and resignations? (39:18): "We have good reason to fear that [Twitter's] infrastructure is going to get considerably worse over time. [Musk has] fired enough of the people. … In a lot of ways, [Twitter is] like a telephone company. It's got a lot of boring infrastructure that it has to maintain so that it's reliable. [Musk has] taken a bunch of these pillars or blocks in the Jenga stack and knocked them out, and it's a lot more wobbly now." –@owasow

Musk's Twitter user experience is not the common one (48:23): "[Musk is] obsessed with bots and spam, but why is that such a compulsion for him? Well, he has 100-plus million followers, and when he looks at his replies, there's probably a lot of bots and spam there. That's not where I live because I'm a civilian. His perspective is distorted in a way partly by the investment around him but partly also by just being so way out of proportion to almost any other human on Earth." –@owasow

About Our Guests

Omar Wasow is an assistant professor in UC Berkeley's Department of Political Science. His research focuses on race, politics, and statistical methods. Previously, Omar co-founded BlackPlanet, an early leading social network, and was a regular technology analyst on radio and television. He received a PhD in African American Studies, an MA in government, and an MA in statistics from Harvard University.

Ralph Spencer has been working to make online spaces safer for more than 20 years, starting with his time as a club editorial specialist (message board editor) at Prodigy, and then graduating to America Online. During his time at AOL, he was in charge of all issues involving child sexual abuse material, or CSAM. The evidence that Ralph and the team he worked with in AOL's legal department compiled contributed to numerous arrests and convictions of individuals for the possession and distribution of CSAM. He currently works as a freelance trust and safety consultant.

Sarah T. Roberts is an associate professor in the Department of Information Studies at UCLA. She holds a PhD from the iSchool at the University of Illinois at Urbana-Champaign. Her book on commercial content moderation, Behind the Screen, was released in 2019 by Yale University Press. She served as a consultant on, and is featured in, the award-winning documentary The Cleaners. Dr. Roberts sits on the board of the IEEE Annals of the History of Computing, was a 2018 Carnegie Fellow, and was a 2018 recipient of the EFF Barlow Pioneer Award for her groundbreaking research on content moderation of social media.

Related Links
  • Elon Musk takes control of Twitter and immediately ousts top executives (via NPR)
  • Omar Wasow's website
  • Omar Wasow on Twitter
  • BlackPlanet.com, founded by Wasow
  • Ralph Spencer on LinkedIn
  • Sarah T. Roberts' website
  • Sarah T. Roberts on Twitter
  • Behind the Screen: Content Moderation in the Shadows of Social Media, by Sarah T. Roberts
  • Note from Patrick: After 5 years, this is Carol's final episode as editorial lead on Community Signal. We'll miss you, Carol!
  • The Twitter Rules
  • Code and Other Laws of Cyberspace by Lawrence Lessig
  • Elon Musk says Twitter will have a "content moderation council" (via The Verge)
  • Democratic U.S. senators accuse Musk of undermining Twitter, urge FTC probe (via Reuters)
  • We got Twitter "verified" in minutes posing as a comedian and a senator (via The Washington Post)
  • How Much Did Twitter's Verification Chaos Cost Insulin Maker Eli Lilly and Twitter Itself? (via Gizmodo)
  • Patrick's (somewhat sarcastic) Twitter thread about the policies he hoped the platform would put in place to address Musk's conflicts of interest
  • Saturday Night Live's content moderation council sketch

Transcript: view on our website.

Your Thoughts: If you have any thoughts on this episode that you'd like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.

The American Vandal, from The Center for Mark Twain Studies
The Collapse of Twitter, Zombie Cyberlibertarianism, & Commercial Content Moderation with Sarah T. Roberts

Nov 18, 2022 · 56:44

With the end of Twitter seemingly imminent, content moderation and social media expert Sarah T. Roberts discusses Elon Musk's ideology, the labor of social media, and the migration to Mastodon. For more about this episode, please visit MarkTwainStudies.com/TheEndOfTwitter

The Tech Humanist Show
How Tech Harms – and Can Help Heal – the Climate

Apr 21, 2022 · 45:09


On this week's episode, we're talking about one of the most urgent issues facing humanity today, and how we can reframe our mindset around it to better encourage and allow ourselves to take action. That issue, of course, is climate change. Technology has created a lot of the problems we face, but is also coming up with some of the most innovative and inventive solutions. Solving this is going to take creativity, collaboration, and a willingness to change, but that's what we're all about here at the Tech Humanist Show! What is our individual responsibility to tackling these problems? What are the most exciting solutions on the horizon? Who should we be holding to account, and how? Those answers and more on this week's episode.

Guests this week include Sarah T. Roberts, AR Siders, Tan Copsey, Anne Therese Gennari, Christopher Mims, Art Chang, Dorothea Baur, Abhishek Gupta, and Caleb Gardner.

The Tech Humanist Show is a multi-media-format program exploring how data and technology shape the human experience, hosted by Kate O'Neill. To watch full interviews with past and future guests, or for updates on what Kate O'Neill is doing next, subscribe to The Tech Humanist Show channel on YouTube.

Full Transcript:

Hello, humans! Today we're talking about a problem that technology is both a major cause of and perhaps one of our best potential solutions for: climate change. By almost any reckoning, the climate emergency is the most urgent and existential challenge facing humanity for the foreseeable future. All of the other issues we face pale in comparison to the need to arrest and reverse carbon emissions, reduce global average temperatures, and begin the work of rebuilding sustainable models for all of us to be able to live and work on this planet. By late 2020, melting ice in the Arctic began to release previously-trapped methane gas deposits.
The warming effects of methane are 80 times stronger than carbon over 20 years, which has climate scientists deeply worried. Meanwhile, the Amazon rainforest has been devastated by burning. The plastic-filled oceans are warming. Coral reefs are dying. Experts are constantly adjusting their predictions on warming trends. And climate issues contribute to other socio-political issues as well, usually causing a big loop: Climate disasters create uninhabitable environments, leading to increased migration and refugee populations, which can overwhelm nearby areas and stoke the conditions for nationalistic and jingoistic political power grabs. This puts authoritarians and fascists into power—who usually aren't too keen on spending money to fix problems like climate change that don't affect them personally—exacerbating all of the previous problems. UK Prime Minister Boris Johnson showcased exactly this type of position before a recent UN climate conference, claiming the fall of the Roman empire was due to uncontrolled immigration as a way of refocusing people's fear and attention away from climate change. Marine Le Pen of France went so far as to say that those without a homeland don't care about the environment. Similarly out-of-touch and out-of-context things have been said recently by right-wing leaders in Spain, Germany, Switzerland… the list goes on and on. Perhaps the most psychologically challenging aspect of all this is that even as we begin to tackle these issues one by one, we will continue to see worsening environmental effects for the next few decades. As David Wallace-Wells writes in The Uninhabitable Earth: “Some amount of further warming is already baked in, thanks to the protracted processes by which the planet adapts to greenhouse gas…But all of those paths projected from the present…to two degrees, to three, to four or even five—will be carved overwhelmingly by what we choose to do now.” The message is: It's up to us. 
We know what's coming, and are thus empowered to chart the course for the future. What we need are bold visions and determined action, and we need it now. At this point you may be thinking, “I could really use some of that Kate O'Neill optimism right about now…” Not only do I have hope, but many of the climate experts I have read and spoken with are hopeful as well. But the first step in Strategic Optimism is acknowledging the full and unvarnished reality, and the hard truth about the climate crisis is that things do look bleak right now. Which just means our optimistic strategy in response has to be that much more ambitious, collaborative, and comprehensive. As Christiana Figueres and Tom Rivett-Carnac wrote in The Future We Choose: Surviving the Climate Crisis, “[To feel] a lack of agency can easily transform into anger. Anger that sinks into despair is powerless to make change. Anger that evolves into conviction is unstoppable.”

One of the things slowing progress on the climate front is the people on the extreme ends of the belief spectrum—especially those in positions of power—who believe it's either too late to do anything, or that climate change isn't happening at all. Technology exacerbates this problem through the spread of false information. Thankfully by this point most people—around 90% of Americans and a higher percentage of scientists—are in agreement that it's happening, although we're still divided on the cause. A poll conducted in October 2021 by the Associated Press-NORC Center for Public Affairs Research and the Energy Policy Institute at the University of Chicago found that only 54% of Americans believe humans contribute to climate change. A separate study conducted that same month looked at 88,125 peer-reviewed climate studies published between 2012 and 2020, and determined that 99.9% of those studies found human activity to be directly responsible for our warming planet.
It's important, however, not to write off the people who aren't yet fully convinced. Technology, as much as it has given us near-infinite access to information, is also a tremendous propagator of mis- and disinformation, which is fed to people by algorithms as immutable fact, and is often indistinguishable from the truth. Sarah T. Roberts, an associate professor at the University of California, Los Angeles (UCLA) and co-founder of the UCLA Center for Critical Internet Inquiry, explains further. Sarah T. Roberts: “When I think about people who fall victim to conspiracy theories, what I see is a human impulse to make sense of a world that increasingly doesn't. And they're doing it in the absence of information that is way more complex and hard to parse out and might actually point criticism at places that are very uncomfortable. They sense a wrongness about the world but they don't have the right information, or access to it, or even the ability to parse it, because we've destroyed public schools. And then the auxiliary institutions that help people, such as libraries—and that leaves them chasing their own tail through conspiracy theories instead of unpacking things like the consequences of western imperialism, or understanding human migration as economic and environmental injustice issues. Y'know, you combine all that, and people, what do they do? They reach for the pablum of Social Media, which is instantaneous, always on, easy to digest, and worth about as much as, y'know, those things might be worth. I guess what I'm trying to do is draw some connections around phenomena that seem like they have come from nowhere. It would behoove us to connect those dots both in this moment, but also draw back on history, at least the last 40 years of sort of like neoliberal policies that have eroded the public sphere in favor of private industry. 
What it didn't do was erode the public's desire to know, but what has popped up in that vacuum are these really questionable information sources that really don't respond to any greater norms, other than partisanship, advertising dollars, etc. And that's on a good day!” The fact is, there are a number of industries and people who have a vested interest in maintaining the status quo. Not all of them engage in disinformation schemes, but some corporations—and people—who are interested in fighting climate change aren't willing to look at solutions that might change their business or way of life. Too much change is scary, so they look for solutions that keep things as they are. AR Siders: “Too much of our climate change adaptation is focused on trying to maintain the status quo. We're trying to say, ‘hey, the climate is changing, what can we do to make sure that everything stays the same in the face of climate change?' And I think that's the wrong way to think about this.” That's AR Siders, assistant professor in the Biden School of Public Policy and Administration and the Department of Geography at the University of Delaware, and a core faculty member of its Disaster Research Center. Siders' research focuses on climate change adaptation governance, decision-making, and evaluation. AR Siders: “I think we need to think about the idea that we're not trying to maintain the status quo, we're trying to choose how we want our societies to change. I often start talks by showing historic photos, and trying to point out, in 1900, those photos don't look like they do today. So, 100 years in the future, things are going to look different. And that's true even if you don't accept climate change. Even if we stop climate change tomorrow, we might have another pandemic. We'll have new technology. 
And so our goal shouldn't be to try to lock society into the way it works today, it should be to think about, what are the things we really care about preserving, and then what things do we actively want to choose to change? Climate adaptation can be a really exciting field if we think about it that way.” And it is! But as more people have opened their eyes to the real threat looming on the near horizon, disinformation entities and bad actors have changed their tactics, shifting responsibility to individuals, and away from the corporations causing the majority of the harm. So let's talk about our personal responsibility in healing the climate. Tan Copsey: “We always should be careful of this trap of individual action, because in the past the fossil fuel industry has emphasized individual action.” That's Tan Copsey, who is Senior Director, Projects and Partnerships at Climate Nexus, a strategic communications organization. His work focuses on communicating the impacts of climate change and the benefits of acting to reduce climate risks. You'll be hearing from him a lot this episode. We spoke recently about climate change solutions and responsibilities across countries and industries. He continued: Tan Copsey: “I don't know if it's true but apparently BP invented the carbon footprint as a way of kind of getting people to focus on themselves and feel a sense of guilt, and project out a sense of blame, but that's not really what it's about. Dealing with climate change should ultimately be a story about hope, and that's what I kind of try and tell myself and other people.” On that note, Shell had a minor PR awakening in November 2020 when they tweeted a poll asking: “What are you willing to change to help reduce carbon emissions?” The tweet prompted many high-profile figures like climate activist Greta Thunberg and US congresswoman Alexandria Ocasio-Cortez to call out the hypocrisy of a fossil fuel company asking the public for personal change. 
In truth, research has found that the richest 1% of the world's population were responsible for the emission of more than twice as much carbon dioxide as the poorer half of the world from 1990 to 2015, with people in the US causing the most emissions per capita in the world. Now, this doesn't mean we should abandon personal responsibility. We should all make what efforts we can to lower our carbon footprint where feasible—whether by reviewing consumption habits, eating less meat, driving less, or anything from a wide variety of options. There's interesting psychological research around how making sustainable choices keeps us grounded in the mindset of what needs to change. I spoke with Anne Therese Gennari, a speaker, educator, and environmental activist known as The Climate Optimist, about the psychology behind individual action, and how the simple act of being more climate conscious in our daily lives can make the world a better place in ways beyond reducing our carbon footprints. Anne Therese Gennari: “Do our individual actions matter… and I think it matters so much, for four reasons. The first one is that it mends anxiety. A lot of people are starting to experience climate anxiety, and the first step out of that is actually to put yourself back in power. Choosing optimism is not enough. Telling ourselves, ‘I want to be optimistic,' is gonna fall short very quickly, but if we keep showing up for that work and that change, we're actually fueling the optimism from within. And that's how we keep going. The second one is that it builds character. So, the things that you do every day start to build up your habits, and that builds your character. Recognizing that the things we do becomes the identity that we hold onto, and that actually plays a huge part on what I'll say next, which is, start shifting the culture. 
We are social creatures, and we always look to our surroundings to see what's acceptable and okay and not cool and all these things, so the more of us that do something, it starts to shift norms and create a new culture, and we have a lot of power when we start to shift the culture. And then lastly, I'll just say, we always plant seeds. So whatever you do, someone else might see and pick up on, you never know what's gonna ripple effect from your actions.” No one person can make every change needed, but we can all do something. Every small action has the potential to create positive effects you'll never know. One surprising piece of information is that some of the things we're doing that we know are bad for the environment—like online delivery—may have more of a positive environmental impact than we thought. While the sheer amount of product that we order—especially non-essential items—is definitely exacerbating climate change, there are some positive takeaways. Christopher Mims, tech columnist at the Wall Street Journal and author of Arriving Today, on how everything gets from the factory to our front door, explains how, especially once our transportation and delivery vehicles have been electrified, ordering online may be a significantly greener alternative to shopping in stores. Christopher Mims: “The good news—you would think all of this ordering stuff online is terrible for the environment—look, it's bad for the environment in as much as it makes us consume more. We're all over-consuming, on average. But it's good for the environment in that, people forget, hopping into a 2 or 3 thousand pound car and driving to the grocery store—or a store—to get 5 to 15 pounds of goods and driving it home is horribly inefficient compared to putting the same amount of goods onto a giant box truck that can make 150 stops (if you're talking about a UPS or an Amazon delivery van), or a few dozen if you're talking about groceries. 
The funny thing is that delivery has the potential to be way more sustainable, and involve way less waste than our current system of going to stores. Frankly, physical retail is kind of a nightmare environmentally.” That's only a small piece of the puzzle, and there are still social and economic issues involved in the direct-to-home delivery industry. More important, with regard to our personal responsibility, is staying engaged in the conversation. A both/and mindset is best: embrace our own individual responsibilities, one of which is holding companies and entities with more direct impact on the climate accountable for making infrastructural and operational change that can give individuals more freedom to make responsible choices. Tan Copsey again. Tan Copsey: “It is about political action and engagement for me. Not just voting, but it's about everything that happens in between. It's about community engagement, and the tangible things you feel when there are solar panels on a rooftop, or New York begins to move away from gas. I mean, that's a huge thing! In a more existential sense, the news has been bad. The world is warming, and our approach to dealing with it distributes the benefits to too few people. There are definitely things you can do, and so when I talk about political pressure, I'm not just talking about political pressure for ‘climate action,' I'm talking about political pressure for climate action that benefits as many people as possible.” So, if part of our responsibility is to hold our leaders to account… what changes do we need? What should we be encouraging our leaders to do? Since we're talking about political engagement, let's start with government. Tan spoke to me about government response to another global disaster—the COVID-19 pandemic—and some of the takeaways that might be applied to battling climate change as well. 
Tan Copsey: “What's really interesting to me about the pandemic is how much money governments made available, particularly the Fed in the US, and how they just pumped that money into the economy as it exists. Now, you can pump that money into the economy and change it, too, and you can change it quite dramatically. And that's what we're beginning to see in Europe as they attempt to get off Russian gas. You're seeing not just the installation of heat pumps at astonishing scale, but you're also seeing real acceleration of a push toward green energy, particularly in Germany. You're also seeing some ideas being revisited. In Germany it's changing people's minds about nuclear power, and they're turning nukes back on.” Revisiting debates we previously felt decided on is unsettling. Making the future a better place is going to require a great deal of examination and change, which can be scary. It's also something federal governments are designed not to be able to do too quickly. But that change doesn't have to work against the existing economy; it can build with it. For those looking at this from a monetary perspective, it's worth noting that the world's seven most industrialized countries will lose a combined total of nearly $5 trillion in GDP over the next several decades if global temperatures rise by 2.6 degrees Celsius. So it behooves everyone to work on these solutions. And what are those solutions? AR Siders spoke to me about the four types of solutions to climate issues. A lot of her work involves coastal cities, so her answer uses “flooding” as an example, but the strategies apply to other problems as well. AR Siders: “So the main categories are, Resistance, so this is things like building a flood wall, putting in dunes, anything that tries to stop the water from reaching your home. Then there's Accommodation, the classic example here is elevating homes, so the water comes, and the water goes, but it does less damage because you're sort of out of the way. 
Then there's Avoidance, which is ‘don't build there in the first place,' (America, we're not very good at that one). And then Retreat is, once you've built there, if you can't resist or accommodate, or if those have too many costs, financial or otherwise, then maybe it's time to relocate.” We'll need to apply all four strategies to different problems as they crop up, but it's important that we're proactive and remain open to which solution works best for a given issue. City governments have tremendous opportunities to emerge as leaders in this space. Studies project that by the end of the century, US cities could be up to 10 degrees Fahrenheit warmer in the afternoon and 14 degrees warmer at night, meaning cities need to start taking action now. Phoenix, Arizona—a city that experiences the “heat island effect” year-round—is actively making efforts to minimize these effects. In 2020, they began testing “cool pavement,” a chemical coating that reflects sunlight and minimizes the absorption of heat to curb the heat island effect. Additionally, measures to offer better transit options are on the table, with cities like Austin and New York emerging as leaders in the space. The Citi Bike app in New York City now shows transit information alongside rental and docking updates, an acknowledgment that for many trips biking isn't enough, but in combination with buses or trains, biking can simplify and speed a commute as part of a greener lifestyle. Austin's recognition of the synergies between bikeshare and public transit has been praised as a model for other cities, as city transit agencies move away from seeing themselves as managers of assets (like buses), and towards being managers of mobility. I spoke with Art Chang, who has been a longtime entrepreneur and innovator in New York City—and who was, at the time of our discussion, running for mayor—about the need for resilience in preparing cities for the future. 
Art Chang: “There was a future—a digital future—for New York, but also being open to this idea that seas were rising, that global temperatures were going up, that we're going to have more violent storms, that things like the 100-year flood line may not be drawn to incorporate the future of these rising seas and storms. So we planned, deliberately and consciously, for a hundred-fifty-year storm. We softened the edge of the water, because it creates such an exorbitant buffer for the rising seas and storms. We created trenches that are mostly hidden so that overflow water had a place to go. We surrounded the foundations of the building with what we call ‘bathtubs,' which are concrete enclosures that would prevent water from going into these places where so much of the infrastructure of these buildings were, and then we located as much of the mechanicals on top of the building, so they would be protected from any water. Those are some of the most major things. All technologies, they're all interconnected, they're all systems.” Making any of the changes suggested thus far requires collective action. And one of the ways in which we need to begin to collaborate better is simply to agree on the terms we're using and how we're measuring our progress. Some countries, like the United States, have an advantage when it comes to reporting on climate progress due to the amount of forest that naturally occurs within their borders. That means the US can underreport emissions by factoring in the forests as “carbon sinks,” while other countries that may have lower emissions, but also fewer naturally-occurring forests, look worse on paper. This isn't factually wrong, but it obscures the work that needs to be done in order to curb the damage. I asked Tan about these issues, and he elaborated on what he believes needs to be done. Tan Copsey: “Again, I'd say we resolve the ambiguity through government regulation. For example, the Securities and Exchange Commission is looking at ESG. 
So this big trend among investors and companies, the idea that you take account of environmental, social, and governance factors in your investments, in what your company does. Realistically, there hasn't been consistent measure of this. I could buy an exchange-traded fund, and it could be ‘ESG,' and I wouldn't really know what's in it. And it could be that what's in it isn't particularly good. And so regulators are really trying to look at that now and to try and standardize it, because that matters. Likewise, you have carbon markets which are sort of within the European Union, and then you have voluntary carbon markets, which are often very reliant on forest credits sourced from somewhere else, where you're not quite sure if the carbon reduction is permanent or not. And yeah, there is a need for better standards there.” To do this holistically we will need to get creative with economic incentives, whether that involves offsets, green energy credits, or new programs at local, state, or national levels. One of the more aggressive and comprehensive plans for rethinking energy policy came from the EU in summer 2021, just as Germany and Belgium reeled from killer floods that were likely exacerbated by the climate crisis. The EU announced its “Fit for 55” plans, “a set of inter-connected proposals, which all drive toward the same goal of ensuring a fair, competitive and green transition by 2030 and beyond.” It's an approach that is systemic, recognizing the interconnectedness of a wide variety of policy areas and economic sectors: energy, transportation, buildings, land use, and forestry. And we need more programs and regulations like this. But until we have the better regulations we need, there are still things business leaders can do to make their businesses better for the environment today, so let's move away from government and talk about businesses. 
A lot of businesses these days pay an enormous amount of lip service (and money) to showing that they care about the environment, but the actual work being done to lower their carbon footprint or invest in cleaner business practices is a lot less significant. Tan spoke to me about this as well. Tan Copsey: “They need to move from a model which was a little bit more about PR to something that's real. In the past when a business issued a sustainability report, it was beautiful! It was glossily designed… And then when it came to like, filings with the SEC, they said ‘climate change is a serious issue and we are taking it seriously,' because their lawyers read it very, very closely. And so, if dealing with climate risk is embedded in everything you do as a business (as it probably should be), because almost every business, well, every business probably, interacts with the energy system—every business is a climate change business. They should be thinking about it, they should be reporting on it, y'know, when it comes to CEOs, it should be part of the way we assess their performance.” Nowadays, lots of companies are talking about “offsetting” their carbon emissions, or attempting to counter-act their emissions by planting trees or recapturing some of the carbon. But is this the right way to think about things? Dorothea Baur: “Offsetting is a really good thing, but the first question to ask should not be, ‘can I offset it?' or ‘how can I offset it?', but, ‘is what I'm doing, is it even necessary?'” That's Dorothea Baur, a leading expert & advisor in Europe on ethics, responsibility, and sustainability across industries such as finance, technology, and beyond. Her PhD is in NGO-business partnerships, and she's been active in research and projects around sustainable investment, corporate social responsibility, and increasingly, emerging technology such as AI. 
Dorothea Baur: “So, I mean, let's say my favorite passion is to fly to Barcelona every other weekend just for fun, for partying. So, instead of offsetting it, maybe I should stop doing it. And the same for tech companies saying, you know, ‘we're going to be carbon negative!' but then make the most money from totally unsustainable industries. That's kind of a double-edged sword.” It is notable that one of the key ways businesses and governments attempt to offset their emissions is “planting trees,” which has more problems than you may think. Yes, trees are an incredibly important part of a carbon sink approach, and we definitely need to plant more of them—but there's a catch to how we say we're going to do it. The promise of tree-planting has been such an easy add-on for companies' marketing campaigns to make over the years that there's a backlog of trees to be planted and not enough tree seedlings to keep up with the promises. It's not uncommon for companies to make the commitment to their customers to plant trees first, only for them to struggle to find partners to plant the promised trees. Dorothea Baur lamented this fact in her interview. Dorothea Baur: “It's also controversial, what I always joke about—the amount of trees that have been promised to be planted? I'm waiting for the day when I look out of my window in the middle of the city and they start planting trees! Because so much—I mean, the whole planet must be covered with trees! The thing is, it takes decades until the tree you plant really turns into a carbon sink. So, all that planting trees—it sounds nice, but also I think there's some double-counting going on. 
It's easy to get the credit for planting a tree, but it's hard to verify the reduction you achieve because it takes such a long time.” It's going to take more than lip service about tree-planting; we have to actually expand our infrastructural capability to grow and plant them, commit land to that use, and compensate for trees lost in wildfires and other natural disasters. Beyond that, we have to make sure the trees we're planting will actually have the effect we want. The New York Times published an article in March, arguing that “Reforestation can fight climate change, uplift communities and restore biodiversity. When done badly, though, it can speed extinctions and make nature less resilient…companies and countries are increasingly investing in tree planting that carpets large areas with commercial, nonnative species in the name of fighting climate change. These trees sock away carbon but provide little support to the webs of life that once thrived in those areas.” And that can mean the trees take resources away from existing plant life, killing it and eliminating the native carbon sink—leaving net carbon emissions virtually unchanged. These are problems that require collaboration and communication between industries, governments, activists, and individuals. Beyond those initiatives, companies can also improve their climate impact by investing in improvements to transportation for employees and customers, perhaps offering public transit or electric vehicle incentives to employees, or investing in a partnership with their municipality to provide electric vehicle charging stations at offices and storefronts. Additionally, business responsibility may include strategic adjustments to the supply chain or to materials used in products, packaging, or delivery. 
Another issue when it comes to offsetting emissions is the leeway the tech industry gives itself in measuring its own global climate impact, even as the materials needed to build that technology are among the chief contributors to carbon emissions. Dorothea Baur again. Dorothea Baur: “The whole supply chain of the IT industry is also heavily based on minerals. There are actually, there are really interesting initiatives also by tech companies, or like commodity companies that specifically focus on the minerals or the metals that are in our computers. Like cobalt, there's a new transparency initiative, a fair cobalt initiative. So they are aware of this, but if you look at where is the main focus, it's more on the output than on the input. And even though the tech companies say, ‘oh, we're going to be carbon neutral or carbon negative,' as long as they sell their cloud services to the fossil industry, that's basically irrelevant.” Currently, AI tech is an “energy glutton”—training just one machine learning algorithm can produce five times more CO2 than the lifetime emissions of a car. But there is still hope for AI as a tool to help with climate change, namely using it to learn how to more efficiently run energy grids and predict energy usage, especially as energy grids become more complicated with combined use of solar, wind, and water power in addition to traditional fossil fuels. AI can also make the global supply chain more efficient, reducing emissions and speeding up the process of developing new, cleaner materials. One small-scale use case is “Trashbot,” which sorts waste materials into categories using sensors and cameras, eliminating the need for people to try to sort out their own recyclables. What's clear from every emerging report is that net zero emissions are no longer enough. We need governments and companies and every entity possible to commit to net negative emissions. 
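For a sense of where figures like these come from: training emissions are commonly approximated as hardware power draw, times training time, times a datacenter overhead factor, times the carbon intensity of the local grid. Here's a minimal sketch of that arithmetic; every input below is an illustrative assumption, not a measured value.

```python
# Rough estimate of CO2 emissions from a model training run.
# All figures here are illustrative assumptions, not measurements.

def training_emissions_kg(gpu_count: int,
                          gpu_watts: float,
                          hours: float,
                          pue: float = 1.5,            # assumed datacenter overhead
                          grid_kg_per_kwh: float = 0.4  # assumed grid carbon intensity
                          ) -> float:
    """Energy used (kWh) multiplied by grid carbon intensity (kg CO2/kWh)."""
    kwh = gpu_count * gpu_watts * hours * pue / 1000.0
    return kwh * grid_kg_per_kwh

# Example: 8 GPUs drawing 300 W each, training continuously for two weeks.
print(round(training_emissions_kg(8, 300.0, 24 * 14), 1))  # → 483.8
```

The point of the sketch is how quickly the terms multiply: a modest cluster running for weeks reaches hundreds of kilograms of CO2, and large-scale runs scale each factor up by orders of magnitude.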
Cities need ambitious plans for incentivizing buildings that sequester carbon. Companies need logistics overhauls to ensure their supply chains are as compliant as possible, and then some. Tan Copsey: “What's interesting is when they talk about Net Zero—particularly companies, but also a lot of governments—they talk about Net Zero by 2050. What is that, 28 years? 28 years is still a long time away, and if you're a government, the current president certainly won't be president in 2050. If you're a company CEO, you may not be CEO next quarter, let alone in 28 years, and so we have to have nearer-term targets. You want to be Net Zero by 2050? Tell me how you're gonna get there. Tell me what you're gonna do by 2030, tell me what you're gonna do by next quarter. One of the things that encourages me is things like change in financial regulation, which sounds arcane and slightly off-topic, but it's not. It's about what companies report when, and how investors hold those companies to account for nearer-term action, because that's how we get there.” One of the reasons that corporations do so little to minimize their carbon footprint is that they don't accurately measure their own carbon emissions. Using AI to track emissions can show problem areas, and what can be done to address those issues. Abhishek Gupta, machine learning engineer, founder of the Montreal AI Ethics Institute, and board member of Microsoft's CSE Responsible AI board, spoke to me about an initiative he's working on to help ease this burden by making it easier for developers to track the effect they're having on the environment by incorporating data collection into their existing workflow. Abhishek Gupta: “One of the projects that we're working on is to help developers assess the environmental impacts of the work that they do. Not to say that there aren't initiatives already, there are—the problem with a lot of these are, they ignore the developer's workflow. 
So the problem then is, if you're asking me to go to an external website and put in all of this information, chances are I might do it the first couple of times, but I start to drop the ball later on. But if you were to integrate this in a manner that is similar to ML Flow, now that's something that's a little more natural to the developer workflow; data science workflow. If you were to integrate the environmental impacts in a way that follows this precedent that's set by something like ML Flow, there is a lot higher of a possibility for people taking you up on that, and subsequently reporting those outcomes back to you, rather than me having to go to an external website, fill out a form, take that PDF report of whatever… that's just too much effort. So that's really what we're trying to do, is to make it easy for you to do the right thing.” And Abhishek isn't the only one who sees potential in AI. Dorothea Baur also spoke to me about her belief in AI, although she sees us using it for a different purpose. Dorothea Baur: “AI has huge potential to cause good, especially when it comes to environmental sustainability. For example, the whole problem of pattern recognition in machine learning, where if it's applied to humans, it is full of biases, and it kind of confuses correlation and causation, and it's violating privacy, etc. There are a lot of issues that you don't have when you use the same kind of technology in a natural science context, you know? Where you just observe patterns of oceans and clouds and whatever, or when you try to control the extinction of species. I mean, animals don't have a need for or a right to privacy, so why not use AI in contexts where it doesn't violate anyone's moral rights? And where you, at the same time, resolve a real problem.” Turning AI and algorithms away from people and towards nature is a wise decision in many respects. 
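The workflow-integration idea Gupta describes can be sketched in a few lines: instead of sending developers to an external form, an estimate is captured as a side effect of code they already run. This is a hypothetical illustration, not the actual tool his team is building; it uses an assumed power draw rather than real measurement, and it mimics the metric-logging pattern of tools like ML Flow without depending on them.

```python
import time

class EmissionsTracker:
    """Minimal sketch of workflow-integrated impact tracking.

    Wraps a unit of work and records an energy/CO2 estimate next to the
    run's other metrics, so reporting requires no extra steps from the
    developer. The power and grid figures are assumptions.
    """

    def __init__(self, assumed_watts=250.0, grid_kg_per_kwh=0.4):
        self.assumed_watts = assumed_watts
        self.grid_kg_per_kwh = grid_kg_per_kwh
        self.metrics = {}

    def __enter__(self):
        self._start = time.perf_counter()
        return self

    def __exit__(self, *exc):
        hours = (time.perf_counter() - self._start) / 3600.0
        kwh = self.assumed_watts * hours / 1000.0
        self.metrics["energy_kwh"] = kwh
        self.metrics["co2_kg"] = kwh * self.grid_kg_per_kwh
        return False  # never swallow exceptions from the wrapped work

# Usage: metrics appear as a side effect of the normal workflow.
with EmissionsTracker() as tracker:
    sum(i * i for i in range(100_000))  # stand-in for a training step

print(sorted(tracker.metrics))  # → ['co2_kg', 'energy_kwh']
```

The design choice is the one Gupta emphasizes: because the tracking rides along with code the developer was going to run anyway, the cost of "doing the right thing" drops to nearly zero.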
A lot of our efforts to curb the effects of climate change thus far have overlooked the same people that are overlooked in our data, and in almost every measurable respect, negative impacts of the climate crisis are felt most by marginalized populations and poorer communities. Tan Copsey: “I think that when it comes to climate tech, you need to think about who it's supposed to benefit. There's more than seven billion people on Earth, it can't just be for the US market, it has to be for everyone.” “The best futures for the most people” really comes into play here—communities of color are often more at risk from air pollution, due to decades of redlining forcing them into more dangerous areas. Seniors, people with disabilities, and people with chronic illnesses may have a harder time surviving extreme heat or quickly evacuating from natural disasters. Subsidized housing is often located in a flood plain, causing mold, and frequently lacks adequate insulation or air conditioning. People with low incomes may also be hard-pressed to afford insurance or be able to come back from an extreme loss after catastrophe strikes. Some Indigenous communities have already lost their homelands to rising sea levels and drought. Indigenous communities, meanwhile, often have traditional approaches—informed by millennia of historical experience—to living gently on the planet, and a mindset of cooperating with nature that is well worth learning from. Seeking leadership on climate issues from Indigenous people should be a priority. An article published by Mongabay on December 21, 2021 gives an example of an initiative in Mexico that is using the knowledge of Indigenous communities, and is working. Essentially, the Ejido Verde company grants interest-free loans to local communities to plant and tend pine trees for the tapping of resin, a multibillion-dollar global industry. Younger generations are eager to participate, and fewer people feel the need to migrate away from their homes. 
According to a paper by the Royal Botanic Gardens, Kew, the only way that recovery can work is if it is based on sound science, supported by fair governance, incentivized by long-term funding mechanisms, and guided by Indigenous knowledge and local communities. Speaking of long-term funding mechanisms, let's talk about another group of leaders who have the potential to make a drastic positive impact today: private investors. Activist investors may seem unwelcome, but when they're making priorities known on behalf of humanity, they're ultimately doing us all a service. These people can help shape company and government policy by letting their dollars speak for us, investing in the solutions and burgeoning industries that we drastically need. That's already happening, as when shareholders of both ExxonMobil and Chevron sent strong messages about getting serious about climate responsibility. In Europe, shareholder votes and a Dutch court ordered Royal Dutch Shell to cut its emissions faster than it had already been planning. Social and financial pressure is a good way to nudge executives in the right direction, especially leaders who don't make climate-friendly decisions out of fear of pushback from their boards and investors. Tan Copsey: "Investors increasingly should be thinking about the companies they invest in on the basis of their climate performance. And that isn't just, 'oh, they reduced some greenhouse gas emissions,' because, y'know, you look at a lot of tech companies and they have reduced greenhouse gas emissions, but really they have to do more than that. For businesses in other sectors, it may not be that simple.
Certainly there are harder-to-abate sectors, and so it could be that you are the CEO of a steel company, and your emissions are still gigantic, but the change you can make by introducing, say, hydrogen, and getting rid of coal, or introducing renewable energy plus hydrogen to the way in which you do steel, is transformative for the global economy and transformative for the climate system. And in a way, investing in that company is more climate-friendly than investing in a tech company; but chances are you have an ETF and you're doing both." Despite everything I've talked about today, it's important for all of us to remain optimistic. I asked Anne Therese Gennari why optimism is important, and her answer didn't disappoint. Anne Therese Gennari: "Optimism, for scientific reasons, is actually very important. If you look to neuroscience, we need optimism to believe something better is possible, and then find the motivation and the courage to take action right now to get us closer to that goal. And I think there is a huge difference between optimism and toxic positivity, and I think a lot of people who don't agree with optimism associate it with always trying to be happy, thinking good thoughts, and hoping things will turn out for the better. And that's why I love to come back to this understanding that 'awareness hurts, and that's okay.' Because when we tell ourselves that not everything is beautiful, and sometimes things will be painful, we can actually handle that, and we can take that. But from that place of awareness, we can start to grow a seed of hope and tell ourselves, 'well, what if? What if we did take action, and this happened? What if we can create a more beautiful world in the future?' And so, we can paint a picture that's all doomsday, or we can paint one that's beautiful. So which one do we want to start working towards?" And if you find yourself saying, "I really want to be optimistic, but it's too hard!
There's just so much bad news out there…" don't fret! You aren't alone. You might even say that's a very human response. Anne Therese Gennari: "We're human beings, and as a species, we respond to certain kinds of information in different ways. Information that's negative or fear-based has a very limiting response in our brains. When we hear something that's overwhelming, like climate change, and we know it's urgent, we might understand that it's urgent, but the action isn't there. Because how our brains respond to something that we don't want to happen is actually to not take action. And it goes back to way back in time, where, like, you're facing this dangerous animal, and you're like, 'there's no way I can fight this animal, I can't outrun it, so what am I gonna do? I'm gonna stand here super still and hope that it doesn't see me.' That's literally what our brains think about when something's that overwhelming. And so I think the more urgent the matter is, the more important it is that we actually fuel ourselves with an optimistic future or goal to work towards, because that is the only way that we can actually trigger action." So let's fuel our minds with an optimistic future to work towards. Despite all the bad news you've heard, even on this episode, there are a lot of hopeful developments happening! The most recent U.N. climate conference, COP26, established the Glasgow Climate Pact, which recognizes that the situation is at an emergency level and asks countries to accelerate their plans by calling for provable action by next year. Policy changes, government regulations, and people becoming motivated are all on the rise. Caleb Gardner, who was lead digital strategist for President Obama's political advocacy group OFA and is now a founding partner of 18 Coffees, a strategy firm working at the intersection of digital innovation, social change, and the future of work, spoke to me about what he's most optimistic about, which is right in line with this show's values.
Caleb Gardner: "I'm probably most optimistic about technology's ability to tackle global problems like climate change. I'm actually pretty bullish on technology's ability to solve and actually innovate around the reduction of carbon in our atmosphere, electric vehicles, the electric grid… and what's great is a lot of that's already being driven by the private sector around the world, so it's not as dependent on government as we think that it is." So let's talk about some of the emerging technologies that show a lot of promise in mitigating the effects of climate change, and that might make sense to invest in, if you have the means to do so. A team of UCLA scientists led by Aaswath Raman has developed a thin, mirror-like film that reflects heat to outer space through radiative cooling, and can lower the temperatures of objects it's applied to by more than 10 degrees. The idea comes from generations of knowledge among people living in desert climates, who learned to cool water by letting the heat radiate out of it overnight. If this film were added to paint and/or applied to pipes and refrigeration units, it could help cool buildings and make refrigeration systems more efficient, reducing the need for air conditioning, which accounts for as much as 70% of residential energy demand in the United States and the Middle East. One of the strongest selling points of innovations like this film is that it doesn't need electricity; it only needs a clear day to do its job. Another innovation in reflecting energy back into space comes in the form of 'cloud brightening,' a technique in which salt drops are sprayed into the sky so that clouds reflect more radiation, which could help refreeze the polar ice caps. Then there's the new trend of green roofs, in particular the California Academy of Sciences' Living Roof, which spans 2.5 acres and runs six inches deep, with an estimated 1.7 million plants, collecting 100 percent of stormwater runoff and offering insulation to the building below.
The whole endeavor is brilliantly hopeful and strategic. A massive green roof is completely on brand for a science museum, but that doesn't mean other buildings and businesses wouldn't benefit from them as well. The National Park Service even estimates that over a forty-year building lifespan, a green roof could save a typical structure about $200,000, nearly two-thirds of which would come from reduced energy costs. Other building technologies move beyond solar panels and green roofs, with automated building management systems detecting usage patterns of lighting, heating, and air conditioning. There have also been innovations in window insulation, trapping heat during the winter and blocking it out in the summer. 'Green cement' can be heated to lower temperatures and cuts emissions by a third compared to regular cement. There are new hydrogen-powered ships whose only emission is water. Electric planes have been developed for short-distance flights. Large floating solar power installations have the potential to generate terawatts of energy on a global scale, and when built near hydropower, can generate electricity even in the dark. Lithium batteries continue to get smaller and more efficient, and can be charged faster and more often than other batteries, making electric vehicles cheaper. And speaking of electric vehicles, they can help with our energy storage problems, with owners buying electricity at night to charge their cars and selling it to the grid when demand is high and cars sit unused during the day. Feeding cows seaweed and replacing beef with insects such as mealworms can drastically reduce methane emissions. Scientists in Argentina are working on backpacks for cows that collect their methane; a single cow's daily collection has been shown to fuel a refrigerator for 24 hours.
To help curb other types of emissions, carbon capture and storage technologies like NZT allow us to capture CO2 and store it in offshore sites several kilometres beneath the North Sea. But it's not just about new technologies, or technologies that only work for the richest people. Here's Tan again to elaborate on this idea. Tan Copsey: "This is a really tricky moment, y'know; this is a really bad time to be inefficiently using the resources we have. As we think about climate tech, think about optimizing mobility, as well as copying the existing model. There's a lot of existing tech out there that would make people's lives better, very simple irrigation systems, and so we shouldn't just think of this in terms of big new exciting things; we should think about it in terms of deploying existing things." All of this is part of embracing the mindset that says things can change. We need a can-do mindset, but we also need clarity and collaboration. Basically all options need to be implemented if we want to curb the damage that has already been done. Our solutions need to work in conjunction with one another and support the greatest number of people. To close out, here's Christopher Mims with the last word on putting away the doom and gloom and remaining optimistic in the face of overwhelming adversity. Christopher Mims: "If you really think about the whole sweep of human history, we live in a time where the pace of especially technological, and therefore in some ways cultural, change is so much faster than ever. We keep inventing new ways to kind of trip ourselves up, and then we have to just adapt so quickly to them. We're constantly playing catch-up with our own technological and social developments. So there's a lot of beating ourselves up over, like, 'woah, how come we didn't do it this way, or we didn't do this right?' or whatever. Sometimes I'm just like, ahh, just chill! We're going as fast as we can.
It's very easy to get caught up in the moment to moment, but I think there is this kind of overall arc where, if we don't cook ourselves to death, or blow ourselves up, or distract ourselves to death, we're moving in directions that, once we have fully understood how to live in harmony with the technology that we've created, we'll probably be okay.” Thanks for joining me on The Tech Humanist Show today. I hope you've learned something, and at the very least, that you're going into the future with more hope than you had before.

Wonks and War Rooms
Content Moderation with Andrew Strait (re-release)


Nov 17, 2021 · 33:21 · Transcription available


Former content moderator and current director of the Ada Lovelace Institute, Andrew Strait, and Elizabeth chat about what content moderation is, why it is always flawed, and how the way in which platforms are constructed impacts the flow of content. They talk about a bunch of related issues, including how to (and how not to) regulate tech companies in order to minimize harms.

Additional Resources
- Andrew recommended two great books that look at content moderation and content moderators: Behind the Screen by Sarah T. Roberts and Custodians of the Internet by Tarleton Gillespie.
- This interview with Sarah T. Roberts discusses the psychological impact of being a content moderator.
- After the interview, Andrew also mentioned the work of Daphne Keller and Robyn Caplan.
- Andrew brings up the landmark "right to be forgotten" case from 2014.
- The German regulation mentioned in this episode is NetzDG. Here is a primer written by academics Heidi Tworek and Paddy Leerssen in April 2019, just over a year after the regulation came into effect.
- In this episode Andrew mentions the idea of affordances. To learn more about this concept, make sure to come back for next week's episode, where we will explore technological affordances!

The Reboot from Tech for the People
Episode 3: Security is Political


Nov 15, 2021 · 43:56


Rae and Chris talk about what a 0-day vulnerability is and why you should care, the NSO Group's spyware being used to attack Palestinian human rights and civil society organizations, why tech companies don't take security more seriously, and a lot more.

- Rae on Twitter: @raraeraeza
- Chris on Twitter: @cmg
- Tech for the People: @techfortheppl on Twitter, Facebook, Instagram
- Rae's article, "Corporeal moderation: digital labour as affective good": if you have institutional access, you can find it at Wiley Online or through your academic institution's library: https://onlinelibrary.wiley.com/doi/epdf/10.1111/1469-8676.13106
- Elinor Carmi's book Media Distortions (Open Access!): https://media-distortions.net/
- Sarah T. Roberts' book on commercial content moderation, Behind the Screen: https://yalebooks.yale.edu/book/9780300235883/behind-screen
- Kim Zetter, Countdown to Zero Day: https://www.penguinrandomhouse.com/books/219931/countdown-to-zero-day-by-kim-zetter/
- Front Line Defenders, "Six Palestinian human rights defenders hacked with NSO Group's Pegasus Spyware": https://www.frontlinedefenders.org/en/statement-report/statement-targeting-palestinian-hrds-pegasus

Pb Living - A daily book review
A Book Review - Behind the Screen: Content Moderation in the Shadows of Social Media Book by Sarah T. Roberts


May 14, 2021 · 9:53


Social media on the internet can be a nightmarish place. A primary shield against hateful language, violent videos, and online cruelty uploaded by users is not an algorithm. It is people. Mostly invisible by design, more than 100,000 commercial content moderators evaluate posts on mainstream social media platforms: enforcing internal policies, training artificial intelligence systems, and actively screening and removing offensive material, sometimes thousands of items per day. Sarah T. Roberts, an award-winning social media scholar, offers the first extensive ethnographic study of the commercial content moderation industry. Based on interviews with workers from Silicon Valley to the Philippines, at boutique firms and at major social media companies, she contextualizes this hidden industry and examines the emotional toll it takes on its workers. This revealing investigation of the people "behind the screen" offers insights into not only the reality of our commercial internet but also the future of globalized labor in the digital age.

This episode is sponsored by Anchor: the easiest way to make a podcast. https://anchor.fm/app
Send in a voice message: https://anchor.fm/pbliving/message
Support this podcast: https://anchor.fm/pbliving/support

Are You a Robot?
S5E1: Content Moderation of Social Media // Sarah T. Roberts


Apr 12, 2021 · 64:00


In this episode of Are You A Robot?, Sarah T. Roberts joins us to discuss her book Behind the Screen: Content Moderation in the Shadows of Social Media, which was released in June 2019. Sarah is an associate professor of Information Studies at the UCLA School of Education and Information Studies, specializing in Internet culture, social media, and the intersection of media, technology, and society. You can follow Sarah on Twitter (@ubiquity75) or LinkedIn: https://bit.ly/3s1Het3

This episode is brought to you by EthicsGrade, an ESG ratings agency with a particular focus on technology governance, especially AI ethics. You can find more information about EthicsGrade here: https://www.ethicsgrade.io/ You can also follow EthicsGrade on Twitter (@EthicsGrade) and LinkedIn: https://bit.ly/2JCiQOg

Connect with us:
- Join our Slack channel for more conversation about the big ethics issues that arise from AI: https://bit.ly/3jVdNov
- Follow Are You A Robot? on Twitter, Instagram, and Facebook: @AreYouARobotPod
- Follow our LinkedIn page: https://bit.ly/3gqzbSw
- Subscribe to our newsletter: https://bit.ly/3r4qj9R
- Follow Demetrios on Twitter @Dpbrinkm and LinkedIn: https://bit.ly/2TPrA5w

Sarah's book and articles:
- Behind the Screen: Content Moderation in the Shadows of Social Media: https://amzn.to/3fYaFd2
- "Meet the people who scar themselves to clean up our social media networks": https://bit.ly/3wKAKlM
- "Digital detritus: 'Error' and the logic of opacity in social media content moderation": https://bit.ly/2OyLhzl

The Sunday Show
Social Media, Speech & Content Moderation at Scale


Mar 21, 2021 · 72:37


This episode features a discussion on the challenges of content moderation at scale with four great experts on the key issues: Tarleton Gillespie, a Principal Researcher at Microsoft Research New England and an Adjunct Associate Professor in the Department of Communication at Cornell University; Kate Klonick, Assistant Professor of Law at St. John's University Law School and an Affiliate Fellow at the Information Society Project at Yale Law School; Jameel Jaffer, the executive director of the Knight First Amendment Institute at Columbia University; and Sarah T. Roberts, Assistant Professor of Information Studies at UCLA. Tech Policy Press fellow Romi Geller and cofounder Bryan Jones discuss news of the day.

The Radical AI Podcast
All Tech is Human Series #8 - Improving Social Media: Content Moderation & Democracy with Sarah T. Roberts & Murtaza Shaikh


Jan 27, 2021 · 64:36


This conversation explores the topic of improving social media, content moderation, and democracy with invited panelists Sarah T. Roberts and Murtaza Shaikh. Sarah T. Roberts is the co-founder and co-director of the UCLA Center for Critical Internet Inquiry, and the author of Behind the Screen: Content Moderation in the Shadows of Social Media. Murtaza Shaikh is the Senior Advisor on Hate Speech, Social Media and Minorities to the UN Special Rapporteur on Minority Issues. The conversation is moderated by All Tech Is Human's David Ryan Polgar. The organizational partner for the event is TheBridge.

The conversation does not stop here! For each of the episodes in our series with All Tech Is Human, you can find a detailed "continue the conversation" page on our website, radicalai.org. For each episode we will include all of the action items we just debriefed, as well as annotated resources that were mentioned by the guest speakers during the livestream, ways to get involved, relevant podcast episodes, books, and other publications.

Response-ability.Tech
Making Tech Accountable: Reflecting on 2020. With Martha Dark


Dec 16, 2020 · 26:00


In this episode we catch up with Martha Dark. Martha is the co-founder of Foxglove, a new NGO that exists to make tech fair for everyone. Made up of lawyers, technology experts, and communications specialists, Foxglove believes that governments and big tech companies are misusing digital technology, and that this is harming the rest of us. Their aim is to fix this situation.

We interviewed Martha in Episode 2, ahead of her talk at the 2020 conference. What better way to end the year than by celebrating the ways in which Foxglove has held tech accountable in 2020. During our conversation, Martha shares Foxglove's successes:
- Calling for transparency around the U.K. Government's secretive deals to transfer the personal health information of millions of NHS users during the pandemic to private tech firms such as Amazon and Google, and to controversial AI firms Faculty and Palantir.
- Calling out Google for profiling children who are watching YouTube videos in order to deliver targeted ads to them.
- Calling out the U.K. Home Office's "racist" visa streaming algorithm, which reviewed applications against a "suspect nationalities" list.
- Fighting for fair treatment for Facebook's social media content moderators, and for safe working conditions during the pandemic.
- Challenging the U.K. Government's devastating use of algorithms to grade A-level exams.

Martha also shares with us what Foxglove will likely be working on in 2021. Lastly, Martha explains why Foxglove needs our help and how we can all support Foxglove's work in holding tech accountable. We loved talking to Martha and we hope you enjoy the episode. You can follow Foxglove on Twitter and find out more about them at foxglove.org.uk. Visit us online at anthtechconf.co.uk and sign up for our newsletter, or follow us on LinkedIn and Twitter.

Mentioned in our conversation:
- Behind the Screen: Content Moderation in the Shadows of Social Media by Sarah T. Roberts.
- Ghost Work: How to Stop Silicon Valley From Building a New Global Underclass by Mary L. Gray and Siddharth Suri.
- Open letter to Facebook's leaders signed by over 200 Facebook content moderators from across the world.
- "Facebook moderators forced to work in Dublin office despite high-tier lockdown," 23 October 2020.
- "Leo Varadkar to press Facebook on working conditions for its content moderators," 15 November 2020.

Wonks and War Rooms
Content Moderation with Andrew Strait


Dec 2, 2020 · 32:53 · Transcription available


Former content moderator Andrew Strait and Elizabeth chat about what content moderation is, why it is always flawed, and how the way platforms are constructed impacts the flow of content. They talk about a bunch of related issues, including how to (and how not to) regulate tech companies in order to minimize harms.

Additional Resources
- Andrew recommended two great books that look at content moderation and content moderators: Behind the Screen by Sarah T. Roberts and Custodians of the Internet by Tarleton Gillespie.
- After the interview, Andrew also mentioned the work of Daphne Keller and Robyn Caplan.
- The German regulation mentioned in this episode is NetzDG. Here is a primer written by academics Heidi Tworek and Paddy Leerssen in April 2019, just over a year after the regulation came into effect.
- Andrew quickly mentioned "safe harbor" (in the US you might hear "Section 230"). Here is a brief explainer from Reuters.

Chronique des médias
Chronique des médias - Platform moderation in the crosshairs


Oct 30, 2020 · 2:45


Moderation on social networks and online platforms is increasingly questioned and criticized by citizens and political leaders. The tweet was taken down and the account itself deleted, but the former Malaysian prime minister's message had time to travel around the world on Twitter. "Muslims," he said, "have a right to be angry and to kill millions of French people for the massacres of the past." Now 95, Mahathir Mohamad was twice prime minister of Malaysia, over a period of 24 years, most recently until February of this year. He was therefore followed by 1.3 million people on the social network. It took the intervention of Cédric O, the French secretary of state for digital affairs, for Twitter to delete his account, even though his genocidal message had until then been left up with only a warning. This illustrates just how much difficulty internet platforms have in regulating incendiary messages inciting hatred. An American researcher, Sarah T. Roberts, has just published in French Derrière les écrans (Behind the Screen), a book in which she explains that Facebook, Twitter, and Instagram sell us the vision of automated moderation based on keywords and machine learning. In reality, she says, it is "a sweet dream" sold to the public "to reduce the pressure around moderation."

15,000 moderators at Facebook
To remove the most violent messages while accounting for context, humor, or possible irony, a human eye is still needed. At Facebook, an army of 15,000 moderators must endure seeing the worst under grueling conditions. They have to both censor what crosses the bounds of the acceptable and let through what may shock but falls under freedom of expression, and which helps capture users' available attention.
Sometimes, as seen during the American election campaign, moderation becomes a genuine editorial censor. The heads of Twitter and Facebook were summoned by Republican senators for having blocked a New York Post article reporting on alleged emails from Hunter Biden, the Democratic candidate's son, about his business dealings in Ukraine. The article's visibility had been reduced because the two platforms did not believe the emails were genuine. What is certain is that the web giants can no longer shirk their responsibility. The European Commission, which is preparing a law on digital services for December, has already made it known that platforms will have to cooperate with regulators, provide information about their algorithms, and tell users how their content recommendation systems and their moderation work.

The Tech Humanist Show
The Tech Humanist Show: Episode 12 – Dr. Sarah T. Roberts


Oct 9, 2020 · 55:34


About this episode's guest: Sarah T. Roberts is an Assistant Professor in the Department of Information Studies, Graduate School of Education & Information Studies, at UCLA. She holds a Ph.D. from the iSchool at the University of Illinois at Urbana-Champaign. Prior to joining UCLA in 2016, she was an Assistant Professor in the Faculty of Information and Media Studies at Western University in London, Ontario for three years. On the internet since 1993, she was previously an information technology professional for 15 years, and, as such, her research interests focus on information work and workers and on the social, economic and political impact of the widespread adoption of the internet in everyday life. Since 2010, the main focus of her research has been to uncover the ecosystem – made up of people, practices and politics – of content moderation of major social media platforms, news media companies, and corporate brands. She served as consultant to and is featured in the award-winning documentary The Cleaners, which debuted at Sundance 2018 and aired on PBS in the United States in November 2018. Roberts is frequently consulted by the press and others on issues related to commercial content moderation and to social media, society and culture, in general. She has been interviewed on these topics in print, on radio and on television worldwide including: The New York Times, Associated Press, NPR, Le Monde, The Atlantic, The Economist, BBC Nightly News, the CBC, The Los Angeles Times, Rolling Stone, Wired, The Washington Post, Australian Broadcasting Corporation, SPIEGEL Online, and CNN, among many others. She is a 2018 Carnegie Fellow and a 2018 recipient of the EFF Barlow Pioneer Award for her groundbreaking research on content moderation of social media. She tweets as @ubiquity75. This episode streamed live on Thursday, October 1, 2020. 
Here's an archive of the show on YouTube: About the show: The Tech Humanist Show is a multi-media-format program exploring how data and technology shape the human experience. Hosted by Kate O'Neill. Subscribe to The Tech Humanist Show hosted by Kate O'Neill channel on YouTube for updates. Transcript 01:43all right01:44hey humans01:48how we doing out there come on in start01:50gathering around the uh the old digital01:52campfire01:54let me hear from those of you who are in01:55line uh right now tell me01:57tell me who's out there and tell me01:59where you're tuning in from02:01i hope you're starting to get your02:02questions and thoughts ready02:04for our guest i'm sure many of you have02:06already seen who our guest is and i'll02:07be reading her bio here in just a moment02:09so start thinking of your questions02:11about commercial content moderation and02:13what you want to02:14know about that and you know all that02:17kind of stuff02:18uh i hear sarah laughing in the02:19background it's not to laugh02:22really good valid questions i think i02:25was just snorting02:26honestly through my uh through my sinus02:29trouble02:30so uh welcome to those of you who are02:32all tuned in welcome to the tech02:34humanist show this is a multimedia02:36format program02:37exploring how data and technology shape02:39the human experience02:41and i am your host kate o'neil so i hope02:44you'll subscribe and follow wherever02:45you're catching this02:46so that you won't miss any new episodes02:49i02:50am going to introduce our guest here in02:51just a moment uh one one last shout out02:53if anybody's out there wanting to say hi02:56feel free02:56you are welcome to comment and i see a02:59bunch of you03:00online so feel free to tune uh03:03comment in and tell me who you are and03:05where you're tuning in from03:07but just get those you know type in03:08fingers warmed up because we're gonna03:10want you to03:10to weigh in with some questions and03:12comments as the show goes on03:14but 
now i'll go ahead and introduce our03:17esteemed guest so today we have the03:19very great privilege of talking with03:21sarah t roberts who03:22is an assistant professor in the03:24department of information studies03:26graduate school of education and03:28information studies at ucla03:30she holds a phd from the ischool at the03:32university of illinois urbana-champaign03:34my sister's school i went to university03:36of illinois chicago03:38prior to joining ucla in 2016 she was an03:40assistant professor03:42in the faculty of information and media03:44studies at western university in london03:46ontario for three years03:47on the internet since 1993 she was03:50previously an information technology03:52professional for 15 years and as such03:54her research interests focus on03:56information work and workers and on the03:58social03:59economic and political impact of the04:01widespread adoption of the internet in04:02everyday life right totally04:06so since 2010 the main focus of her04:08research has been to uncover the04:10ecosystem04:11made up of people practices and politics04:14of content moderation of major social04:16media platforms04:17news media companies and corporate04:19brands04:20she served as consultant tune is04:21featured in the award-winning04:22documentary04:23the cleaners which debuted at sundance04:26201804:27and aired on pbs in the united states in04:29november04:30 so roberts is frequently consulted04:33by the press and others on issues04:34related to commercial content moderation04:36and to social media society and culture04:38in general04:39she's been interviewed on these topics04:41in print on radio04:42on television worldwide and now on the04:44tech humanist show04:45uh including the new york times04:47associated press npr04:48le monde the atlantic i mean this list04:50is going to go on and on so04:52buckle in folks the economist bbc04:55rolling stone wired and picking and04:57choosing now it's a really really04:59impressive list of 
media.

She's a 2018 Carnegie Fellow and a 2018 recipient of the EFF Barlow Pioneer Award for her groundbreaking research on content moderation of social media. So, audience, again, please start getting your questions ready for our outstanding guest. Please do note, as a live show, I'll do my best to vet comments and questions in real time. We may not get to all of them, but I very much appreciate you being here, tuned in and participating in the show. So with that, please welcome our dear guest, Sarah T. Roberts. And you are live on the show, Sarah. Thank you so much for being here.

Sarah: Thank you. Thanks for the invitation, and thanks to your audience and all those interested folks who are spending time with us today. I'm really grateful for the opportunity.

Kate: We've already got David Polgar saying "excited for today's talk." Hey, our buddy Dave, DRP! [Laughter] All right, so I want to talk right away about your book, Behind the Screen. I hadn't had a chance to read it until I was preparing for the show, and it was wonderful to get a chance to dig into your research. So tell us a little bit about that. It came out last year, is that right?

Sarah: Yeah, just a little over a year ago, from Yale University Press. You know, the academic publishing cycle is its own beast, its own world; as it relates to journalism and mainstream press timelines, it's much slower. That said, I wrote the book in about a year, which is about a normal cycle, but it took about eight years to put together the research that went into the book. And this is because I started my research in 2010. And we say 2010, it seems like yesterday, but that was a decade ago now, you know, if we're in interminable 2020,
which is a million years long so far. But back in 2010, when I started looking into this topic as a doctoral researcher at the University of Illinois, there were a lot of things stacked against that endeavor, including the fact that I was a doctoral student at the University of Illinois: I had no cachet, and I had very few material resources to finance a study that, at the end of the day, required going around the world quite literally. But maybe the biggest barrier at the time was the fact that I was still fighting an uphill battle trying to tell people that major mainstream social media platforms were engaged in a practice that is now, weirdly, a phrase that you might say around the dinner table and everyone would get, which is content moderation. And further, when I would raise the issue and bring up the fact that firms were engaged in this practice, which has to do with the adjudication of people's self-expression online, and sits somewhere between users and the platform, and then the platform's recirculation of users' material, people would argue with me at that point about whether that practice would even go on. And then when I would offer incontrovertible proof that in fact it did go on, we would find ourselves in a debate about whether it was a legion of human beings who were undertaking this work, or in fact it was computational. Now, in 2020 the landscape is complicated, but in 2010 the technology and the sort of widespread adoption of computational, automated, let's say algorithmic kinds of content moderation, or machine-learning-informed content moderation, was not a thing. It was
humans. And so I had to start the conversation so far below baseline that it took quite a lot of effort just to get everybody on the same page to discuss it. And when I'm talking about engaging in these conversations, I mean just trying to vet this as an appropriate research topic at the graduate school, you know what I mean? Like, to get faculty members, many of whom were world experts in various aspects of the internet, or of media, or of information systems themselves... it was new to them too.

Kate: Did you originally frame it as a question of how is this done, or what was the original framework of that question?

Sarah: Yeah, so I'll tell you a little bit about the origin of why I got interested, and it's something that I write about in the book, because I think it's so important to acknowledge those antecedents. I was actually teaching down at the University of Illinois in the summer of 2010, and I was on a break from teaching, probably drinking a latte, which is what I'm doing right now, and reading the paper. I was reading The New York Times, and there was a very small but compelling article about a group of workers. There were a couple of sites they mentioned, but there was in particular a group of workers in rural Iowa. Well, here I was, sitting in rural central Illinois, thinking about this group of workers in rural Iowa, as profiled in this piece, who were in fact engaging in what we now know as commercial content moderation. They were working in effectively a call center, adjudicating content for unnamed media sites, websites, and social media properties. And I circulated that article around; I shared it
with friends, I shared it with my colleagues, and I shared it with professors, and the argument that I made was multifaceted. First of all, it sounded like a miserable job, and guess what, that has been borne out: it is a very difficult and largely unpleasant job. So I was captivated by the fact that there were these unnamed people, who a generation or two ago would have been on a family farm, who were now in the quote-unquote information economy but seemed to be doing a drag, just awful work. But also there was this bigger issue of really having this big reveal of the actual ecosystem, a heretofore unknown portion of the social media ecosystem, effectively letting us know how the sausage was being made, right? And yet if you were to look at any of the social media platforms themselves, or any of the discourse at really high levels in industry or in regulatory bodies, this was a non-starter. But I was arguing at the time that how content was being adjudicated on the platforms, under what circumstances, under what conditions, and under what policies, was in fact maybe the only thing that mattered at the end of the day. Now, in 2010 that was a little bit of a harder case to make; by 2016, not so much, after we saw the ascent of Donald Trump in the United States, we saw Brexit, we saw the rise of Bolsonaro in Brazil, largely attributed to social media campaigns there and continued, sustained support through those channels. And here we are in 2020, where we might argue, or we might claim, that misinformation and disinformation online is one of the primary concerns of civil society today. And I would put front and center
in all of those discussions the fact that social media companies have this incredible, immense power to decide what stays up and what doesn't. And how they do it, and who they engage to do it, should actually be part of the conversation; if not, I would argue that it's a very incomplete conversation. So when I talk about the scholarly publishing cycle: it took a year to put the book out, right, but it took eight years to amass the evidence; to do the interviews and media that you mentioned; to converse with industry people at the top levels eventually, but, you know, starting at the bottom with the workers themselves; to find workers who were willing to talk to me and break those non-disclosure agreements that they were under; and also to create a locus of activity for other researchers and scholars and activists who are also interested in uncovering this area, and really co-create a field of study. So that's what took eight years. It took a year to get the book out, but all that legwork of proving, in a way, that this mattered took a lot longer. I don't have to make that same case anymore, as I'm sure you can imagine. People are interested, they're concerned, and they want to know more. They're demanding a lot more from firms as users, you know, as people who are now engaged in social media in some aspect of their lives every day. Need I say more about Zooming constantly, which is now our primary medium of connection for so many of us, in our work lives even?

Kate: Yeah. Hey, we already have a question from our buddy DRP, David Ryan Polgar. Let me put this against the background so we can actually see it here. He says: "Sarah, would love to hear your thoughts on Section 230 and how any potential
changes would impact content moderation." So we're going right in, right deep.

Sarah: Yeah, really! So let me try to flesh that out a little bit for others who aren't inside quite as deep. Section 230 is a part of the Communications Decency Act, which goes back to 1996, but effectively what anyone needs to know about Section 230 is that it's sort of the legal framework that informs social media companies' rights and responsibilities around content. When we think about legacy media, so-called broadcast television, for example, or other forms of media that we consume, I always bring up the example of George Carlin, who famously made a career out of the seven dirty words that you couldn't say on radio, right? So there are all kinds of governing legal and other kinds of norms about what is allowed and disallowed in some of these legacy media. When it comes to social media, however, there is a pretty drastically contrasted permissiveness in place, one that cedes the power of the decision-making around what is allowable and what is not allowable to the platforms themselves. So this is a really different kind of paradigm, right? And it's Section 230 that allows that. That's the precedent, that's the guidance, legally, that provides that kind of both responsibility and discretion. And what it does is allow the companies to make their own decisions, effectively, about what policies they will follow internally. Now, this doesn't go for every single piece of content. One of the biggest examples that this does not cover is child sexual exploitation material, which is just illegal, full stop. It doesn't matter if platforms wanted
to traffic in that material or not; it's illegal. But beyond that, to a certain extent, what Section 230 allows is for platforms to redistribute, effectively, material that other people submit, without being held liable for that material. And if we think about that, that's actually the business model of social media: to get other people to create content, upload it, circulate it, engage with it, download it. And effectively the platforms have argued and claimed that they are really, you know, "don't kill the messenger," right? Like they're just the apparatus by which this material gets shared. I think that at one time that really made sense, particularly when the Communications Decency Act was passed, and this goes back to the mid-'90s, when what was imagined as needing this reprieve from liability was an ISP, an internet service provider, which at that time... I guess the most imaginative version of that you could think of would be America Online, for those of you who remember that.

Kate: Shout out to the AOL days!

Sarah: Yeah, right, AOL, with all the discs and CD-ROMs you got and used as coasters. But back in that time, an internet service provider really was a pass-through. In some cases, you know, I knew a guy who ran an ISP locally; he really just had a room with a huge internet pipe coming in and a wall of modems, and you would dial up through your modem, connect through, and then be on the internet to some other service. So that was the model then. But the model now is multi-billion-dollar transnational corporations who have immense power in decision-making around content, and
yet are, in the American context at least, largely not liable for those decisions, legally or otherwise, while making incredibly powerful decisions about what kind of material we all see and engage in, and what is permissible and what is not online. And they do that at their discretion. Well, if they're doing that at their discretion, do you think that they're largely going to fall into a mode of altruism and what's best for civil society, or are they going to look at their bottom line and their shareholder demands and respond to that? I mean, frankly, publicly traded companies have a legal mandate to respond to their shareholders and to generate revenue for them. So when those things are aligned, when what's good for, you know, America is good for Facebook's internal policies around content moderation, that works out great. But if ever those two pathways should diverge, we know which one they're going to fall under. And there's just very little legal consequence or legal expectation for reporting out on how these decisions get made. The way that we have seen more decisions getting publicly unveiled, through things like the publication of what had been previously closely held secret internal policies, is through public pressure: through the pressure of civil society groups and advocacy groups, through the pressure of the public, through the constant threat of things like reform to Section 230 or other kinds of regulation. So it's a very interesting moment. And it's interesting to bring up Section 230, because a couple of years ago I had colleagues in legal studies, who
are, you know, law professors, essentially tell me that 230 would soon be rendered moot anyway, because it should be solely relevant in the United States, right, in the jurisdiction of the United States, and so because these platforms were going worldwide, it would be rendered moot. Well, I would say it's actually been the opposite: what is happening is that Section 230 is getting bundled up as the norm and is now being promulgated, either just through the process of these platforms going global while keeping their Americanness, and keeping their business practices largely responsible to American laws first and foremost, but also even to the point that it recently has become known, I think, more and more to people like me, who aren't legal scholars but who have a great interest in how this stuff goes down, that Section 230-like language is being bundled up and put into trade agreements at the nation-state level, or region level, with the United States and trading partners. And we know that these trade agreements, which have been hugely politically problematic and were a major issue, in fact, of the 2016 election, are anti-democratic; I mean, how do you even know what's in a trade agreement? They're totally secret. But I learned, while watching a House subcommittee convening about Section 230, from a highly placed Google executive that in fact their lobbyists are pushing for this kind of language in these trade agreements. So we see that instead of 230 becoming less relevant because of the globalization of American social media platforms,
it's actually becoming a norm. First of all, it was sort of softly reproduced just because of the spread of these American platforms and how they were doing business, but now it's actually becoming codified through other means, means like trade agreements that the public has really no mechanism to intervene upon. And I think that's really worrisome.

Kate: What about those mechanisms where the... sorry, what were you going to say?

Sarah: No, I was just going to say that's one of my short and concise professorial answers. Let me drink a coffee.

Kate: Well, David thanks you for that great historical overview, and I'm sure the rest of our viewers and listeners do too. I wonder about the examples that don't have that kind of consumer involvement. So I'm wondering about, for example, YouTube and its kids content. There have been a lot of changes, it seems, with regard to that platform and that subject over the last few years. Can you maybe give us an overview of how that has gone down?

Sarah: Well, I think that YouTube is such an interesting example to talk about, for many reasons: for its reach and pervasiveness (it's a market leader for sure), its globality. I would also say that YouTube is particularly interesting because when we think about social media content as being monetized, there is no greater and more direct example than YouTube, which actually pays people who are really highly successful on the platform for content, right? So there's no kind of metaphor there about monetization; it is literally monetized, right? And this, just to tie this back to the Section 230 conversation: when we imagined ISPs as just pass-throughs, you know,
that was one thing. But here we have these huge companies like YouTube and others involved actively in production. So that kind of firewall between just being an intermediary and actually being actively engaged in producing media has gone, but there's a legacy legal environment that still informs it. So YouTube, you know, they pay producers; they have these pretty extraordinary studios in major cities around the world, including LA, where I live. They are kind of the go-to outlet, and people want to participate in YouTube for all sorts of reasons, but there's certainly a dollar-sign reason that people get involved. And you bring up this issue of kids content. Again, here's where we see sort of the softening and the eroding of regulation too. It's not just YouTube, I have to confess; it's not just social media companies that have eroded child protections around media. That goes back to 40 years ago, in the Reagan administration, when there used to be very stringent rules around Saturday morning cartoons, for example, and the advertising to children that could go on during that time. Shout out to my colleague Molly Niesen, who has worked extensively on that particular topic and that erosion. So I see on YouTube, again, a lot of the pressure to kind of reform. And I think when you're talking about kids content, you're talking about some really disturbing and weird content that was showing up: cheaply made, unknown, weird, creepy, sometimes not really clearly necessarily benevolently made, sometimes with creepy sexual undertones, other kinds of stuff going on, and
really no way to know. That's part of the problem, no way to know, right? And then the massive problem of trying to moderate that material, right? I think of it as like the classic story of the hole springing through the dike holding the water back: you plug one hole, another one springs open.

Kate: Until the whole wall falls down and then you're inundated.

Sarah: That's right, that's right. And so that is a good metaphor to think about the problem of these kind of isolated hot spots that explode on platforms, as a new social issue, or maybe a new geopolitical conflict, erupts somewhere in the world and gets meted out and replicated on social media, and attention gets drawn to it. And so I think this issue of child content and its kind of exploitative nature, and strange nature in some cases, was something that advocacy groups and others brought attention to, and the platform had to reconfigure and focus on it. Now, I mentioned earlier that back in 2010 it really was humans who were doing this work, almost exclusively, but by 2020 we are using computational tools to try to deal with content as well. Although I'll repeat the quote that I once heard from a reporter, who heard it from an engineer at a company that shall not be named, but it might sound like, you know, "Boo-Boob," let's say, might rhyme with that. And the quote was: whatever the algorithm is doing, it's not watching the video. So they're using these computational mechanisms to do all kinds of other stuff, but it's not like an algorithm can watch and make sense out of a video; it has to look at other stuff.

Kate: So that's an interesting point, and I want to follow up on that with a question
about, you know: do you personally advocate for more AI in the mix of content moderation? For example, Facebook recently announced that they were using AI to simulate bad actors so that they could train their automated moderation systems to more effectively recognize them. Do you think that ultimately will work and will benefit the humans who are part of this ecosystem, or is it likely to produce unintended ill effects?

Sarah: I mean, that's a really great question, because that's sort of the $64,000 question about my work. One would think, if my concern is the welfare of workers, which has always been my cut-in on this topic, and where I start and where I come back to in the end, then hey, wouldn't it be great if tomorrow we could just flip that switch and go to those purely computational means? In theory, right, in theory. But I think there are a lot of red flags there. One red flag is that it's been this difficult, and I kind of laid the groundwork for that at the front end of the show, to unpack and uncover the ecosystem involving humans. And I have to say, the majority of my work has been reliant upon the willingness of human beings involved in the system to leak, essentially, to break their non-disclosure agreements and essentially snitch on what they felt was problematic, and also sometimes on what they felt was good about the work they did. How do you get an algorithm or a machine-learning-based tool to call a journalist, or do an interview with a researcher? I don't know how to do that. The closest thing we could come to is getting access to it and looking at code, but that's not easy to do, and it's much harder to
do than finding people willing to talk to me, and I cannot stress the difficulty of what it was like to find them in the early days. So you can't do that with AI. How do we audit those tools? What's the check on the power that the firms have with those tools, in terms of how they're set up and what they keep in and what they keep out?

Kate: It also sounds like a potentially even greater violation of that non-disclosure if someone leaks a bit of code rather than just telling their own personal story.

Sarah: Oh, for sure. And, you know, the other thing that comes to mind for me is the nature of how these tools work. A great worry, and I think a legitimate worry, of many people in the space is that the tendency in using those tools would be to calibrate them to be even less permissive, let's say. Or, because of their nature, they would have less of an ability to look at a given piece of content and see that it violates ABC policy but understand it in the context of, again, a cultural expression, or an advocacy piece around a conflict zone, and then make an exception. So what we would see is more conservative decisions and greater false positives around material that quote-unquote is disallowed, right? Again, all of this adjudicated to the logic that the firms themselves create, which for many years was itself opaque. So it's not as easy to say, unfortunately, if we could just get those darn algorithms right, if we could just get machine learning sophisticated enough, we could take out the human element and basically save people from having to do this work. Unfortunately, I think it's
more33:24complicated than that and i would say33:26that33:26you know bringing up the idea of33:29training machine learning tools as you33:30did33:31one of the gross ironies of this whole33:33thing that i've been33:34monitoring is that uh33:38content moderation commercial content33:40moderation for these major platforms33:42is its own kind of self-fulfilling uh33:46industry that begets uh sub industries33:49in and of itself33:49so that when machine learning tools have33:52come on what needs to happen33:54is that people need to sort data sets to33:56create data sets for the machine33:58learning tools to train on33:59and they need to be themselves trainers34:02and classifiers for the machine learning34:04tools so now we have a whole new stratum34:06of people34:07working to train machine learning34:09algorithms which has them essentially34:11doing a certain kind of content34:12moderation34:13it's a lot easier that cottage industry34:14of evil ai34:16spawn it's like anything like34:19how are we gonna make the ai bad enough34:21to train our ai34:23uh automation systems to recognize that34:25so that we can keep a good environment34:27but then you've got this whole cottage34:29industry around the bad34:30ai seems like a very awkward way of34:32going34:33so you know as someone who monitors like34:36like hiring trends and things like that34:37too34:38i was i was watching companies looking34:41for people to to come be34:42classifiers on data sets which is just34:44moderation before the fact right34:46yeah you know you talked about that in34:48the book too you have34:50you presented a taxonomy of sorts of34:52labor arrangements from34:53in-house moderators to what you call34:56micro labor you know looking at34:58mechanical turk and things like that can34:59you walk us through that a little bit so35:01that we can become familiar with what35:02the35:02the human issues relative to each level35:06yeah one of the one of the early35:07insights i had when i was trying to35:09figure 
out the contours of this industry from the outside (and it reminds me of that parable of people feeling different parts of the elephant without really being able to see it, so they don't really get the big picture) was that what I was considering as being kind of a monolithic practice really wasn't. It was happening in all kinds of different places and in different guises, including under different names; there was no cohesive name to call this work practice. So I started out knowing about these workers in Iowa that I reference in the book, and referenced today, who were working in a call center, and it turned out that call centers were really a prevalent way that this work was going: kind of at somewhat of a remove, geographically and organizationally, so it'd be a third-party, contracted-out group of workers somewhere in the world. When I started out, I knew about the workers in places like Iowa, Florida, etc., but I soon came to know about workers in places like India, or in Malaysia, or, of course, key to the book, in the Philippines. So that call center environment for content moderation work is really prevalent, and it's global. But there are also workers who, prior to COVID, were going every day, for example in the Bay Area, down from San Francisco on the company buses, and going on site to companies that I describe in the book, one of which has the pseudonym of MegaTech and is a stand-in for any number of companies. In fact, I'll just tell you a little anecdote: I've met a lot of people from industry who, over cocktails after meetings, will come up to me, all from different companies, and say, "We're MegaTech,
aren't we?" And it's like, you know, at least six different corporations think they're MegaTech.

Kate: Yes, that sounds right.

Sarah: Yeah, that tells you something. So these people were on-site workers; they were in the belly of the beast, essentially. They were working in places where there was also engineering, product development, marketing, communications, you know, soup to nuts. Although, interestingly enough, they were also contractors, in the case of the book, so they still had this differential and lesser status even though they were going on site to the corporate HQ. It still wasn't quite the right badge color, as they described it to me, although they thought about the people who were working as contractors in call centers as another kind of worker, even though they were essentially very, very similar. Then we had people I encountered who were very entrepreneurial and, especially in sort of the early days, were developing a model that looks almost like an ad agency. They were independent companies that were starting to specialize in providing content moderation services to other companies, and it was a boutique kind of service, a specialty service, and they would often offer social media management across the board. So not only were they offering the removal of content in some cases, but they would even offer, again in that advertising model, the generation of content. Because, believe it or not, sometimes your auto parts company's Facebook page just doesn't generate a lot of organic interest, and so you hire a company to come post about how awesome your auto parts company is. Likewise, as somebody once told me, and it's in the book too, if you open a hole on the internet, it
gets38:41filled with38:43bleep with uh you know if you have38:46a web page or you have a facebook page38:48and there's no activity38:49that's like organic or really about what38:51it's supposed to be about i guarantee38:52you that somebody will be posting38:54invective racist comments and so on38:56these boutique firms said38:58to usually to smaller companies hey39:00we'll manage the whole thing we'll39:01delete that stuff39:02we'll generate new stuff for you it'll39:04look organic nobody will really know39:06that that's what we're doing39:07and they were having great success when39:09i talked to them was that generally39:11filed under this sort of banner of user39:12generated content39:14or was it called other things generally39:16um39:17you know it was kind of like a social39:19media management is how they would call39:21couch that and how they would pitch it39:25and uh you know it was like uh hey39:28company x you your business has nothing39:31really to do with social media that's39:33not39:33you know your primary business let us39:35handle it for you39:36and a lot of companies jumped at the39:38chance to kind of outsource that and not39:40deal with it39:41an interesting thing in that kind of39:43bucket of39:44of the taxonomy that you mentioned is39:46that those companies39:48uh in some cases got bought up by39:52ad firms or ad firms have started doing39:54this service as well39:56or they become really really big and39:58successful so there's like a few that40:00kind of40:01uh uh rose to the top and have survived40:05and then you already mentioned this40:07really interesting and and kind of40:09worry some arena where this work goes on40:12which is in the micro labor realm40:14the amazon mechanical turk model40:17uh which is effectively you know digital40:19piece work it's people40:21adjudicating a bit of content here40:23they're often40:25paid a per view or per decision40:28uh and then they try to aggregate enough40:30to make that make sense for 
them40:31financially40:33and it it turns out although that's40:36supposed to be an anonymous relationship40:38you know savvy mechanical turkers they40:40can figure out who they're working for40:42because a lot of times40:43you know they'd receive a set of of40:46images or other content to adjudicate40:48and like you know the interface was40:50obvious41:00[Music]41:02before and you get those guidelines41:04again then you know yeah41:06that's right so you know i i came to41:09know some folks who were41:10uh you know who themselves sort of began41:13to specialize within41:14mechanical turk and other platforms on41:17this kind of thing and they would seek41:18out this work because they got good at41:20it like you said41:21and they got good at knowing the41:22internal policies and juggling them for41:24all these different firms and41:26began to specialize in this work on that41:28platform41:29i was wondering you know when thinking41:31about this as you mentioned earlier41:33about the41:34the consequences of misinformation41:36especially as we41:37are deep in the process of the us41:40presidential election cycle and41:42i say the u.s because i want to be41:43sensitive to the fact that there are41:44global viewers but i feel like everyone41:46in the world is kind of41:48you know hooked into the u.s41:49presidential election right now41:51and we're all like yeah aren't they41:53right and we're all being subject to41:55you know all of this uh well the the41:58dumpster fire of it all but also the42:00misinformation that accompanies it42:02and so i wonder how should people think42:04and understand the difference between42:07content on social media and content in42:09news media42:10and what are some of the differences in42:12approaches to moderating42:14harmful content and you know kind of42:16just thinking about42:18the access to you know free access to42:21information you know this is kind of a42:23big42:24muddy question i'm not sure i'm42:26articulating very well 
but42:27hopefully you see the direction of of42:29the um42:30the question that i'm asking her yeah i42:34i'll i'll do my best to respond and we42:36can42:36you know we can you can offer guidance42:40yeah as i go i mean i i think your42:43question in essence is what the hell42:45right yeah42:48information misinformation42:50disinformation the election42:52what the hell and so i think you speak42:54for a global audience when you pose that42:56question and42:58you're right about the u.s election i43:00know uh friends and colleagues who were43:02up early in australia watching it and43:04you know as mortified as we were by the43:06the behavior on display43:08and the other night yes the debate and43:11the kind of the nadir43:12of uh you know american politics in my43:15lifetime is how i described it43:17um you know i i often43:20bring up the the rise of social media43:24as a force in again in american civic43:27life43:29that it's important to not think about43:31it having happened in a vacuum or having43:33happened43:34uh without without43:37um other forces at play and in the other43:40part of my life i43:42am a professor in a program that trains43:44and prepares43:45people for careers and information43:47professions primarily in librarianship43:50and so i know something about the way43:53in which we've seen a gross43:57erosion of the american44:00public sphere and opportunities for44:03people to become informed44:06in places that traditionally have been44:10more transparent more committed to the44:13public good44:13not-for-profit i'm thinking about44:16institutions like public schools44:18and institutions like public libraries44:21so if we were to take44:24you know uh funding a funding graph or44:28something like that and put them44:29together about expenditures or44:31where where money goes in our society we44:34would see44:35you know that off the cliff kind of44:37defunding44:38of of these uh institutions that i just44:41mentioned44:42while we see a rise in social 
media44:46and what i think that suggests at least44:49to me is that44:50it's not that the american public44:51doesn't have a desire to be informed44:54or to have information sources and i44:56would add to that by the way44:57it's not necessarily in the public44:59sphere in the same way45:00but we have seen total erosion in45:04regional and local journalism too right45:06during the same time right45:08into mega media that's right mega media45:11which45:12you know came about by the shuttering of45:14local news45:15and it there was a time when you know45:17cities like mine i come from madison45:19wisconsin 25045:21000 people yeah they yeah they might45:24have had a a45:25a reporter in dc you know what i mean45:28for our local paper the capitol times45:30which went the way of the dodo some45:33some years ago and that that local paper45:35no longer exists in a print form45:38so there's a whole i mean we could do a45:40whole show on this and you probably45:42shouldn't have me on for the show so45:44apologies to to the users that this45:46isn't my total area of expertise but i'm45:48just trying to connect some dots here45:50for people to make sense of it right45:52right and you know when we think about45:53the differences between social media45:55information circulation and something45:58like journalism46:00agree or disagree with what you read in46:02in in the newspaper or you hear on the46:05news46:06of your choice but there are things46:09there that are not present46:10in the same way in the social media46:12ecosystem uh46:13you know an author name a set of46:16principles by which46:18uh the journalists46:21at least pay lip service to but most of46:24them46:25live by you know that they have been46:27educated46:28to uh to serve and then do so46:31in their work there's editorial control46:34that before stories go to print they46:37have to go through a number of eyes46:38there's fact checking if you've ever you46:41know i've been on the46:42the the side of having been 
interviewed46:44for journalistic pieces and i get phone46:46calls from fact checkers to make sure46:48that the journalists got46:49right what i think yeah right46:52you think that did you really say xyz46:55yes i did that doesn't exist and you46:57know46:58your your your racist uncle47:00recirculating47:01um god knows what from whatever outlet47:04that is just go those those47:08what we might think of barriers to entry47:10but we also might think of as safeguards47:11are just gone47:13and with all of the other institutions47:16eroded that i mentioned47:17you know public schooling library public47:20libraries and so on the mechanisms that47:22people might use to47:24vet material to understand what it means47:27when they look at a paper of record47:29versus47:32a dubious outlet let's say a dubious47:34internet based outlet47:36and how those uh sources differ those47:39mechanisms to to learn about those47:41things have been eroded as well47:43um is there even a civics class anymore47:45in pu

The Tech Humanist Show
The Tech Humanist Show: Episode 12 – Sarah T. Roberts

The Tech Humanist Show

Play Episode Listen Later Oct 9, 2020 55:34


The Tech Humanist Show explores how data and technology shape the human experience. It's recorded live each week in a live-streamed video program before it's made available in audio format. Hosted by Kate O’Neill. About this episode's guest: Sarah T. Roberts, who is an Assistant Professor in the Department of Information Studies, Graduate School of Education & Information Studies, at UCLA. She holds a Ph.D. from the iSchool at the University of Illinois at Urbana-Champaign. Prior to joining UCLA in 2016, she was an Assistant Professor in the Faculty of Information and Media Studies at Western University in London, Ontario for three years. On the internet since 1993, she was previously an information technology professional for 15 years, and, as such, her research interests focus on information work and workers and on the social, economic and political impact of the widespread adoption of the internet in everyday life. Since 2010, the main focus of her research has been to uncover the ecosystem - made up of people, practices and politics - of content moderation of major social media platforms, news media companies, and corporate brands. She served as consultant to and is featured in the award-winning documentary The Cleaners, which debuted at Sundance 2018 and aired on PBS in the United States in November 2018. Roberts is frequently consulted by the press and others on issues related to commercial content moderation and to social media, society and culture, in general. She has been interviewed on these topics in print, on radio and on television worldwide, including: The New York Times, Associated Press, NPR, Le Monde, The Atlantic, The Economist, BBC Nightly News, the CBC, The Los Angeles Times, Rolling Stone, Wired, The Washington Post, Australian Broadcasting Corporation, SPIEGEL Online, and CNN, among many others.
She is a 2018 Carnegie Fellow and a 2018 recipient of the EFF Barlow Pioneer Award for her groundbreaking research on content moderation of social media. She tweets as @ubiquity75. This episode streamed live on Thursday, October 1, 2020.

We Be Imagining
If you open a hole in the internet, people will immediately fill it with s*** (with Sarah T. Roberts)

We Be Imagining

Play Episode Listen Later Apr 2, 2020 80:46


Boston Calling
Screen time

Boston Calling

Play Episode Listen Later Dec 27, 2019 27:14


Several former contractors who did content moderation work for Facebook are suing in Europe over the psychological trauma they say the work has caused them. The lawsuit is bringing new scrutiny to the content moderation ecosystem that Facebook and other platforms rely on to police what gets posted. Author Sarah T. Roberts says that human content moderation isn’t going away anytime soon. Also: a North Korean cartoon called ‘Bunny Brothers and the Wolf’ may not be the thinly disguised anti-American propaganda it appears to be; Sesame Street revolutionized children's television in the US, and now it’s doing the same around the world; and Blue’s Clues, an iconic kids' TV program in the US, has a new host, Filipino actor Josh Dela Cruz. He tells Marco what the reaction has been like among Asian-American kids. (Woman looking at the internet site of the online network Facebook. Credit: Classen/ullstein bild/Getty Images)

Function with Anil Dash
Bonus: Live from Texas Tribune Fest: Reclaiming Community on the Web

Function with Anil Dash

Play Episode Listen Later Dec 19, 2019 57:29


The integrity of the internet is at stake -- what have we lost and how do we get it back? At the 2019 Texas Tribune Festival, Anil spoke with web scholars and writers about reclaiming the internet through historical context, how we are tethered to social media and the inventive ways marginalized people have always reinvented the platforms available. Panelists: Charlton McIlwain, Anne Helen Petersen, Sarah T. Roberts, Siva Vaidhyanathan.

Factually! with Adam Conover
A.I. Doesn’t Run the Internet; Exploited Humans Do, with Sarah T. Roberts

Factually! with Adam Conover

Play Episode Listen Later Dec 3, 2019 71:24


Technology expert and UCLA professor of information studies Sarah T. Roberts joins Adam to discuss the oversold fantasy of artificial intelligence, the real humans who labor behind the scenes to moderate your social media feeds, and the psychological toll the work takes on them. This episode is sponsored by Acuity (www.acuityscheduling.com/factually), KiwiCo (www.kiwico.com/FACTUALLY), and Parcast - Natural Disasters (www.parcast.com/NATURALDISASTERS).

Eyes Cool Podcast
Behind the Screen

Eyes Cool Podcast

Play Episode Listen Later Nov 4, 2019 50:23


A look at Sarah T. Roberts' new book Behind the Screen: Content Moderation in the Shadows of Social Media.

Techtonic with Mark Hurst | WFMU
Sarah T. Roberts, author of "Behind the Screen," on content moderators from Sep 16, 2019

Techtonic with Mark Hurst | WFMU

Play Episode Listen Later Sep 16, 2019


On content moderation: Prof. Sarah T. Roberts discusses her book "Behind the Screen" about the invisible, and often exploited, human moderators of Big Tech social media. Tomaš Dvořák - "Game Boy Tune" - Machinarium Soundtrack - "Mark's intro" - "Interview with Sarah T. Roberts" - "Your comments" Sen. Josh Hawley and Google's Karan Bhatia - "Remix of EDM Detection Mode by Kevin Macleod" [Music from filmmusic.io "EDM Detection Mode" by Kevin MacLeod License: CC BY] https://www.wfmu.org/playlists/shows/88340

Stop Everything! - ABC RN
Is it time to leave Neverland?

Stop Everything! - ABC RN

Play Episode Listen Later Mar 21, 2019 54:17


We look at live streaming and the responsibility of social media platforms in the wake of the Christchurch terrorist attack, talk over the documentary detailing sexual abuse allegations against Michael Jackson and our teen correspondent mourns the cancellation of Netflix's One Day at a Time.

Note to Self
Meet the Humans Who Protect Your Eyes

Note to Self

Play Episode Listen Later Jun 7, 2017 23:07


Rochelle LaPlante works on contract as a content moderator. She’s seen basically every kind of image you can imagine. All the boring, normal stuff - cat videos, vacation snapshots, headshots for dating sites. Weird stuff, like hundreds and hundreds of feet. And the occasional nightmare-inducing photo of horrific violence, child abuse, graphic porn. It takes a toll. Some things, you can’t unsee. Sometimes Rochelle knows who she’s working for, often not. For about four cents a click, she marks whether the images, text or videos meet the guidelines she’s given. Meet the invisible workforce of content moderation. This week, all the pictures that never make it to your screen. With Professor Sarah T. Roberts, who studies digital pieceworkers, and Rochelle LaPlante, who you should really thank for protecting your eyeballs. Support Note to Self by becoming a member today at NotetoSelfRadio.org/donate.    
