What do we do about recommendation algorithms? What ethical standards could we use to reshape technology? Hosts Annanda and Keisha talk to Stewart Noyce, a technologist who helped develop the internet, and Rev. Dr. Sakena Young-Scaggs, an Afrofuturist scholar and philosopher, to understand how we can all navigate recommendation algorithms in a life-giving way.
SHOW NOTES
Learn more about Stewart's work in marketing and consulting at StewartNoyce.com
See IBM promoting its work at the 1994 Winter Olympics in this vintage ad: https://www.youtube.com/watch?v=JNZ7k9Kgmek
How do algorithms drive social inequality? Virginia Eubanks explains in Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor (St. Martin's Press)
What's Afrofuturism all about? Read Afrofuturism: The World of Black Sci-Fi and Fantasy Culture by Ytasha L. Womack (Lawrence Hill Books)
Learn about Black entrepreneurs receiving 1% of all venture capital: Sources of Capital for Black Entrepreneurs, Harvard Business Review, 2019, by Steven S. Rogers, Stanly Onuoha, and Kayin Barclay
Explore more on "life-giving and death-dealing" from African feminist theologian Mercy Oduyoye in Beads & Strands: Reflections of an African Woman on Christianity in Africa (Theology in Africa), Orbis Press (2013)
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: The UBI dystopia: a glimpse into the future via present-day abuses, published by Solenoid Entity on April 12, 2023 on LessWrong. Disclaimer: This does risk being a little culture-war-adjacent. This is more political than I normally see on LessWrong, but the spirit of it is intended to be less "Smash Capitalism" or "Become Ungovernable" and more "look at these things that are happening in Australia." Inspired by: Basic Income, Not Basic Jobs: Against Hijacking Utopia, SSC Gives A Graduation Speech. Summary Some bad things are happening today to people who depend on the government for money. This suggests that similar bad things could happen in the future if more people depend on the government for money. If you're an advocate for UBI, or it's a linchpin in your plan for how we'll live well post-AI, it's important to consider two connected worries carefully: Making the scheme truly universal (and keeping it that way) may be politically untenable in a democracy. In many likely worlds, some people will depend on this non-universal UBI for survival, leaving them vulnerable to coercive control and arbitrary punishment by the government (lawfully), by government officials (unlawfully), and even by corrupt bureaucrats. Worries along the lines of "UBI could make us all serfs" and various techno-futuristic dystopian visions are already common enough. There's also a growing part of journalism/civil society/activism concerned with an industry that "farms the unemployed" — billing the government for services it ostensibly provides to poor people, while in fact spending its time on coercive control and a moralistic form of discipline. A "digital poorhouse," per Virginia Eubanks. I think the average LW reader has also probably heard worries about Government Issued Digital Currency (GIDC), which is certainly part of the concern here. Others have expressed worry about payment processors and banks being politically manipulated. But I think many people here are less familiar with arguments about problems that already exist within the social welfare system today. These problems suggest to me that, absent a change in the culture of these institutions, a UBI might lead to serious abuses. After listing some reasons it might be hard to make a truly universal policy, or 'make it an inalienable human right', etc., I then briefly survey some of the abuses of welfare recipients that have recently occurred in Australia, due to the ability of the government, public servants, and employees of private companies to 'turn off the tap' on their money. Two main problems 1. You probably can't make it universal, and you probably can't keep it that way. The most compact way to pump this intuition is to imagine a caricature of a political debate, where an outrage-stoking populist politician is tearing shreds off a nervously stammering, principled, liberal, establishment centrist candidate. Are you seriously proposing to give access to our hard-earned tax dollars to migrants? Refugees? Oh, you're not? How can we trust you on this? So you'll demand government-issued photo-ID and proof of citizenship for people signing up? Ok, what about convicted terrorists? What about people (accused of) traveling abroad to join terrorist organizations (accused by members of the government or security agencies, without a trial)?
What about 'terrorist sympathizers'? We won't be giving it to convicted felons, though, right? Oh we ARE? But not while they're in prison, right? And obviously child sex offenders are barred from the UBI for life, right? What about people who refuse to get vaccinated or vaccinate their children? What about people who take part in unpopular protests? What about draft-dodgers? Taxpayers would be paying for them to sit on their behinds while everyone else who gets conscripted does their duty a...
This episode features a conversation between Lane Rasberry, Wikimedian-in-Residence at the University of Virginia School of Data Science, and Virginia Eubanks, author, journalist, and associate professor of political science at the University at Albany. The conversation was recorded in 2019, but the topics are still relevant today. Eubanks looks toward the future, warning of the unintended—or at times intended—consequences of emerging technologies. The discussion focuses on the effects of algorithmic automation, as well as the practice, policies, and implementation of these algorithms. Although she critiques the tech world, Eubanks also provides many reasons for optimism. Virginia Eubanks authored the 2018 book Automating Inequality, a detailed investigation into data-based discrimination. She is also the author of Digital Dead End: Fighting for Social Justice in the Information Age and the co-editor, with Alethia Jones, of Ain't Gonna Let Nobody Turn Me Around: Forty Years of Movement Building with Barbara Smith. She also writes for various outlets, including the Guardian, American Scientist, and the New York Times. Recently, Virginia began the PTSD Bookclub, an ongoing project that explores books about trauma and its aftermath. You can find this project and Virginia Eubanks's other projects at virginia-eubanks.com.
On this week's episode of The Waves, Slate senior editor Shannon Palus sits down with writer and political scientist Virginia Eubanks. They talk about Virginia's New York Times magazine essay, "His PTSD, and My Struggle to Live With It," and how the condition is more widespread than most people realize, even as terms like "trauma" and "triggered" are tossed around cavalierly. Later in the show, they talk about why you shouldn't give unsolicited advice to people living with PTSD—and what kind of support caregivers of people with PTSD really need. In Slate Plus: Why Virginia wanted to write her New York Times essay, and whether the COVID-19 pandemic is, technically speaking, a traumatic event.
Further Recommended Reading:
What to Say When Someone Tells You They're Chronically Ill by Rachel Meeks
Irritable Hearts: A PTSD Love Story by Gabriel Mac
Podcast production by Cheyna Roth with editorial oversight by Shannon Palus, Daisy Rosario and Alicia Montgomery. Send your comments and recommendations on what to cover to thewaves@slate.com Learn more about your ad choices. Visit megaphone.fm/adchoices
Thoughts in Between: exploring how technology collides with politics, culture and society
Iason Gabriel is a research scientist at DeepMind and was previously a lecturer in political and moral philosophy at Oxford University. His work focuses on the moral questions raised by artificial intelligence. In this conversation we discuss how and why AI is different from other technologies; the problem of value alignment in AI; what political philosophy can tell us about how to build ethical AI systems; and much more. Iason's paper on value alignment that we discuss is here. In the paper, he also recommends this paper on decolonial AI; the book Superintelligence by Nick Bostrom; and the book Automating Inequality by Virginia Eubanks.
Thanks to Cofruition for consulting on and producing the show. You can learn more about Entrepreneur First at www.joinef.com and subscribe to my weekly newsletter at tib.matthewclifford.com
Diversity, Equity, and Inclusion (DEI) has been the buzz phrase of the last few years. Today, my guest addresses how this concept plays out in product development and management. Dr. Dédé Tetsubayashi, Founder & CEO of incluu, is an Ethical Technologist and Social Scientist. She synergizes her lived experience as a Black queer woman with an invisible disability with her over 20 years of experience in ethical tech, product equity & inclusion as a catalyst for creating brave spaces and products for all. Her work empowers individuals and organizations committed to investing in equitable and accessible product development and design processes, unlocking their full potential in their products, people, and practices. In our conversation, Dédé pulls back the curtain on what building products for equitable outcomes and non-discrimination means and why she founded her business, incluu. We dive into some fascinating topics, such as:
How straddling multiple identities played a role in shaping Dédé's future and ultimate career choices.
Why relationship building and intentional curiosity are crucial in building a culture of inclusivity.
How the incluu team helps companies and their teams understand their circles of influence and why it's essential to increase those circles.
Bridging the gap between technology, justice, and ethics.
How the biggest challenge facing leaders today is fear, and functioning from a place of fear of the unknown.
Why a leader needs to hold strong beliefs loosely in order to have a transparent dialogue.
Why leaders need to be aware of their knowledge gaps and understand when it's time to ask for help.
Dédé recommends two books that offer a deeper dive into how high-tech tools profile, police, and punish the poor and reinforce the inequality in our society: Automating Inequality by Virginia Eubanks and Weapons of Math Destruction by Cathy O'Neil. Connect with Dédé at www.incluu.us and tune into her podcast Brave Spaces Roundtable to learn more about the work that Dédé and her team do. If you found the conversation helpful, subscribe to the podcast so you don't miss any future episodes.
Connect with your host, Kele:
Instagram: https://www.instagram.com/thetailoredapproach/
Website: https://thetailoredapproach.com
Our guest today is Professor Veronica Barassi. Veronica is an anthropologist and author of Child Data Citizen (MIT Press, 2020). Veronica campaigns and writes about the impact of data technologies and artificial intelligence on human rights and democracy. As a mother, Veronica was becoming increasingly concerned about the data being collected on her two children by digital platforms. Her research resulted in the book as well as a TED talk, What tech companies know about your kids, which has had over 2 million views. Since the publication of her book, she says there's been a huge acceleration in the datafication of children, partly due to the pandemic, and an increase in the ways in which AI technologies are being used to profile people. Veronica explores what she believes anthropology uniquely brings to the study of data technologies and AI. She asks (and answers), "why would an anthropological approach be different from, say, for instance, Virginia Eubanks, who uses ethnographic methodologies and has a real context-specific understanding of what's happening on the ground." Turning to anthropology's (late) engagement with AI, data, and algorithms, she says it used to be a niche area of research. But "we've actually seen a reality check for anthropologists because these technologies are…involved in deeply problematic and hierarchical processes of meaning-construction and power-making that there's no way that anthropologists could shy away from this". One of the best books "that really makes us see things for what they are" in this current time we're living in is David Graeber's The Utopia of Rules: On Technology, Stupidity, and the Secret Joys of Bureaucracy. Graeber "talks about how bureaucracy is actually there to construct social truth, but this type of bureaucratic work has been now replaced by algorithms and artificial intelligence", a connection she tries to make in her article, David Graeber, Bureaucratic Violence and the Critique of Surveillance Capitalism. We discuss how anthropologists can make their work both academically rigorous and accessible to the public, and she talks about her own personal experience of doing the TED talk and how she felt a responsibility to bring the topic of child datafication to a wider audience, campaigning, and raising awareness. Veronica provokes anthropology scholars with a call to action, given that one of her "major critiques of anthropology…is the fact that as anthropologists often shy away from engaging theoretically with disciplines that do not share their approach". And what does it mean when we say research is "not anthropological enough"? Lastly, Veronica suggests that, given machines must be taught basic concepts, like what is a child ("as anthropologists, we know that these concepts are so complex, so culturally specific, so biased"), what anthropology can do is "highlight the way in which these technologies are always inevitably going to get and be biased". She ends on a note of excitement: "We're going to see such great research emerging in the next few years. I'm actually looking forward to that". Follow Veronica on Twitter @veronicabarassi. Read an edited version of our conversation together with a reading list.
Old prejudices are often coded into new technologies, even those technologies that claim to enhance diversity and fairness. We break down the metaphors of the New Jim Code (from Ruha Benjamin) and the Digital Poorhouse (from Virginia Eubanks) to show how modern technological "fixes" discriminate against Black people and poor people, respectively. Even the best-intentioned algorithms can have disastrous consequences (not unlike Abi's cooking). We suggest some ways that designers and communicators can better account for race and poverty in their designs. In addition, we reveal the fourth rhetorical appeal from Aristotle's lost works. For transcripts and sources, see https://faculty.mnsu.edu/tctalk/
Edquity is an emergency financial assistance platform for college students that is distributing $50 million per quarter. Founder David Helene is a college financing wunderkind who is committed to keeping low-income college students enrolled; in this episode, we dive deep into how edtech can help support equity in higher education.
Resources that David recommends:
Paying the Price by Sara Goldrick-Rab
Automating Inequality by Virginia Eubanks
Today we are going to hear from Virginia Eubanks from the University at Albany in America as she talks about her book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. It is of particular interest in Australia because of the Federal Government's love affair with similar systems as it pushes to automate our social security system.
“Feminists have long dreamed of sexual freedom,” writes Amia Srinivasan. “What they refuse to accept is its simulacrum: sex that is said to be free, not because it is equal, but because it is ubiquitous.” Srinivasan is an Oxford philosopher who, in 2018, wrote the viral essay “Does Anyone Have the Right to Sex?” Her piece was inspired by Elliot Rodger's murderous rampage and the misogynist manifesto he published to justify it. But Srinivasan's inquiry opened out to larger questions about the relationship between sex and status, what happens when we're undesired for unjust reasons and whether we can change our own preferences and passions. The task, as she frames it, is “not imagining a desire regulated by the demands of justice, but a desire set free from the binds of injustice.” I love that line. Srinivasan's new book of essays, “The Right to Sex,” includes that essay alongside other challenging pieces considering consent, pornography, student-professor relationships, sex work and the role of law in regulating all of those activities. This is a conversation about topics we don't always cover on this show, but that shape the world we all live in: monogamy and polyamory, the nature and malleability of desire, the interplay between sex and status-seeking, what it would mean to be sexually free, the relationship between inequality and modern dating, incels, the feminist critique of porn, how the internet has transformed the sexual culture for today's young people and much more. (One note: This conversation was recorded before the Supreme Court permitted a Texas law prohibiting abortions after six weeks, arguably ushering in the post-Roe era. We're working on an episode that will discuss that directly.)
Mentioned:
The Right to Sex by Amia Srinivasan
"Sex Worker Syllabus and Toolkit for Academics" by Heather Berg, Angela Jones and PJ Patella-Rey
Book recommendations:
Ain't Gonna Let Nobody Turn Me Around by Alethia Jones and Virginia Eubanks, with Barbara Smith
Revolting Prostitutes by Juno Mac and Molly Smith
Feminist International by Verónica Gago, translated by Liz Mason-Deese
You can find transcripts (posted midday) and more episodes of "The Ezra Klein Show" at nytimes.com/ezra-klein-podcast, and you can find Ezra on Twitter @ezraklein. Book recommendations from all our guests are listed at www.nytimes.com/article/ezra-klein-show-book-recs. Thoughts? Guest suggestions? Email us at ezrakleinshow@nytimes.com. “The Ezra Klein Show” is produced by Annie Galvin, Jeff Geld and Rogé Karma; fact-checking by Michelle Harris; original music by Isaac Jones; mixing by Jeff Geld; audience strategy by Shannon Busta. Special thanks to Kristin Lin.
They’re called electronic visit verification apps, or EVVs. They log the hours and the movements of home health care workers paid for by Medicaid. States are just starting to roll them out as part of an Obama-era program that promised to make managing the work of home aides more efficient and reduce fraud in the system. Marketplace’s Meghan McCarty Carino speaks with Virginia Eubanks, the author of “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.” She’s been following Arkansas’ implementation of EVVs and co-wrote a story about it for the Guardian newspaper. Eubanks said the state’s app has been glitchy, which led to missed paychecks for aides.
How do you see your work and the world in 10 years? This is part 2 with Joanna Bryson, Virginia Eubanks, Hessie Jones, Morgan Klaus, Osamu Saito, Rupal Srivastava, Gabriela de Queiroz, and Buse Çetin.
Systems used by the government to automate social services may create injustice. Virginia talked to Kunumi about the impact of these systems on people's lives. She also focuses her work on trauma: social, physical, and mental.
In this episode: Big news from Nazareth College. We've formed the Institute for Technology, Artificial Intelligence, and Society (ITAS), a pioneering initiative in higher education to train future professionals to guide and develop technology toward equitable and just ends. Guests: Dianne Oliver, Ph.D., is co-director of the Nazareth College Institute for Technology, AI, and Society (ITAS), and since 2015 has been dean of the College of Arts & Sciences. Her background includes degrees in both computer science and in religion and ethics, deeply connecting her experience and work to this initiative at Naz. Yousuf George, Ph.D., is the other ITAS co-director and associate to the president for strategy and momentum. He joined the College in 2008 as a faculty member in the Mathematics Department and later served as the associate dean of the College of Arts and Sciences. Wendy Norris, Ph.D., is an assistant professor and founding faculty member for ITAS. Wendy's doctorate is in information science from the University of Colorado Boulder. She brings expertise in the design of humanitarian crisis response technologies to her teaching and research. She joined the Mathematics Department in 2020 to help launch Nazareth's new ethical data science major. Chelsea Wahl, Ph.D. — another founding faculty member for ITAS — joined Nazareth's Sociology & Anthropology Department as an assistant professor of sociology in 2020. She earned her bachelor's degree in sociology from Hamilton College and her doctorate from University of Pennsylvania. She specializes in technology, inequality, work, and organizations. Student Nate Ancona ‘21 is a senior majoring in business management and a four-year member of the swim team. He is currently taking several ITAS courses that explore programming, AI, and the ethical and societal impacts of technology and is looking at graduate schools. Book mentioned in the podcast: Automating Inequality by Virginia Eubanks
Pagina 3 with Edoardo Camurri
In this episode I speak with librarian Barbara Fister about the growing role of algorithms in our daily lives, why the architects of these systems matter, and how the move to online learning is expanding student awareness of surveillance culture. Project Information Literacy (PIL) is a nonprofit research institute that conducts ongoing, national studies on what it is like being a student in the digital age. In the past decade, ...
EPISODE NOTES:
Information Literacy in the Age of Algorithms: Student Experiences with News and Information, and the Need for Change, Head, Alison J.; Fister, Barbara; MacMillan, Margy, Project Information Literacy - https://projectinfolit.org/publications/algorithm-study/
Facebook–Cambridge Analytica data scandal - https://en.wikipedia.org/wiki/Facebook%E2%80%93Cambridge_Analytica_data_scandal
Was software responsible for the financial crisis? - https://www.theguardian.com/technology/2008/oct/16/computing-software-financial-crisis
Subprime Attention Crisis, Tim Hwang - https://us.macmillan.com/books/9780374538651
The Age of Surveillance Capitalism, Shoshana Zuboff - https://www.publicaffairsbooks.com/titles/shoshana-zuboff/the-age-of-surveillance-capitalism/9781610395694/
Software that monitors students during tests perpetuates inequality and violates their privacy, Shea Swauger - https://www.technologyreview.com/2020/08/07/1006132/software-algorithms-proctoring-online-tests-ai-ethics/
Google and advertising: digital capitalism in the context of Post-Fordism, the reification of language, and the rise of fake news - https://www.nature.com/articles/s41599-017-0021-4
The History of Google Ads 20 Years in the Making (Infographic) - https://instapage.com/blog/google-adwords-infographic
How Bezos built his data machine, Leo Kelion - https://www.bbc.co.uk/news/extra/CLQYZENMBI/amazon-data
Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, Virginia Eubanks - https://us.macmillan.com/books/9781250074317
Machine Bias: There's software used across the country to predict future criminals. And it's biased against blacks, by Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner, ProPublica - https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
Algorithms of Oppression, Safiya Noble - http://algorithmsofoppression.com/
Edward Snowden NSA Files: Decoded, by Ewen MacAskill and Gabriel Dance - https://www.theguardian.com/world/interactive/2013/nov/01/snowden-nsa-files-surveillance-revelations-decoded
The Fiduciary Model of Privacy, Jack M. Balkin - https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3700087
VPN Usage Surges During COVID-19 Crisis [Infographic], Niall McCarthy - https://www.forbes.com/sites/niallmccarthy/2020/03/17/vpn-usage-surges-during-covid-19-crisis-infographic/?sh=7ac8e6ab7d79
Failure to Disrupt, by Justin Reich - https://failuretodisrupt.com/
CONspirituality: A weekly study of converging right-wing conspiracy theories and faux-progressive wellness utopianism - https://conspirituality.net/about/
This episode we're reading Sociology Non-Fiction! We discuss the differences between sociology and psychology, what Karl Marx and Aziz Ansari have in common, the over-educated but kind-of-broke worker, and the difficulties of reading books that make us both sad and angry. Plus: Pandemic Monkey Brains! You can download the podcast directly, find it on Libsyn, or get it through Apple Podcasts, Stitcher, Google Podcasts, Spotify, or your favourite podcast delivery system.
In this episode: Anna Ferri | Meghan Whyte | Matthew Murray | Amanda Wanner
Things We Read (or tried to read):
From Here to Eternity: Traveling the World to Find the Good Death by Caitlin Doughty
Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor by Virginia Eubanks (this is better than Matthew implied in the episode; it is worth reading)
Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are by Seth Stephens-Davidowitz
Evicted: Poverty and Profit in the American City by Matthew Desmond
The Secret Life of Groceries: The Dark Miracle of the American Supermarket by Benjamin Lorr
Palaces for the People: How Social Infrastructure Can Help Fight Inequality, Polarization, and the Decline of Civic Life by Eric Klinenberg
Talking to Strangers: What We Should Know About the People We Don't Know by Malcolm Gladwell
All the Rage: Mothers, Fathers and the Myth of Equal Partnership by Darcy Lockman
Can't Even: How Millennials Became the Burnout Generation by Anne Helen Petersen
What We Don't Talk About When We Talk About Fat by Aubrey Gordon
Other Media We Mentioned:
The Age of Surveillance Capitalism by Shoshana Zuboff
Disasters: A Sociological Approach by Kathleen Tierney
The Credential Society: An Historical Sociology of Education and Stratification by Randall Collins
Engines of Anxiety: Academic Rankings, Reputation, and Accountability by Wendy Nelson Espeland and Michael Sauder
Beyond the Body: Death and Social Identity by Elizabeth Hallam, Glennys Howarth, Jenny Hockey
The Protestant Work Ethic and the Spirit of Capitalism by Max Weber
The Presentation of Self in Everyday Life by Erving Goffman
Grocery: The Buying and Selling of Food in America by Michael Ruhlman
Three Squares: The Invention of the American Meal by Abigail Carroll
Death of Sandra Bland (Wikipedia)
Food Mirages in Guelph, Ontario: The Impacts of Limited Food Accessibility and Affordability on Low-income Residents by Benjamin Reeve (not mentioned during the episode, but this is someone's actual sociology thesis that Matthew thinks is neat)
Links, Articles, and Things:
Where Do Librarians Come From? Examining Educational Diversity in Librarianship by Rachel Ivy Clarke (I think this is way less humanities-focussed than our program was…)
Michel Foucault (Wikipedia)
Dr. Thomas Kemple
Readers' Advisory for Library Staff (Facebook Group)
JUMPSUIT - "Jumpsuit: how to make a personal uniform for the end of capitalism"
Code Switch (NPR Podcast)
Louder Than A Riot (NPR Podcast)
According to Need (99% Invisible Podcast)
Sabrina and Friends: Answers in Progress
How Conspiracy Theories Work (a good example of a video showing the research process)
Trader Joe's (Wikipedia)
What does it mean to be working class in Canada? (Maclean's article)
15 Sociology Books by BIPOC (Black, Indigenous, & People of Colour) Authors:
Every month Book Club for Masochists: A Readers' Advisory Podcast chooses a genre at random and we read and discuss books from that genre. We also put together book lists for each episode/genre that feature works by BIPOC (Black, Indigenous, & People of Colour) authors. All of the lists can be found here.
Beauty Diplomacy: Embodying an Emerging Nation by Oluwakemi M. Balogun
W. E. B. Du Bois's Data Portraits: Visualizing Black America edited by Whitney Battle-Baptiste and Britt Rusert
The Next American Revolution: Sustainable Activism for the Twenty-First Century by Grace Lee Boggs & Scott Kurashige
Racism without Racists: Color-Blind Racism and the Persistence of Racial Inequality in the United States by Eduardo Bonilla-Silva
Women, Race & Class by Angela Y. Davis
Winners Take All: The Elite Charade of Changing the World by Anand Giridharadas
Follow Me, Akhi: The Online World of British Muslims by Hussein Kesvani
I Am Woman: A Native Perspective on Sociology and Feminism by Lee Maracle
Body and Soul: The Black Panther Party and The Fight Against Medical Discrimination by Alondra Nelson
Finding Latinx: In Search of the Voices Redefining Latino Identity by Paola Ramos
Fruteros: Street Vending, Illegality, and Ethnic Community in Los Angeles by Rocío Rosales
Decolonizing Methodologies: Research and Indigenous Peoples by Linda Tuhiwai Smith
Off the Books: The Underground Economy of the Urban Poor by Sudhir Venkatesh
Indigenous Writes: A Guide to First Nations, Métis & Inuit Issues in Canada by Chelsea Vowel
Caste: The Origins of Our Discontents by Isabel Wilkerson
Watch us Stream!
Our Twitch channel - Fridays in January, 9pm Eastern
Our YouTube channel - Recordings of streams
Give us feedback! Fill out the form to ask for a recommendation or suggest a genre or title for us to read! Check out our Tumblr, follow us on Twitter or Instagram, join our Facebook Group, or send us an email!
Join us again on Tuesday, January 19th when we'll be talking about our Reading Resolutions for 2021! Then on Tuesday, February 2nd, just in time for Valentine's Day, we'll be doing our annual romance fiction episode and talking about the genre of Regency Romance!
Alina Utrata talks to Stefanie Felsberger, a PhD candidate at Cambridge University, about her research on surveillance, data flows and menstruation tracking apps. They discuss how colonization impacted the development of surveillance technologies, why we think (or shouldn't think) about data as a commodity instead of labor, and how the ownership of knowledge about female bodies has translated into power—from the witch burnings to period apps.
Tweet at Alina. Tweet at Stefanie. Contact us.
Articles mentioned in this podcast:
Stefanie Felsberger's article "Colonial Cables – The Politics of Surveillance in the Middle East and North Africa."
The woman who tried to hide her pregnancy from Big Data (and failed) and why pregnant women are such a high-value target for advertisers. And if you want to know more about the Smart Period Cup.
Amazon experimenting with paying some consumers for their data. They've also entered the healthcare market.
The US military is buying location data from everyday apps, including a Muslim prayer app and Muslim dating site.
More on testing and importing technologies in low-rights environments, or how colonization spurred the development of surveillance technologies. For some more contemporary examples, how technologies developed by US military contractors in Yemen were used to disperse G20 protesters in Pittsburgh in 2009.
More on surveillance tech used to target the Black Lives Matter protests here and here. And an ACLU overview on surveillance tech available in the US, as well as who has stingray tracking devices. And on the use of police drones to surveil protestors.
Virginia Eubanks on how marginalized groups are often governments' test subjects (her full book on the subject here or here). Relatedly, how Baltimore became the US's lab for developing surveillance tech.
How the UNHCR is collecting iris data from refugees in Jordan.
On Chinese companies' role in Africa and the Middle East, watch part II of this documentary.
On the NSO Group and how their tech was linked to the murder of journalist Jamal Khashoggi and the hacking of Jeff Bezos's phone.
More academic books and articles:
Jarrett, Kylie. 2016. Feminism, Labour and Digital Media: The Digital Housewife. New York and London: Routledge.
Lupton, Deborah. 2016. The Quantified Self: A Sociology of Self-Tracking. Cambridge: Polity Press. EPub.
Federici, Silvia. 2004. Caliban and the Witch. Brooklyn, NY: Autonomedia.
Browne, Simone. 2015. Dark Matters: On the Surveillance of Blackness. Durham and London: Duke University Press.
Fuchs, Christian. 2013. "Theorizing and Analyzing Digital Labor: From Global Value Chains to Modes of Production." The Political Economy of Communication 2, no. 1: 3–27.
Kaplan, Martha. 1995. "Panopticon in Poona: An Essay on Foucault and Colonialism." Cultural Anthropology 10: 85-98.
Mitchell, Timothy. 1988. Colonizing Egypt. Berkeley, Los Angeles and London: University of California Press.
Nowhere Land by Kevin MacLeod
Link: https://incompetech.filmmusic.io/song/4148-nowhere-land
License: http://creativecommons.org/licenses/by/4.0/
Hosted on Acast. See acast.com/privacy for more information.
On this week's episode of the show, I chat with Virginia Eubanks about how high-tech tools and software profile and punish people of color and low-income people and families. The post Episode #181: Virginia Eubanks appeared first on PolicyViz.
Welcome to another episode of Decoding 40 in the middle of a pandemic and racial injustice. The guys check in with L.O., who is about to tackle another home construction project and is reading a new book: Automating Inequality by Virginia Eubanks. Vin shares a funeral crashing story about going to a funeral parlor to get a program just for an excuse to play hookie from work. It is more Seinfeld than Seinfeld. Alaric talks about a myriad of emotions sparked by the death of Justice Ruth Bader Ginsburg and the Breonna Taylor grand jury verdict. Mack is more optimistic and believes our humanity will prevail. You'll hear all of this and more on this episode of Decoding 40. If you haven't already registered to vote, please do so by visiting www.vote.org. Don't let all those who dedicated their entire life to fighting for these rights die in vain. The 2020 Census is more than a population count. It's an opportunity to shape your community's future. Please complete the Census for 2020. Be sure to catch us every Monday night at 11pm EST for Decoding 40 After Dark on FB Live and YouTube. Want to be our Whiskey Warrior of the Week? Or, do you have an event or product that you would like us to attend, sample and promote? Then, please send us an email to Decoding40@gmail.com to start the discussion.
Virginia Eubanks examines the relationship between technology and society in her book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, and joins us this week for a discussion about who matters in a democracy and the empathy gap between the people who develop the technology for social systems and the people who use those systems. Eubanks is an Associate Professor of Political Science at the University at Albany, SUNY. She is also the author of Digital Dead End: Fighting for Social Justice in the Information Age; and co-editor, with Alethia Jones, of Ain't Gonna Let Nobody Turn Me Around: Forty Years of Movement Building with Barbara Smith. Her writing about technology and social justice has appeared in Scientific American, The Nation, Harper's, and Wired. She was a founding member of the Our Data Bodies Project and a 2016-2017 Fellow at New America.
Additional Information
Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor
Eubanks will present a lecture on her work for Penn State's Rock Ethics Institute on October 1, 2020 at 6:00 p.m. The event is free and open to anyone. Register here.
Related Episodes
A roadmap to a more equitable democracy
Will AI destroy democracy?
Facebook is not a democracy
In episode 125 of the podcast "Un român în Londra" we talked about the tyranny of the algorithm for British high school students (Automating Inequality by Virginia Eubanks) and about covidro.info, built by Alex Mihăileanu. Show notes: manuelcheta.com
In this episode, Jack and Shobita discuss big tech's decisions to pull back from facial recognition technology, and how the Black Lives Matter movement is influencing science and technology overall. And they chat with Virginia Eubanks, author of Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (St. Martin's Press, 2018) and Associate Professor of Political Science at the University at Albany, SUNY.
- Kashmir Hill, "Wrongfully Accused by an Algorithm." The New York Times. June 24, 2020.
- Coalition for Critical Technology, "Abolish the #TechtoPrisonPipeline," June 23, 2020.
- Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, St. Martin's Press, 2018.
- Virginia Eubanks, "Zombie Debts are Hounding Struggling Americans. Will You Be Next?" The Guardian, October 15, 2019.
- Virginia Eubanks, "Algorithms Designed to Fight Poverty Can Actually Make It Worse," Scientific American, Volume 319, No. 5 (pp. 68-71), November 2018. (Part of a special issue, "The Science of Inequality")
- Virginia Eubanks, "High-Tech Homelessness," American Scientist, July-August 2018.
- Virginia Eubanks, "The Digital Poorhouse," Harper's Magazine, January 2018.
- Virginia Eubanks, "A Child Abuse Prediction Model Fails Poor Families," WIRED Magazine, January 15, 2018.
- Virginia Eubanks, "Want to Cut Welfare? There's an App for That," The Nation, March 27, 2015.
Transcript available at thereceivedwisdom.org
This week on Sinica, we continue with the ongoing California series of podcasts that Kaiser recorded last winter, and present a conversation taped in December, when he chatted with Margaret (Molly) Roberts, an associate professor in the Department of Political Science at the University of California, San Diego. Molly also co-directs the China Data Lab at the 21st Century China Center, and her latest book, Censored: Distraction and Diversion Inside China's Great Firewall, takes a deep, data-driven look at the way that internet censorship functions, and how it impacts Chinese internet users.
15:21: Dispelling two narratives about China's internet censorship
25:24: Distracting online communities by digitally flooding forums
32:43: How censorship affects those who experience it
41:52: How the discussion around Chinese internet censorship has evolved
Recommendations:
Molly: Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, by Virginia Eubanks.
Kaiser: The Syllabus, by Evgeny Morozov: A website offering curated syllabi featuring text, audio, and video on a range of topics, including technology, global affairs, arts and culture, and more.
Episode 7--The Politics of Geoengineering, Climate, and COVID-19, featuring Jane Flegal. Shobita and Jack discuss the ongoing COVID-19 pandemic and its implications in the United States and Britain, and interview Jane Flegal, Program Officer overseeing US climate at The William and Flora Hewlett Foundation, a Fellow at the Institute for Science, Innovation, and Society at the University of Oxford, and an Adjunct Professor in the School for the Future of Innovation in Society at Arizona State University. Links Related to the Podcast: - Ezra Klein (2019). "The geoengineering question." Vox. December 23. - Jane A. Flegal, Anna-Maria Hubert, David R. Morrow, Juan B. Moreno-Cruz (2019). "Solar Geoengineering: Social Science, Legal, Ethical, and Economic Frameworks." Annual Review of Environment and Resources. October. - David E. Winickoff, Jane A. Flegal, and Asfawossen Asrat (2015). "Engaging the Global South on climate engineering research." Nature Climate Change. June 24. - Jane A. Flegal and Aarti Gupta (2018). "Evoking equity as a rationale for solar geoengineering research? Scrutinizing emerging expert visions of equity." International Environmental Agreements: Politics, Law and Economics. 18: 45-61. - Jane A. Flegal and Andrew Maynard (2017). "'Geostorm' is a very silly movie that raises some very serious questions." Popular Science. October 22. - Carnegie Climate Governance Initiative (2020). "Remembering Steve Rayner: the person who framed the geoengineering debate." - Morgan Ames (2019). The Charisma Machine: The Life, Death, and Legacy of One Laptop per Child. MIT Press. - Virginia Eubanks (2018). Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St. Martin's Press. Full transcript available at thereceivedwisdom.org.
Jepson student Ben Weinstein, ’20, interviews 2019-20 Jepson Leadership Forum speaker Virginia Eubanks prior to her presentation, "Algorithms, Austerity, and Inequality." February 13, 2020
The Jepson Leadership Forum presents Virginia Eubanks, Associate professor of political science at the University at Albany - State University of New York, for a presentation on "Algorithms, Austerity, and Inequality." Feb. 13, 2020
The State of Indiana denies one million applications for healthcare, food stamps and cash benefits in three years―because a new computer system interprets any mistake as "failure to cooperate." In Los Angeles, an algorithm calculates the comparative vulnerability of tens of thousands of homeless people in order to prioritize them for an inadequate pool of housing resources. In Pittsburgh, a child welfare agency uses a statistical model to try to predict which children might be future victims of abuse or neglect. Since the dawn of the digital age, decision-making in finance, employment, politics, health and human services has undergone revolutionary change. Today, automated systems―rather than humans―control which neighborhoods get policed, which families attain needed resources, and who is investigated for fraud. While we all live under this new regime of data, the most invasive and punitive systems are aimed at the poor. In Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (St. Martin's, 2018), Virginia Eubanks systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. The book is full of heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lays dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile. The U.S. has always used its most cutting-edge science and technology to contain, investigate, discipline and punish the destitute. Like the county poorhouse and scientific charity before them, digital tracking and automated decision-making hide poverty from the middle-class public and give the nation the ethical distance it needs to make inhumane choices: which families get food and which starve, who has housing and who remains homeless, and which families are broken up by the state. In the process, they weaken democracy and betray our most cherished national values. This deeply researched and passionate book could not be more timely. John Danaher is a lecturer at the National University of Ireland, Galway. He is also the host of the wonderful podcast Philosophical Disquisitions. You can find it here on Apple Podcasts. Learn more about your ad choices. Visit megaphone.fm/adchoices
"In Automating Inequality, Virginia Eubanks systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. The book is full of heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lays dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile."In April, the Penn State Rock Ethics Institute is hosting Eubanks to talk about her book Automating Inequality. Prior to that, on March 26, Brady Clemens, our district library consultant, will lead a nonfiction book club in Schlow's Sun Room to discuss the book. We chat with Brady about Automating Inequality and some of the sobering and alarming information Eubanks discovered in her research and investigation.
In Automating Inequality, Virginia Eubanks systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. In this talk, Eubanks explores some of her book's heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lays dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile. “This book is downright scary,” says Naomi Klein, “but with its striking research and moving, indelible portraits of life in the ‘digital poorhouse,' you will emerge smarter and more empowered to demand justice.”
Dot Citizen is back, and this week is a special Book Chat episode! We're discussing Virginia Eubanks' book: Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. A book that is as much a work of searing investigative journalism as it is brilliant political analysis, Automating Inequality examines how our attempts to engineer away human biases are creating a new "digital poorhouse." Zach and Chrisella talk about the points from the book that stuck with them, the questions it raised, and the ethics of technology automation in a world of systemic inequalities. We want to hear your thoughts! Send us your feedback on Twitter or join our community on Flick Chat!
I had the good fortune of interviewing San Francisco DA candidate Chesa Boudin right before the Nov 5th election. District Attorney for San Francisco is an incredibly powerful position (just ask Kamala Harris!) from which many lives will either be saved and given a second chance, or more bodies (black, white, brown) will continue to be fed into the carceral system. Moreover, if a truly progressive DA like Chesa is elected, it will allow him to enact policies (notably an end to cash bail) that will challenge the rest of America to emulate, exceed, or at least consider what changes could be made to the US criminal justice system, changes that would focus less on CRIMINAL and more on justice. Chesa was kind enough to join the Arts of Travel Podcast to discuss his vision for how he wants to reform criminal justice, and he was willing to take tough questions. We discuss homelessness, mental health, the work of Ruth Wilson Gilmore (prison abolition), the work of Virginia Eubanks (predictive algorithms and tech), and activist movements like Standing Rock and J20. For more on Chesa, I highly recommend checking out his website (and if you're in San Francisco, volunteering!); you can find that here: https://www.chesaboudin.com/ And for articles used to research this interview, I highly recommend: • https://www.nytimes.com/2019/04/17/magazine/prison-abolition-ruth-wilson-gilmore.html • https://www.yalelawjournal.org/forum/the-punishment-bureaucracy • https://lausan.hk/2019/reading-guide-police-prison-abolition-hong-kong/ • https://theappeal.org/san-francisco-deputy-public-defender-chesa-boudin-announces-run-for-district-attorney/ PS: It's the position of Asia Art Tours that many DAs (such as Cy Vance, Eric Holder, Kamala Harris and others) claim to be 'progressives' when they are anything but! We despise DAs like Vance, Holder and Harris, who will happily punish the weak but are terrified of prosecuting the powerful, and dare to call themselves "progressive." Though we support Boudin now, should he turn out to be a timid, milquetoast DA, we will happily invite critics on to excoriate his record. For now he has our support... but we'll be watching to see if his words on the campaign trail match his deeds if he makes it into office.
Tess Posner, CEO of AI4ALL, gives an overview of the field of AI and discusses representation in AI careers, the ramifications of not having diversity in tech, the role of allies, and the future of work. Links: AI4ALL; AI4ALL Summer Programs; "Decoding Diversity" (Intel study on the financial and economic returns to diversity in tech); "Tech Leavers Study" (Kapor Center for Social Impact); "There is a diversity crisis in AI, but together we can fix it." (Tess Posner); Algorithms of Oppression (Safiya Umoja Noble); Automating Inequality (Virginia Eubanks); The Gender Shades Project; AOC on Automation (SxSW 2019); Opportunity@Work; Open Learning; Tess on Twitter. See open positions at thoughtbot! Become a Sponsor of Giant Robots!
Bio: Kriti Sharma (@sharma_kriti) is an Artificial Intelligence expert and a leading global voice on ethical technology and its impact on society. She built her first robot at the age of 15 in India and has been building innovative AI technologies to solve global issues, from productivity to inequality to domestic abuse, ever since. Kriti was recently named in the Forbes 30 Under 30 list and was included in the Recode 100 List of Key Influencers in Technology in 2017. She was invited as a Civic Leader to the Obama Foundation Summit. She is a Google Anita Borg Scholar and recently gave expert testimony on AI policy to the UK Parliament in the House of Lords. While much of Silicon Valley worries about doomsday scenarios in which AI will take over human civilization, Kriti Sharma has a different kind of concern: What happens if disadvantaged groups don't have a say in the technology we're creating? In 2017, she spearheaded the launch of the Sage Future Makers Lab, a forum that will equip young people around the world with hands-on learning for entering a career in Artificial Intelligence. Earlier this year, she founded AI for Good, an organization creating the next generation of technology for a better, fairer world. Kriti also leads AI and Ethics at Sage. Resources: AI for Good; Kriti's TED Talk: How to Keep Human Bias out of AI; Automating Inequality by Virginia Eubanks; Winners Take All by Anand Giridharadas. News Roundup: Amazon shareholder effort to restrict company's facial recognition fails. Two Amazon shareholder resolutions to curb Rekognition—with a K—the company's facial recognition platform—failed to garner shareholder approval last week. One proposal would have required the company to determine whether the technology violates civil liberties before rolling it out to law enforcement. The other resolution would have required Amazon to conduct a study of human rights violations posed by Rekognition. While Amazon is reluctant to address these issues, Google and Microsoft have pledged not to sell their facial recognition to law enforcement. U.S. spy chief warns U.S. businesses about China. The Financial Times reports that U.S. Director of National Intelligence Dan Coats has been warning U.S.-based companies about doing business with China. Coats has even gone as far as sharing classified information with executives. The classified briefings come amidst a U.S. trade war with China, which includes a ban on China-based tech company Huawei doing business in the U.S. because of a cozy relationship it allegedly had with Iran and allegations that China is using the company's components to spy on the U.S. The Financial Times says the briefings have largely focused on the espionage and intellectual property threats China poses. Senate passes anti-robocall bill. A bipartisan bill introduced by Senators Ed Markey and John Thune, which would slap robocall offenders with a fine of $10,000 per call, passed the Senate with a vote of 97 to 1 on Thursday. The legislation also increases penalties for scammers and works to combat number spoofing. The bill is called the Telephone Robocall Abuse Criminal Enforcement and Deterrence (TRACED) Act and now heads to the House, where Democrat Frank Pallone has a similar bill in the works. Google tweaks abortion ad policy. Google has tweaked its policy for abortion ads after several misleading abortion ads showed up on the platform.
Now, the company's saying that it will certify advertisers who want to place abortion-related ads as either abortion providers or non-providers. Any advertiser that doesn't fall into one of those categories won't be able to run abortion ads on Google. Events: Wed., 5/29: AT&T/Carnegie Mellon Livestream: Privacy in the World of Internet of Things, 1pm-2:30pm. Fri., 5/31: Privacy and Civil Liberties Oversight Board Public Forum on the USA FREEDOM Act, 10:00AM-12:30PM, Reagan Building, 1300 Pennsylvania Ave., NW. Mon., 6/3: Federal Communications Commission Consumer Advisory Committee Meeting, 9:00AM, 445 12th St., SW.
Virginia Eubanks is an associate professor of political science at UAlbany's Rockefeller College of Public Affairs & Policy. On this episode of the UAlbany News Podcast, Eubanks talks about her book, 'Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor.' In the book, she details three examples of technology failing to streamline welfare programs: • an effort to automate eligibility processes for public assistance programs in Indiana • an electronic registry of the homeless in California • a statistical model in Pennsylvania that attempts to predict child maltreatment. These automated public service systems are designed to serve some of the country's most vulnerable populations, such as those living in poverty or contending with poor health, while at the same time saving the government time and money. But these technologies can leave poor families feeling tracked, targeted and trapped. Eubanks explains how these systems fail to remove human bias, exacerbate inequality and perpetuate a "Digital Poorhouse" for working-class people in America. The UAlbany News Podcast is hosted and produced by Sarah O'Carroll, a Communications Specialist at the University at Albany, State University of New York, with production assistance by Patrick Dodson and Scott Freedman. Have a comment or question about one of our episodes? You can email us at mediarelations@albany.edu, and you can find us on Twitter @UAlbanyNews.
Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor by Centre for Ethics, University of Toronto
Podcast description: "The Poor People's Human Economic Rights Campaign…the way they define poverty is: if at any point you have lacked access to one of your basic economic human rights." Virginia Eubanks is an Associate Professor of Political Science at the University at Albany, SUNY. She is the author of Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor; Digital Dead End: Fighting for Social Justice in the Information Age; and co-editor, with Alethia Jones, of Ain't Gonna Let Nobody Turn Me Around: Forty Years of Movement Building with Barbara Smith. Her writing about technology and social justice has appeared in Scientific American, The Nation, Harper's, and Wired. For two decades, Eubanks has worked in community technology and economic justice movements. She was a founding member of the Our Data Bodies Project and a Fellow at New America. She lives in Troy, NY. Find Virginia Eubanks on Twitter. Become a #causeascene Podcast sponsor, because disruption and innovation are products of individuals who take bold steps in order to shift the collective and challenge the status quo. All music for the #causeascene podcast is composed and produced by Chaos, Chao Pack, available on SoundCloud.
Holiday Rewind: Since the 2008 financial crash, one thing we've learned is that there exists in the US not just one economy, but many, as well as many kinds of economic actors. From platform cooperatives to cryptocurrency, people are building economic alternatives to climate devastation and capitalist extraction. So says Nathan Schneider, crusader for the internet of ownership and author of Everything for Everyone: The Radical Tradition That Is Shaping the Next Economy. Plus, Virginia Eubanks on how government and corporations are erasing social services through unequal digital practices. Music: "You Are the One the World Becomes" by Morley, from her new album '1000 Miles'. Support theLFShow, 10 Years of Making Power Through Media!
Juhan Sonin, designer, researcher, and MIT lecturer. Juhan specializes in software design and systems engineering. He has worked at Apple, the National Center for Supercomputing Applications, the Massachusetts Institute of Technology (MIT), and MITRE. I had the opportunity to record this episode in Juhan's GoInvo studio office, where he is the company's Creative Director. Website: https://www.goinvo.com/ WE MUST SET HEALTHCARE FREE: Opensourcehealthcare.org Udemy Blockchain/Healthcare Course ($125 off with HEALTHUNCHAINED coupon): https://www.udemy.com/blockchain-and-healthcare/?couponCOde=HEALTHUNCHAINED Show Notes: • Software design and systems engineering • Asynchronous telemedicine • People don't really care about their health until they are unwell • Blockchain use case to access medical records and proxy them from anywhere with internet • Location of conception will be part of your life (health) data • Ownership and co-ownership models for health data • Data Use Agreements • Open Genome Project • You've put your data out on the internet and your genetic data is open-sourced. Have you had any unexpected consequences from that decision? • Health Data Standards • Open-source Standard Health Record: http://standardhealthrecord.org/ • Data exchange problems are not only business and technology issues but generally human issues • Determinants of Health • Robot doctors and the future of healthcare • Black-box healthcare algorithms should be • Open source is the only way for Medicine: https://medium.com/@marcus_baw/open-source-is-the-only-way-for-medicine-9e698de0447e • Primary Care Manifesto • Patients' interests in owning their own health • Favorite books: The Elements of Style by William Strunk Jr.; Automating Inequality by Virginia Eubanks; Democracy in Chains by Nancy MacLean; The Color of Law by Richard Rothstein. News Corner: https://hitinfrastructure.com/news/aetna-ascension-sign-on-to-healthcare-blockchain-alliance On Dec 3rd, two new organizations announced that they will be joining the Alliance to be part of its first pilot project, which seeks to determine whether applying blockchain technology can help ensure the most current information about healthcare providers is available in the provider directories maintained by health insurers. The two organizations are Aetna, one of the top 3 health insurance companies in the US with $60 billion in revenue in 2017, and Ascension, the largest Catholic health system in the world and the largest non-profit health system in the US. To me this is really exciting news, because Aetna recently merged with CVS Health, making the combined provider directory information from these organizations huge.
Our guest today has been on a long crusade to raise awareness about how our digital tools are continuing and exacerbating the problems we already have around poverty and inequality. Virginia Eubanks is an Associate Professor of Political Science at the University at Albany, SUNY. She is the author of the book “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor”.
Berkman Klein Center for Internet and Society: Audio Fishbowl
Virginia Eubanks joins us for a rousing conversation about her timely and provocative book, Automating Inequality. In Automating Inequality, Eubanks systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. The book is full of heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lays dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile. "This book is downright scary,” says Naomi Klein, “but with its striking research and moving, indelible portraits of life in the ‘digital poorhouse,’ you will emerge smarter and more empowered to demand justice.” More info on this event here: https://cyber.harvard.edu/events/2018-10-23/automating-inequality
In this episode I talk to Virginia Eubanks. Virginia is an Associate Professor of Political Science at the University at Albany, SUNY. She is the author of several books, including Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor and Digital Dead End: Fighting for Social Justice in the Information Age. Her writing about technology and social justice has appeared in The American Prospect, The Nation, Harper's and Wired. She has worked for two decades in community technology and economic justice movements. We talk about the history of poverty management in the US and how it is now being infiltrated and affected by tools for algorithmic governance. You can download the episode here or listen below. You can also subscribe to the show on iTunes or Stitcher (the RSS feed is here). Show Notes: 0:00 - Introduction; 1:39 - The future is unevenly distributed but not in the way you might think; 7:05 - Virginia's personal encounter with the tools for automating inequality; 12:33 - Automated helplessness?; 14:11 - The history of poverty management: denial and moralisation; 22:40 - Technology doesn't disrupt our ideology of poverty; it amplifies it; 24:16 - The problem of poverty myths: it's not just something that happens to other people; 28:23 - The Indiana Case Study: Automating the system for claiming benefits; 33:15 - The problem of automated defaults in the Indiana Case; 37:32 - What happened in the end?; 41:38 - The L.A. Case Study: A "match.com" for the homeless; 45:40 - The Allegheny County Case Study: Managing At-Risk Children; 52:46 - Doing the right things but still getting it wrong?; 58:44 - The need to design an automated system that addresses institutional bias; 1:07:45 - The problem of technological solutions in search of a problem; 1:10:46 - The key features of the digital poorhouse. Relevant Links: Virginia's Homepage; Virginia on Twitter; Automating Inequality; 'A Child Abuse Prediction Model Fails Poor Families' by Virginia in Wired; The Allegheny County Family Screening Tool (official webpage - includes a critical response to Virginia's Wired article); 'Can an Algorithm Tell when Kids Are in Danger?' by Dan Hurley (generally positive story about the family screening tool in the New York Times); 'A Response to Allegheny County DHS' by Virginia (a response to Allegheny County's defence of the family screening tool); Episode 41 with Reuben Binns on Fairness in Algorithmic Decision-Making; Episode 19 with Andrew Ferguson about Predictive Policing.
10 years since the financial crash, we've learned that there exists in the US not just one economy, but many, as well as many kinds of economic actors. From platform cooperatives to cryptocurrency, people are continuously building economic alternatives. So says Nathan Schneider, crusader for collective ownership and author of "Everything for Everyone: The Radical Tradition That Is Shaping the Next Economy." Plus, professor and author Virginia Eubanks on how government and corporations are erasing social services through unequal digital practices. Music: "You Are the One the World Becomes" by Morley, from her new album '1000 Miles'. Support theLFShow
Supplement to episode 5: That episode includes a segment about and with Virginia Eubanks. It is based on extensive reading and on this research interview with Virginia, in English, in which she talks at length about her book "Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor".
Often the algorithms that shape our lives feel invisible, but every now and then you really notice them. Your credit card might get declined when you’re on vacation because a system decides the behavior seems suspicious. You might buy a quirky gift for your cousin, and then have ads for that product pop up everywhere you go online. In education, schools and colleges even use big data to nudge students to stay on track. As we create this data layer around us, there’s more and more chance for systems to misfire, or to be set up in a way that consistently disadvantages one group over another. That potential for systemic unfairness is the concern of this week’s podcast guest, Virginia Eubanks. She’s an associate professor of political science at SUNY Albany and a longtime advocate for underprivileged communities as well as an expert on tech. She’s the author of Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, which The New York Times called “riveting” and noted that that’s an unusual accomplishment for a book about policy and tech. EdSurge connected with Eubanks this month to ask her about her explorations of technology’s unintended consequences, and about what people in education should consider as they leverage big data systems.
Most of us have a growing sense that something's deeply wrong with the way that digital data is used to track and monitor us. But most of us don't realize that the poorest among us are particularly vulnerable. Virginia Eubanks argues that such data is used to criminalize and turn the poor away from public resources to which they are entitled. Resources: Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, St. Martin's Press, 2018. From KPFA: "Punishing the Poor."
Today, decision-making for social service and safety net programs - everything from Medicaid to food stamps, housing and rental assistance to child welfare - is controlled not by human beings, but by models programmed to follow a pre-determined set of criteria. What if these automated systems are actually a form of discrimination, working to perpetuate, rather than eliminate, the inequality they aim to address? Virginia Eubanks, Ph.D., explains.
Virginia Eubanks, Associate Professor of Political Science at the University at Albany, SUNY, discusses her new book. Since the dawn of the digital age, decision-making in finance, employment, politics, health and human services has undergone revolutionary change. Today, automated systems—rather than humans—control which neighborhoods get policed, which families attain needed resources, and who is investigated for fraud. While we all live under this new regime of data, the most invasive and punitive systems are aimed at the poor. Automating Inequality systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. The book is full of heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lays dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile.
Chelsea Barabas: How to balance AI and criminal justice (Ep. 136) MIT Research Scientist Chelsea Barabas and Joe Miller discuss how to balance AI and criminal justice to affect better defendant outcomes. Bio: Chelsea Barabas (@chels_bar) is a research scientist at MIT, where she examines the spread of algorithmic decision-making tools in the US criminal justice system. Formerly, Chelsea was the Head of Social Innovation with the MIT Media Lab's Digital Currency Initiative. She has worked on a wide range of issues related to the use of emerging technologies to serve the public good around the world. Chelsea's graduate research at MIT was on understanding the U.S.'s ongoing struggle to cultivate and hire a diverse technical workforce, and she conducted her graduate thesis in partnership with Code2040. She attended Stanford as an undergraduate, where she earned a B.A. in Sociology. Resources: Chelsea Barabas, Karthik Dinakar, Joichi Ito, Madars Virza, and Jonathan Zittrain. 2018. Interventions over Predictions: Reframing the Ethical Debate for Actuarial Risk Assessment. In Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT* 2018). ACM, New York, NY, USA. Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (2018). News Roundup: T-Mobile and Sprint announce merger plans. T-Mobile and Sprint have announced merger plans again. It's a $27 billion deal that would include Softbank giving up control of Sprint. The combined company would be called T-Mobile and, with 98 million subscribers, would become the second largest wireless carrier, behind Verizon's 116 million. Brian Fung and Tony Romm report in the Washington Post. Comcast launches bidding war against Fox for Sky. Comcast announced that it would seek to acquire European pay TV provider Sky for $31 billion. The Murdochs' 21st Century Fox already has a bid for Sky on the table, but it's $15 billion lower than Comcast's, even though Fox already has a 39% stake in Sky. 21st Century Fox had rejected a separate bid by Comcast to acquire Fox's entertainment assets, which Disney is now planning to purchase for $52.4 billion, which was also lower than Comcast's proposal. Shalini Ramachandran, Amol Sharma and David Benoit report in the Wall Street Journal. EU investigates Apple's Shazam bid. EU antitrust regulators are investigating whether Apple's bid for music identification service Shazam is anticompetitive. Apple had announced back in December that it was looking to acquire Shazam for an undisclosed amount. The EU is concerned the acquisition could limit consumer choice. Foo Yun Chee has more at Reuters. Senate confirms Nakasone to lead NSA/Cyber Command. The Senate unanimously confirmed U.S. Army Cyber Command chief Lt. Gen. Paul Nakasone to serve as both the head of the National Security Agency and U.S. Cyber Command. He'll replace Mike Rogers. Nakasone will also get a fourth star. SEC fines company formerly known as Yahoo! $35 million. The Securities and Exchange Commission fined Altaba, the company that now owns Yahoo!'s remaining assets, over $35 million. The fine is for failing to disclose a 2014 data breach that compromised the data of over 500 million Yahoo! users. Jacob Kastrenakes reports in The Verge. FTC warns app firms about collecting children's data. The Federal Trade Commission has warned app firms in China and Sweden about collecting the data of U.S. children.
The Children's Online Privacy Protection Act prohibits the collection of such data and applies to foreign companies. The China-based Gator Group and the Sweden-based Tinitell both sell smartwatches to children. In other news related to children's privacy, YouTube has announced new parental controls for YouTube Kids. Parents will now be able to limit recommendations, and suggestions will now be made by humans. Did Diamond and Silk commit perjury? In testimony before the House Judiciary Committee last week, conservative African American internet personalities Diamond and Silk said under oath that President Trump's 2016 presidential campaign never paid them. But there's a 2016 Federal Election Commission (FEC) filing showing that the campaign paid them $1,275 for "field consulting". Harper Neidig reports in The Hill. CBC members to meet in Silicon Valley to discuss diversity. The Congressional Black Caucus is sending the largest delegation of lawmakers it has ever sent to Silicon Valley to discuss diversity. Just 3% of Silicon Valley tech workers are black, according to a Center for Investigative Reporting study. Shirin Ghaffary reports in Recode. Facebook warns SEC about more data misuse. In a Securities and Exchange Commission filing, Facebook indicated that additional reports of the misuse of user data are likely forthcoming. The social media giant said it is conducting a third-party audit, which it anticipates will reveal additional improprieties. Google's Sergey Brin warns about AI threat. Finally, Google co-founder Sergey Brin warned in the company's annual Founders' Letter about the future of AI and the fact that it is already transforming everything from self-driving cars to planetary discovery. Brin said he is optimistic about Artificial Intelligence, and that Alphabet is giving serious consideration to the ways in which AI will affect employment, how developers can control for bias in their algorithms, and the potential for AI to "manipulate people." James Vincent notes in The Verge that Brin's letter does not discuss the dangers of using AI for military intelligence, although the company has said its technology would be used for "non-offensive purposes only". Still, several employees at the company are urging Alphabet to withdraw from its plans to work with the Pentagon.
In this week's episode we interviewed Virginia Eubanks and talked about her newest book, "Automating Inequality." Enjoy the enlightening discussion, along with some less enlightening tech news about garlic bread and Amazon.
Virginia Eubanks systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. Virginia Eubanks is an associate professor of Political Science at the University at Albany, SUNY. For two decades, she has worked in community technology and economic justice movements. And she is also a founding member of the Our Data Bodies Project and a fellow at New America.
Will AI just wind up automating inequality?, a Policy Options podcast. Proponents of automation say the developments will create a more efficient and advanced society, but there are concerns that the changes will not affect all citizens equally. According to Virginia Eubanks, the automation of social and welfare services in the United States is creating a "digital poorhouse,” deepening class divides and diverting poor and working-class people from accessing public resources. Eubanks joined the podcast to discuss her new book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. She is an associate professor of political science at the University at Albany, SUNY. Download for free. New episodes every second Tuesday. Tweet your questions and comments to @IRPP. Read the Policy Options feature series on the Ethical and Social Dimensions of AI.
Virginia Eubanks speaks about her most recent book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. Eubanks systematically shows the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. The book is full of heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lays dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile. The U.S. has always used its most cutting-edge science and technology to contain, investigate, discipline and punish the destitute. Like the county poorhouse and scientific charity before them, digital tracking and automated decision-making hide poverty from the middle-class public and give the nation the ethical distance it needs to make inhuman choices: which families get food and which starve, who has housing and who remains homeless, and which families are broken up by the state. In the process, they weaken democracy and betray our most cherished national values.
inSocialWork - The Podcast Series of the University at Buffalo School of Social Work
This episode is the second of two parts that explore social justice in the information age. In it, Dr. Virginia Eubanks continues her discussion on this topic with a question and answer exchange with members of University at Buffalo School of Social Work community.
inSocialWork - The Podcast Series of the University at Buffalo School of Social Work
This episode is the first of two with Dr. Virginia Eubanks. In it she discusses her work in understanding technology in the lives of low-income communities as well as how technology is used to manage the poor. She highlights an attempt to use technology to change the eligibility and case management processes for financial assistance as an example of why this topic is an important social justice issue.
“The poor” aren't other people – they're us. According to recent scholarship, by the time we're 75 years old, 59 percent of us will fall below the poverty line at some point in our lives. Factoring in related experiences like near-poverty, unemployment, or use of public assistance, that number climbs to a staggering 80 percent. In this episode, Ford Academic Fellow and SUNY-Albany professor Virginia Eubanks talks with New America Managing Editor Fuzz Hogan about the biggest thing we can do to address inequality in this country: recognizing that poverty is a majority issue and something that impacts us all.