POPULARITY
Algorithms are pervasive in our society, making thousands of automated decisions on our behalf every day. Digital discrimination is a very real threat, and it can easily occur accidentally (i.e., outside the intent of a system's designers and programmers). Christian Sandvig joins us in this episode to talk about his work and the concept of auditing algorithms. Christian Sandvig (@niftyc) has a PhD in communications from Stanford and is currently an Associate Professor of Communication Studies and Information at the University of Michigan. His research studies the predictable and unpredictable effects that algorithms have on culture. His work on auditing algorithms has framed the conversation about how and why we might want oversight of the way algorithms affect our lives. His writing appears in numerous publications, including The Social Media Collective, The Huffington Post, and Wired. One of the papers we discussed in depth on this episode was Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms, which is well worth a read.
We take another look at algorithms. Tim Hwang explains how Uber's algorithms generate phantom cars and marketplace mirages. And we revisit our conversation with Christian Sandvig, who last year asked Facebook users to explain how they imagine the EdgeRank algorithm works (the algorithm that powers Facebook's news feed). Sandvig discovered that most of his subjects had no idea there was even an algorithm at work. Plus, James Essinger and Suw Charman-Anderson tell us about Ada Lovelace, the woman who wrote the first computer program (or, as James puts it, algorithm) in 1843.
With Michelle Meyer, a scholar of bioethics and law and a longtime listener of this show, we talk about human testing and Facebook. There's a lot to talk about, but it doesn't dissuade us from our customary introductory nonsense, this time including a gift from listener Michelle, Star Wars, Joe's mangling of last names, and Joe — and this actually happened — eating dog food. If you hate fun and want to get right to the colloquium part of America's Faculty Colloquium, it starts a little after 23 minutes in. Should corporations be able to experiment on their customers and employees without their consent? Don't they all do that, and haven't they always? Don't we all do that? Does it matter whether Facebook is more like a burrito stand or a utility? Mmmm… burritos.
This show's links:
Michelle Meyer's web page, faculty profile, and writing
America's Team
The excellent Phantom Menace poster
Vindu Goel, Facebook Tinkers with Users' Emotions in News Feed Experiment, Stirring Outcry
Christian Sandvig, Karrie Karahalios, and Cedric Langbort, Uncovering Algorithms: Looking Inside the Facebook News Feed
Michelle Meyer, Everything You Need to Know About Facebook's Controversial Emotion Experiment (Wired)
About social comparison theory and emotional contagion
Adam Kramer, Jamie Guillory, and Jeffrey Hancock, Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks
The Belmont Report
The Common Rule
Michelle Meyer and Christopher Chabris, Please, Corporations, Experiment on Us (N.Y. Times)
James Grimmelmann, Illegal, Immoral, and Mood-Altering (Medium)
Michelle Meyer et al., Misjudgements Will Drive Social Trials Underground (Nature)
Michelle Meyer, Two Cheers for Corporate Experimentation: The A/B Illusion and the Virtues of Data-Driven Innovation
Michelle Meyer, More on the A/B Illusion: IRB Review, Debriefing, Power Asymmetries, and a Challenge for Critics
Special Guest: Michelle Meyer.
When I was in Beijing last summer, I dropped by the Microsoft research campus to talk with Dr. Yu Zheng. He studies the air pollution in his city and the noise pollution in mine. Using algorithms, he is able to predict what kinds of noises New Yorkers are most likely to hear in their neighborhoods; take a look at his Citynoise map. His algorithms could one day help city planners curb air and noise pollution, or, as Christian Sandvig notes, they could be used by the GPS apps on our mobile devices to keep us from walking through neighborhoods perceived to have loud people hanging around outside. Christian Sandvig studies algorithms, which is hard to do: most companies, like Facebook and Google, don't make their algorithms public. In a recent study, he asked Facebook users to explain how they imagine the EdgeRank algorithm works (the algorithm that powers Facebook's news feed). Sandvig discovered that most of his subjects had no idea there was even an algorithm at work. When they learned the truth, it was like a moment out of The Matrix. But none of the participants remained angry for long: six months later, they mostly reported satisfaction with the algorithms that determine what they can and can't see. Sandvig finds this problematic, because our needs and desires often don't match those of the companies who build the algorithms. "Ada's Algorithm" is the title of James Essinger's new book. It tells the remarkable story of Ada Lovelace, the woman who wrote the first computer program (or, as James puts it, algorithm) in 1843. He believes Ada's insights came from her "poetical" scientific brain. Suw Charman-Anderson, the founder of Ada Lovelace Day, tells us more about this remarkable woman.
Step into a world where information floats through the open air — around your house, around your town, around the world — just waiting for you to reach up and grab it. Music, movies, phone calls from loved ones, the sound of your baby crying — it all travels through the electromagnetic space known as spectrum. The average consumer might interact with a dozen or more devices on a daily basis, designed specifically to emit information into the air and pluck it back out. But how much do we really understand about spectrum? There are fights going on right now between regulators, media and telecommunications companies, and consumers over how spectrum is bought, sold, and used. Today's guest argues that we need to go back to basics to settle these battles effectively. Christian Sandvig is a Berkman Center Fellow and director of the Project on Public Policy and Advanced Communication Technology at the University of Illinois Urbana-Champaign. On this week's show, David Weinberger and Sandvig elegantly deconstruct spectrum and give us some ideas about how we can use this space more effectively. CC music this week: Jeremiah Jacobs, Bounce Loops