POPULARITY
Do Spotify’s algorithms make a listener’s music taste, or does taste make the algorithm? Nick Seaver embedded himself as an ethnographer at a music recommendation software firm to learn about the very real way very specific people influence the algorithms that power our automated world. Nick Seaver directs the program in Science, Technology, and Society at Tufts University. (Episode 80: *Slaps Roof of Algorithm* You Can Fit So Much Taste in This Thing, with Nick Seaver.)
Today's episode features a discussion with Nick Seaver, a professor at Tufts University and the author of Computing Taste: Algorithms and the Makers of Music Recommendation from the University of Chicago Press. Nick is an anthropologist who studies how people use technology to make sense of cultural things. His book is the product of ethnographic observation and conversations with developers working on music recommendation algorithms and other systems designed to understand and cater to user preferences. His research gives us a better understanding of the motivations of the executives and engineers designing systems to command our attention, which he considers to be “a currency, a capacity, a filter, a spotlight, and a moral responsibility.”
In the digital economy, recommendation algorithms get…a LOT of attention. To some, they're the special sauce behind everything from Spotify's personalized playlists to TikTok's “For You” page. For others, they represent a dark, vibe-generating demiurge slowly sapping music's social power. But for all the discussion of how these programs are transforming our world(s), there's surprisingly little analysis of what, exactly, they are, or how they're meant to work. Answering these seemingly simple questions is the goal of Nick Seaver's new book “Computing Taste,” which explores the identities, goals, and practices of the programmers behind these technologies. Far from Machiavellian manipulators, the coders he describes are surprisingly idealistic music-lovers, desperately trying to analyze an almost infinitely complex cultural practice. Their failures to do so, and the ideologies they adopted as a result, would have enormous implications for the development of digital music, remaking genres, redefining listening, and shaping the platforms at the heart of the modern industry. Put it this way: we'll definitely never look at a "Discover Weekly" playlist the same way again.
On the morning of Friday, March 10, 2023, Nick Seaver and Ana Carolina met over Zoom to talk about his new book Computing Taste: Algorithms and the Makers of Music Recommendation, which was published in 2022 by the University of Chicago Press. Transcript available at https://blog.castac.org/2023/04/an-anthropology-of-algorithmic-recommendation-systems/ (This episode is available in additional languages on Platypus, The CASTAC Blog.)
We are joined by Nick Seaver – author of the new book Computing Taste: Algorithms and the Makers of Music Recommendation – to discuss his excellent ethnographic research on the creation of algorithmic recommendation systems. Nick spent a long time getting to really know the people who make these systems, and his book offers so much original, granular detail about the various practices, theories, and relationships that influence the engineers behind these algorithms. Nick's book also contains extremely interesting and surprising applications of classic anthropological theory to contemporary algorithmic technology. ••• Nick's book – Computing Taste: Algorithms and the Makers of Music Recommendation https://press.uchicago.edu/ucp/books/book/chicago/C/bo183892298.html ••• Nick's twitter – https://twitter.com/npseaver Subscribe to hear more analysis and commentary in our premium episodes every week! https://www.patreon.com/thismachinekills Hosted by Jathan Sadowski (www.twitter.com/jathansadowski) and Edward Ongweso Jr. (www.twitter.com/bigblackjacobin). Production / Music by Jereme Brown (www.twitter.com/braunestahl)
The seventeenth episode of CURSED WITH GOOD IDEAS, ninety-three minutes of collaborative filtering with Dino Chang, Gabriele de Seta, Patrick Harrison, and Asa Roast asking uninformed questions to Nick Seaver. In this episode: cybernetics, algorithms, recommender systems, and methodological cope. Hums and croaks courtesy of postgraduate mobility. LINKS: - Nick Seaver's "Computing Taste": https://press.uchicago.edu/ucp/books/book/chicago/C/bo183892298.html - Nick Seaver's scholarship: https://scholar.google.com/citations?user=GORu8nQAAAAJ&hl=en - Lana Swartz's "New Money": https://yalebooks.yale.edu/book/9780300233223/new-money/ - Bernard Dionysius Geoghegan's "Code": https://www.dukeupress.edu/code Support CWGI: https://en.liberapay.com/CWGI/
The people who make music recommender systems have lofty goals: they want to broaden listeners' horizons and help obscure musicians find audiences, taking advantage of the enormous catalogs offered by companies like Spotify, Apple Music, and Pandora. But for their critics, recommender systems seem to embody all the potential harms of algorithms: they flatten culture into numbers, they normalize ever-broadening data collection, and they profile their users for commercial ends. Drawing on years of ethnographic fieldwork, anthropologist Nick Seaver describes how the makers of music recommendation navigate these tensions: how product managers understand their relationship with the users they want to help and to capture; how scientists conceive of listening itself as a kind of data processing; and how engineers imagine the geography of the world of music as a space they care for and control. Computing Taste: Algorithms and the Makers of Music Recommendation (U Chicago Press, 2022) rehumanizes the algorithmic systems that shape our world, drawing attention to the people who build and maintain them. In this vividly theorized book, Seaver brings the thinking of programmers into conversation with the discipline of anthropology, opening up the cultural world of computation in a wide-ranging exploration that travels from cosmology to calculation, myth to machine learning, and captivation to care. Nick Seaver is Assistant Professor in the Department of Anthropology and the director of the Science, Technology, and Society program at Tufts University. Mathew Gagné is Assistant Professor in the Department of Sociology and Social Anthropology at Dalhousie University. Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/new-books-network
What music do you like? How do you know? And how does your favorite music streaming app know – or not? Questions of music taste and how the people creating music recommender systems define it motivate Nick Seaver's book, Computing Taste: Algorithms and the Makers of Music Recommendation. On today's show, I'm joined by Nick to discuss his book, his own taste in music, and what he learned as he talked, worked, and hung out with the people behind the algorithms. Nick Seaver, PhD, is an Assistant Professor of Anthropology at Tufts University, where he is also the director of the program in Science, Technology, and Society.
“Sticky websites are traps designed to capture people.” Artisanal traps have two faces: they are where hunter and prey, two entities, meet. Computers are a kind of human trap. Can we see the image of the hunter and the prey in these computer-traps in the same way as in anthropological traps? The interface is, in fact, the meeting point between worlds. Engagement is built into the very structure of how our technical products work. This means that reimagining the relations between designers, coders, and users is the way to “get out of the trap.” Nick Seaver is an anthropologist of technology who studies how people use technology to make sense of cultural things. In this talk he explores how we have created traps throughout history: we used to build them to capture animals; today we capture human minds through digital screens, a phenomenon called “addiction by design” and “sticky websites.”
Earlier this week, my colleague Adam Mastroianni published an essay on what he called "cultural oligopoly." An ever-smaller number of artists create an ever-larger percentage of what we watch, read, and listen to. Mastroianni presents data showing that through the year 2000, only about 25 percent of a single year's highest-grossing movies were spinoffs, franchises, or sequels. Now it's somewhere in the neighborhood of 75 percent. He has similar data for hit TV shows, books, and music. Why is this happening?

My guest today is Nick Seaver, a cultural anthropologist at Tufts University. For the last decade or so, Nick has studied the social processes underlying the creation of music recommender systems, which form the algorithmic basis for companies like Spotify and Pandora. I've admired Nick's work for a long time. As an anthropologist, he is interested not necessarily in the nitty-gritty details of how these algorithms are constructed, but rather in who is constructing them and what these people believe they are doing when they make decisions about how the algorithms ought to work.

The core of Nick's work centers on taste, and how these companies and their algorithms subtly shape not only what we consume but what we like. When Nick started this line of work in the early 2010s, it really wasn't clear how big an impact these recommender systems would have on our society. Now, his expertise gives an ever more incisive look at the central themes of many large societal conversations around the content we consume and our everyday digital existence. But I came into this conversation with Mastroianni's question at the top of my mind, and I think Nick's research can give a crucial insight, at least into one piece of the puzzle.

One of Nick's papers relates an ethnographic study of music recommender system engineers. In the interest of protecting the identity of his informants, he gives the company a fictional name, but it bears conspicuous resemblance to Spotify. As a naive observer, one might think that the way these engineers think about their audience is in terms of demography: this kind of person likes this kind of music. If they can figure out the kind of person you are, they can recommend music that you'll probably like. But that turns out not to be the dimension of largest variance.

Instead, Nick introduces the concept of "avidity": essentially, how much effort is a listener willing to put in to find new music? This turns out to be the first distinction that these engineers make between listeners. And it forms a pyramid. On the bottom you have what one of his informants called the "musically indifferent." This makes up the majority of listeners. Their ideal listening experience is "lean-back." They want to press play, then leave the whole thing alone. It is a passive listening experience: no skipping songs, no wondering what other tracks might be on the album.
From there, it goes from "casual" and "engaged" listeners to the top of the pyramid, the "musical savant." These are "lean-in" listeners who take an active role in discovering new and different kinds of music.

"The challenge," Nick writes, "is that all of these listeners wanted different things out of a recommender system." Quoting one of his informants, code-named Peter, he says: "in any of these four sectors, it's a different ball game in how you want to engage them." As Nick summarizes it: "what worked for one group might fail for another."

Nick continues: "As Peter explained to me, lean-back listeners represented the bulk of the potential market for music recommendation in spite of their relatively low status in the pyramid. There were more of them. They were more in need of the kind of assistance recommenders could offer, and successfully capturing them could make 'the big bucks' for a company."

Nick relates the slightly more forthcoming perspective of another engineer, code-named Oliver: "It's hard to recommend shitty music to people who want shitty music," he said, expressing the burden of a music recommendation developer caught between two competing evaluative schemes: his own idea about what makes good music, and what he recognizes as the proper criteria for evaluating a recommender system.
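To make the segmentation logic concrete, here is a minimal sketch in code. The four tier names come from Seaver's account above; everything else, including the engagement signals, the single "effort" score, and the threshold values, is invented for illustration and does not come from the book or from any real recommender system.

```python
# Hypothetical sketch of the "avidity pyramid" described above.
# Tier names follow Seaver's informants; the signals and cutoffs are made up.

from dataclasses import dataclass

@dataclass
class Listener:
    searches_per_week: float        # deliberate, lean-in actions
    playlist_edits_per_week: float
    skips_per_hour: float

def avidity_score(listener: Listener) -> float:
    """Collapse effortful-listening signals into one rough score."""
    return (listener.searches_per_week
            + 2.0 * listener.playlist_edits_per_week
            + 0.5 * listener.skips_per_hour)

def avidity_tier(listener: Listener) -> str:
    """Cut the score into the four pyramid tiers (illustrative cutoffs)."""
    score = avidity_score(listener)
    if score < 1:
        return "musically indifferent"  # lean-back: press play, walk away
    if score < 5:
        return "casual"
    if score < 15:
        return "engaged"
    return "musical savant"             # lean-in: actively hunts for music

# A listener who never searches or edits playlists lands at the bottom tier.
print(avidity_tier(Listener(0.0, 0.0, 0.2)))  # -> musically indifferent
```

The only point of the sketch is that avidity is a single behavioral axis: each tier is a cut along one dimension of effort, which is why, as the engineers note, each tier effectively needs a different product.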
In the course of our conversation, Nick and I cover not only his studies of music recommender systems but also his more recent work taking an anthropological approach to attention. We tend to think of attention as a highly individualized process: gazing into the screen of your phone, for example, or turning your head to identify the source of an unexpected noise. But attention is also a social and cultural process. We attend collectively to certain stories, certain memes, certain ideas. What exactly the connection is between these two forms of attention is not obvious, and Nick's current line of work is an attempt to draw it out.

But the larger theme here is that music recommender systems are one battle in the larger war for our collective attention. What Spotify, Netflix, and Twitter all have in common is that their success is proportional to the extent to which they can dominate our attention. This is known in Silicon Valley as the idea of "persuasive technology." And one way to begin to understand the origins of cultural oligopolies starts with Nick's observation about avidity. The vast majority of listeners or viewers tend to go with the default option with which they're presented. Another way of putting it is that their preferred mode is habitual autopilot.

While recommender systems make up just one part of this content ecosystem, this principle remains stable across its many different layers. The more we go with our habitual default options, the more control these platforms have over us. The more we rely on these companies to define our tastes for us, the more homogeneous our tastes will become.

Nick's forthcoming book is "Computing Taste." It comes out in December 2022. Keep an eye out for it. And if you enjoy this episode, you can subscribe to my Substack newsletter at againsthabit.com or leave a five-star review on iTunes. Thank you for listening. Here is Nick Seaver.

Cody: One of your current areas of interest is attention. And while I think this is a topic that is a pillar of how we understand our own modern lives, and definitely has a long history of study in fields like psychology, it's not really something that anthropologists have covered as much in a direct way. So I'm curious to get your current perspective on why people talk about attention so much, what this word might really mean, and what an anthropological take on it might show us.

[00:07:46] Nick: Yeah. My interest in attention stemmed from the earlier work that I did, my PhD dissertation project and first book, which was about the developers of music recommender systems. And one of the things you realize if you study recommender systems at all is that people are really interested in attention.

They're interested in the ways that you measure it: how you measure whether someone is listening to some music, what they like on the basis of their listening habits, how you might encourage them to listen more, how to do all this stuff with their attention. And that was lurking in the background for me for a long time.

So when I had a chance to design a new seminar to teach at Tufts for our anthropology undergraduates, I thought: okay, I want to learn more about attention and try to find stuff about it. So I proposed a course, which I called "How to Pay Attention," which was a little bit of a clickbaity title; we don't really do attention hacks or anything. It was a chance for me to read really broadly across media studies, history, psychology, cognitive science, and some anthropology and art history, to think about: what is this thing? What is this concept that seems so important for the way that people describe anything in the world now?

And as an anthropologist, I was struck by that, because when you find a concept that does so much work for people (and I would argue it's hard to find one doing more work in the present moment than attention), you know you've got something culturally rich. But a lot of the ways we talk about attention in public, the kind of popular discourse around attention, is very narrow. It's very individualizing; it treats attention as a thing that happens in individual brains.

So the line I like to give is this: what would it look like to take an anthropological approach to attention? It would look like putting attention in a social context and in a cultural context.

And my thumbnail definitions of those are: society is this world of relationships and roles in which people live. It's where you have bosses and spouses and professors and students and pets and doctors and sheriffs and all these other kinds of roles that people occupy. And we clearly pay attention within those social structures, right? We pay attention to the same things as each other. If I'm sitting in a classroom with students, they're paying attention to me and each other in certain ways that are governed by our social roles and relationships. And we also pay attention in a cultural context, which means we pay attention in a world where we value certain things, sort of arbitrarily, where we make associations between certain kinds of entities and other entities.

So we might say: oh, let's focus our attention over here, and we talk about our attention as though it's a kind of lens or an optical instrument. Or we'll talk about attention as being like a filter, right? We have information overload because there's not enough filtering happening between the information in the world and our heads.

These are all cultural phenomena. There's nothing intrinsically attention-like about them.
And to my mind, studying how people make sense of attention in the present moment, in these cultural contexts, is just a fascinating question. So that's how I got into it, and where I think an anthropological approach is different from the stereotypical psychological approach. Not that all psychologists are like this, but the stereotypical psychology approach would be: let's do experiments with reaction times and individual people in a lab setting. And that's not really what I'm interested in. I'm really interested in the fact that people talk about attention all the time, and they use it to explain all sorts of things, and they think that it's really important.

[00:11:12] Cody: There's definitely a trope in psychology that whatever you are studying, whether it's memory or visual search or whatever it is, you can always at some point just boil it down to some explanation: oh, well, this is what the person is attending to; this is what their attention is focused on. But it's often kind of just a hand-waving way of saying, oh, well, yeah, it's what they're concentrating on, without having any specific idea of what that really means. So I'm curious: what do you think we've misunderstood about attention by individualizing it and overlooking those social and cultural contexts?

[00:12:01] Nick: I would say one thing is to note that there are lots of folks working at the intersection of philosophy and cognitive science who are very interested in that kind of circularity of explanation that you just described, who say: wait a minute, what does attention mean, then? One philosopher whose work I'm familiar with is Carolyn Dicey Jennings, who works in close collaboration with cognitive scientists and is interested in offering a philosophically rigorous account of attention that isn't just the thing you point to when you've given up on giving explanations.

But one reason I love reading in cognitive science around this is that you start to realize that it seems really obvious what attention is. And of course, the famous line that everyone has to quote in all of their articles and books seems to be from William James, the godfather of American psychology, who says everyone knows what attention is, and then gives you the basic definition: it's when you focus on something and sort of don't focus on other things. But of course, when you push on attention, it's not really clear what it is. It's sort of a grab-bag concept that pulls together all sorts of stuff. It includes your ability to focus for a long time, your sort of endurance. It includes vigilance. It includes the sheer arousal state: if you're really sleepy, you're maybe not as attentive. It also includes that basic filtering capacity, the ability, in a crowded room, to listen to the person who's talking to me instead of hearing all the other stuff that's happening. These are all things that you may not necessarily want or need to combine into a single concept; there's not really internal coherence there. And that's a problem for psychologists, right? They say: we want to be studying one thing; we don't want to be accidentally mixing a bunch of different references.
But it's really normal in a cultural context for any given symbol (say, attention here) to mean lots of different things, and to be specifically a way to draw together a bunch of different discourses in one place.

So to my mind, that got me thinking: well, attention just is a cultural phenomenon, as a defined thing. The fact that we think of a first grader's ability to sit in their chair in the classroom for a long time as being the same thing as my ability to listen to you, and not just have my mind wander off to some other thing while we're talking: those don't have to be the same as each other. And yet we think of them as being totally connected to each other.

Another example I like to give, to show the various layers at which attention works in common usage, has to do with Donald Trump, which is not the most fun example. But there was a lot of attentional discourse around Trump. When he was elected, there was this sense of: oh, the press was not paying attention to the right people. This was a surprise to some people because there was not collective attention to the right parts of society; there was not an awareness of what was happening.

So there's an attention that's not an individual's attention, right? That's, like, everybody's attention. But what is that? That's not the same thing as what happens in the brain. All of those things tangle together through this weirdo concept that nobody seems to really question. We really take it for granted as an obvious, important thing.

[00:15:10] Cody: You mentioned in one of your papers this metaphor that I'm really interested in. It's that the way we usually talk about attention is in terms of "paying" attention, which is based in an economic metaphor. And certainly I hear a lot of people talking like: "Okay, well, your most valuable asset is your time. No, no, no, actually, wait, that's just the convention. Really, your most valuable asset is your attention," which is kind of this psychological function of time. Anyway, that's how we normally talk about attention, but you propose this idea that actually the verb there should be "doing" attention, as in some action-forward notion of what it means to attend. So can you say a little bit more about what that means?

[00:16:00] Nick: Clearly the economic metaphor is in many ways the dominant attentional metaphor at the moment. Of course, there's the sense of paying attention, and there's also this idea that we live in an attention economy. The classic explanation for what that means is from Herbert Simon, a cognitive scientist, political scientist, economist, et cetera, working in the late post-war period in the United States, who says: you might say we live in an information economy, but that's not really true, because we have tons of information. Information is not scarce; but information consumes attention, and therefore attention is the scarce resource.
And if economics is the study of how to allocate scarce resources, that means that attention is the thing that is being economized.

That's not an argument we have to agree with necessarily, but it's the groundwork for thinking about how attention itself might be an economic kind of thing, and how it's become really natural, I think, for lots of people across all sorts of political orientations and disciplinary affiliations to think of their attention as naturally economic. We might question all sorts of applications of economic logics to other domains, but attention is a hard nut to crack. It really feels like: sure, we don't like the way people try to economize every last part of our lives, but isn't attention just that? You have a limited amount of it. You have a limited amount of time. What else could it be? And so I think one of the things you're pointing to in your question is this history in the social sciences of a real skepticism around the role of money in society.

The classic spot for this is Georg Simmel, the sociologist writing around the turn of the 20th century, who gives what my PhD advisor used to call the money-as-acid hypothesis: the argument that when you introduce money, and the assigning of prices to things, into domains where it didn't exist before, it tends to reduce everything to the monetary as a lowest common denominator. You start to think of everything in terms of how much it's worth. And that feels not great in a lot of domains. It allows some people to do some things very strategically, but generally we take it as a sad thing that money dissolves some of the richness of social interaction and becomes the basis for everything. It's the source of the phrase "time is money," right? This idea that time is money, that's why it's important.

But what you're pointing at is that now we've got a kind of shift in the way that discourse happens. It's not really the case that time is money; it's more that money lets you buy time. And some people are suggesting that the most fundamental value thing is your time, or maybe your attention. And that is so interesting to me, because now we've got the attention-as-acid hypothesis: attention, and this accounting of any kind of social life in terms of how much attention we're paying to what, becomes the framework in which basically anything can be expressed. It almost feels more fundamental than money to some people. If money is an arbitrary imposition, attention is just the real thing.

And as an anthropologist, my interest is not so much in deciding whether that's true or not, but in cataloging and noting the way that works, the way that people talk about it, because it's something that's pretty emergent at the moment. It's not quite obvious yet what it's going to mean, what's going to happen, as people take this more and more seriously.

[00:19:32] Cody: So, as you alluded to at the beginning, attention is kind of this big topic that we all understand as a governing force in our lives, though we're not really sure what it is in either a colloquial sense or a professional academic sense.
But whatever it is, it's definitely critical to whatever we're doing over here in psychology.

And you began to understand that through your research on music recommender systems, which has been your main area of study for the past 10 years or so: the kind of recommender systems and algorithms used by platforms like Spotify and Pandora, and all that sort of stuff. You've done a series of in-depth ethnographic studies, which will come together in your book, Computing Taste, which I'm really looking forward to reading when it's out this December. But I want to get into some of that material now.

[00:20:28] Nick: Sure.

[00:20:29] Cody: So one of my favorite papers of yours is called "Seeing Like an Infrastructure: Avidity and Difference in Algorithmic Recommendation." Can you tell me a little bit about this concept of avidity and how it plays out in the way engineers think about music recommender systems?

[00:20:48] Nick: So that piece, "Seeing Like an Infrastructure," came about (it's going to be partly in this book) because I wanted to know how the people building recommender systems, for music in particular, thought about their users. This is basic stuff, but it's very important, right? The way you build your technology is going to be shaped by the people that you think use it. A side question that rose to great public prominence during the time I was working on this project, over the past, like you said, 10 or 12 years, was the question of diversity within these fields.

It is a well-known problem, certainly by now, that there is a lot of demographic homogeneity in tech companies and among the people who build these software systems. And many people suggest that some of the shortcomings of these systems (biased outputs, some of the racist outcomes we get from some machine learning systems) may be directly traceable to that lack of diversity on the teams of the people who build them. So a side question here for me was: how did the people building these systems understand diversity? Because there's more than one way to think about what diversity means and what kind of effect it might have on the technologies that you build.

One of the things I realized was that when the developers of recommender systems talked about music listeners, they were very well aware that the people who used a recommender system were not really like the people who built it. And that's a kind of realization that doesn't always happen; its absence has been the subject of critique in lots of domains. Some people call that absence the "I-methodology," which is when someone builds a system because it meets their own needs and assumes that they are like their users. So you get this class of startup ideas, like laundry delivery, which exists because you've got a bunch of dudes who have just graduated from college, don't want to do their own laundry, and are trying to solve their own problems. But the people working on music recommendation seemed pretty aware that they're not like the people who are using this.
So the question then is: how? Well, the main thing people would talk about, when they talked about how they were different from their users and how their users might be different from each other, was what I ended up calling avidity, which is my term for a collection of ideas that you could sum up basically as: how into music are people? How avidly do they seek out new music? How much do they care about music? How much do they want to listen to music? How much work do they want to put into finding things to listen to? And a recommender system, as you might guess, is generally geared, especially these days, toward less avid listeners. They're intended for people who don't really want to put that much effort into deciding what to listen to. If you knew what you wanted to listen to, you would not need an algorithmic recommendation.

But on the other hand, the people who worked in these companies were generally very, very enthusiastic about music. And so when they were building recommender systems, they understood themselves as having to build them for someone who was not like them, which poses this question: how do you know what your users are like, then? If they're not like you, what are you going to do?

In short, the argument in the piece is that they come to understand their users primarily through the infrastructures that they build. They learn things about their users through the data collection apparatus, through the infrastructure that they create. An infrastructure is designed to capture things like how much you listen, where you click, the frequency of your listening to certain artists, and so on. And in that data collection, what's most obvious? Avidity. How much you listen, how much clicking you do, because here's a database that's full of click events, listening events, and so on. So I argue in that piece that avidity is both a kind of cultural theory about how people are different from each other, and also something that's very closely tied to the specific infrastructure that they work on.

They want to try to be rational. They want to try to be objective. They don't want to build from their own personal experience; they're aware of that shortcoming. But their solution is a circular one: using the actual data collection infrastructure that they've been building on. So they kind of reinforce this vision of avidity at the center, in place of other kinds of variety that some of their critics might care about, such as demographic difference and so on.

[00:25:22] Cody: Yeah, so that to me is such a fascinating insight. If you're someone who doesn't have any preconceptions about what this might be like, you might come in and think: okay, well, if I were going to segment people up to recommend music to them, I would look for demographic qualities. I might look for things that I think would correspond to interest in certain genres, all that sort of thing. But based off of what you're saying, the dominant way of understanding people is through the amount of effort they're willing to put in to find something that they do not already know about.

And you give an account from one of your informants who says they kind of have this pyramid: at the bottom is the musically indifferent, then you have casual and engaged listeners, and then the musical savant at the top.
And then in each of these four sectors, you have a totally different way of trying to engage them and of what it might mean to make a successful recommendation for them. And that to me just seems like a very interesting way of conceptualizing what it means to be engaged with music, and of understanding the different ways in which people are listening to a combination of what they like and what they might potentially like.

[00:26:43] Nick: Yeah, absolutely. Maybe one thing that will help put this in context is to think a bit about the history of algorithmic recommendation. You might think, like you said, that the first place you would go to segment listeners to music would be demography, because that has of course been the dominant mode of segmenting audiences for music ever since the origin of the recorded music industry. It's been a very dominant frame in the production of certain genres: radio stations, stores, labels, charts, all the rest of it. There's a rich history of essentially race in the categorizing of music. I'm talking here specifically about the United States, but you have similar dynamics globally.

But a very central point of concern within the overall recommender systems world (and this includes things beyond music) is that using demographic categories for personalization is bad: that it's biased at best, that it's racist at worst, and that what recommender systems do (and this is an argument people have been making in this field from its very origins in the mid-1990s) is provide a way for people to escape from the bounds of demographic profiling. So it's very important to people in this field that they don't use demography. Recommender systems as the anti-demographic thing is a trope that runs all the way through this field, from back then until the present.

What's striking about it, of course, is that in a world where people have race and gender and class, those features do emerge in proxy form in the data. It is not always hard to guess someone's demographic qualities from what they listen to. It's not a deterministic relationship, but there's certainly a correlation there. So it is possible for demographics to re-emerge in this data, for developers to think: oh, these look like feminine listening habits, and so on. There's a lot of work in how those categories emerge and how they can shift around over time. But it's very important to people working in this field that they don't take demography into account, in part because they're worried about doing what they describe as racial profiling. Even if that might seem like a sensible place to start (there is certainly a racial pattern in the production of music and in listening patterns), they really hold it off limits intentionally.
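To see what "seeing like an infrastructure" might look like in practice, here is a minimal sketch under invented assumptions: a hypothetical event log with a made-up schema, not any company's actual pipeline. It illustrates the circularity Nick describes: the log records only behavioral events, so behavioral avidity is what the data makes most visible, and, mirroring the field's deliberate avoidance of demography, there are no demographic fields to segment on at all.

```python
# Hypothetical sketch: the only facts this "infrastructure" records are
# behavioral events, so any picture of a user is assembled from behavior.
# The schema and feature names are invented for illustration.

from collections import Counter
from typing import NamedTuple

class Event(NamedTuple):
    user_id: str
    action: str    # e.g. "play", "skip", "search", "playlist_add"
    track_id: str
    # Note what is absent: no age, gender, race, or location fields.

def user_features(log: list[Event], user_id: str) -> dict[str, int]:
    """Aggregate one user's events into the counts the infrastructure 'sees'."""
    counts = Counter(e.action for e in log if e.user_id == user_id)
    return {
        "plays": counts["play"],
        "skips": counts["skip"],
        "searches": counts["search"],             # effortful actions, i.e.
        "playlist_adds": counts["playlist_add"],  # the avidity signals
    }

log = [
    Event("u1", "play", "t1"),
    Event("u1", "search", "t2"),
    Event("u1", "play", "t2"),
]
print(user_features(log, "u1"))
# -> {'plays': 2, 'skips': 0, 'searches': 1, 'playlist_adds': 0}
```

Whatever theory of listeners the developers start with, features like these are what their data can actually answer questions about, which is how avidity ends up reinforced at the center of their picture of the user.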
[00:29:11] Cody: One of the things that I've heard you talk about before in other podcast interviews is that your job as an anthropologist is not simply to infiltrate these companies and collect secret facts about how the algorithms work. Your job is something closer to trying to describe the cultural processes underlying their creation, and to figure out how the people who build these recommender systems understand what it is they're doing. As you say, the more detailed you get in describing the algorithm itself, the more transient that information is. For example, how Facebook is weighting one aspect of the newsfeed on any given day could change tomorrow, but the underlying cultural and social constructs are more stable, and in a way more fundamental to what these systems mean for our society in a larger sense.

So I want to bring in another paper of yours in this line, "Captivating Algorithms: Recommender Systems as Traps," in which you compare the way Silicon Valley engineers talk about their products with anthropological studies of literal animal traps. Most tellingly, you have this quote, which I love, from a paper from around 1900 by an anthropologist named Otis Mason, I believe, which reads: "The trap itself is an invention in which are embodied most careful studies in animal mentation and habits. The hunter must know for each species its food, its likes and dislikes, its weaknesses and foibles. A trap in this connection is an ambuscade, a temptation, an irresistible allurement. It is a strategy." So he's describing how the people he studied think about trapping animals, and in a sense he's saying that the trapper is leveraging the animal's own psychology against itself.

Your point in this paper is that this is essentially the same language, or at least a very similar language, to what many people use in describing the quote-unquote "persuasive technologies" being built today. So can you expand on that idea a little bit and say what the anthropologist's perspective on studying these kinds of technologies looks like?

[00:31:29] Nick: I love that line from Mason. I think it's very rich in helping us think about what we might be doing with technology from an anthropological point of view. Like I've been talking about, one of the central concerns I have is how the people building these systems think about their users, and one of the common things they do when they talk about what they're up to is talk about trying to capture them. They talk about capturing their attention (to bring attention back in). They talk about capturing market share. There are all of these captivation metaphors. And of course they don't literally mean that they're trying to trap you in a box or drop you into a hole through a layer of leaves.

But one of the things that anthropologists get to do, which is fun and I think useful, is draw broader comparisons than the people we are talking to and talking about draw themselves, to put things in comparison across cultural contexts. And so comparing these machine learning systems, which are imagined to be high-tech (the reason for the high valuation of all of these big tech companies), thinking about them not as some brand-new thing that's never been seen before and requires a whole new theory of technology to understand, but as part of a continuum of technologies that includes digging a hole in the ground and putting some sharp sticks in it: that I find really enticing, because it helps us think about these systems as just technologies, right?
They're ordinary in a lot of ways, despite some of their weird qualities. So the basic argument of the traps paper is that we have this anthropology of trapping that suggests: okay, well, what is a trap? It's a weird kind of technology that really foregrounds the psychological involvement of the entities it's trying to trap. A mousetrap doesn't work if the mouse doesn't do what it's supposed to do, in the same way that your iPhone won't work if you don't use the iPhone in the way you're supposed to. And this is in some ways a now-classic argument within science and technology studies: that you really have to configure a user for a technology in order for the technology to work. There's no such thing as a technology that just works in isolation from a context of use. Reminding ourselves of that fact is really handy in this domain, because there's a lot of work on algorithms and AI that falls prey to this idea that, oh, they're brand new; we never used to think of technologies as really determining our situations and advancing according to their own logics, but now it's true, now algorithms are truly autonomous. And that's not really true, right? There are people who work on them, who build them, who change them over time. And they're doing that with a model of prey in mind.

So I'm drawing on a little bit of an expansion of that anthropology-of-trapping tradition by an anthropologist named Alfred Gell, who has a very famous article in anthropology where he talks about artwork as being a kind of trap. It's a similar idea: a good work of art is going to produce a psychological effect on its viewers, but it's going to do that using technical means. A really intricately carved statue could cause someone to stand still and look at it. And we don't want to forget that that statue, in addition to being quote-unquote art, is also technology: it's an artifact that's been created by people using tools. And it is in some sense a tool in its own right for producing an effect in a viewer.

So I like to use this anthropology-of-trapping literature to think a little more expansively about questions that have really been coming up lately around ethics and persuasion in digital media. We have documentaries, organizations, and so on (I'm thinking of "The Social Dilemma" and the Center for Humane Technology as the most prominent) that suggest that Facebook is like a slot machine: it is trying to get you addicted to it and is trying to produce bad effects in your mind. YouTube is doing this as well; they're incentivizing people to make outrageous content because they're trying to maximize the amount of time people spend on their sites. Now, these are all stories about digital technology that fairly explicitly figure it as trap-like in the sense that I've been describing: Facebook is designed to make you do things against your will, which are also against your best interest, so it has to trick you into using it. And so we see that kind of trap metaphor out in the wild there, in critiques that people make of these systems.
So it was really striking to me to see that not only in critiques but also in the self-descriptions of people working in this space. It was not weird for people working in music, in particular, to say: yeah, of course I want to get people addicted to listening to music. And maybe it didn't even seem that bad. Is it really bad if you listen to more music than you used to? Is that worthy of being called an addiction? Is that really a problem?

Thinking about trapping in this broad anthropological way, I hope, steps us back from the binary question of whether these things are harmful and coercive or not, and into a grayer space where we say that all technologies have a bit of persuasion and coercion mixed into them. They all demand certain things of their users, but they can't demand them entirely. If we step back, we can start to think of technologies as existing within a broader field of psychological effects, of people trying to get other people to do what they want them to do, a field of persuasion. Then we don't have to say that the problem with recommender systems is that they deny you agency, which they can't ultimately do entirely. But they do depend on you playing a certain role in relation to them.

[00:37:22] Cody: Cody here. Thanks for listening to the show. I'd love to get your thoughts on this episode. One of the challenges, as you might imagine, for a writer and podcast producer is that it's hard to get direct feedback from your readers and listeners: what they like or don't like, what's working well or needs to be rethought. You can tell a little about this from metrics like views or downloads, but it isn't very nuanced. So I've created an avenue for getting that kind of feedback: a listener survey available with every podcast episode. If you have feedback on what you found most interesting or what you thought could be improved, I'd love to hear it. You can find the link in the show notes or at survey.againsthabit.com. Now back to the show.

What do you think the role of habits is in everything we're talking about here? It seems that the psychology engineers rely on when they build their products, when they think about persuasive technologies, when they try to trap a user, is largely the psychology of habits and habit formation. What do you make of that, and what does it suggest about how we should think about these technologies and the way they exploit our habitual psychology?

[00:38:47] Nick: That's a very nice connection. There's a historian of science named Henry Cowles who is working on some of this history of the psychology of habit in relation to attention, which might be interesting. But from my point of view, on the anthropology side of things, when I think of habit, I think of what we often talk about in the social sciences as habitus, which is a fancy way of naming the collection of habits you acquire as part of becoming an enculturated person. As you grow up, you learn a bunch of habitual things.
It's not the small-scale habits of self-help books, where if you remember to put your toothbrush out in a certain spot, it'll trigger you to brush your teeth on time. It's something broader than that: we have a bunch of tendencies in the ways we behave, in the ways we respond to the outside world, and in the ways we use our bodies, and those are all solidified in us over time. If you've ever had the experience of culture shock, of going to a place where people don't have quite the same habits as you do, it becomes very obvious that what seems totally natural, comfortable, and regular to you doesn't seem that way to other people.

Technologies are part of that broader field of habits, or habitus, in that a lot of the habits we have are organized around technological implements. Very explicitly, people working in this field, folks like Nir Eyal, whose book Hooked is plainly about how companies can learn to incite habits in their users, suggest that if you want your company to become really successful, you want to make users use it habitually. Users will open up Facebook before they've even consciously thought about what they're doing. I'm sure plenty of people have had the experience of being on Twitter or Facebook, closing the browser window, opening a new window, and going immediately back to that same website before realizing: wait, what am I doing? That kind of unthinking habitual behavior is where the intersection of persuasion and coercion happens. If someone's making me do that, it's probably not quite what I want. And it takes place within the broader field of overall habits.

Arguably, and this is something people in the social sciences have argued for a while now, your taste is also part of this. You learn to like certain things. It's very easy to learn to dislike a style of music, for instance, such that when it comes on the radio you'll turn it off immediately and think: that's horrible, I can't imagine anyone else would like this. But of course other people do like it, which gives the lie to the idea that there's something objective going on under there.

In my book and in my articles, I try to think about recommender systems as occupying the in-between space between technology and taste, or, as the title of my book has it, computing and taste. We often talk about those domains as though they're really separate from each other: computers are rational, quantitative, logical, whereas taste is subjective, individual, expressive, inexpressible through numbers. We think of those two ideas as really opposed; there's no accounting for taste, and so on. And yet they come together in recommender systems, in a way that some people fault because they think you shouldn't cross the streams between those two different domains.
But I think of it as not that weird, if we think of taste as a set of habits, as part of the apparatus through which we live our lives, and we think of technology as also being part of this broader scene of habits and habituation. Technologies are not separate from the human world: computers did not invent themselves, and they do not program themselves. So all of this is getting played with together, in a way that's not that weird if you think about it. Now, it may be done in ways we don't like, and it may have effects we don't want. But it was important for me to give an anthropological account of recommender systems that didn't start from the premise that this is impossible, that everybody knows human expression and feeling cannot be worked on through the computer. Because it's pretty clear that it can be worked on through the computer. What's not clear is what that means for how we understand computers and for how we understand taste.

[00:43:24] Cody: Okay, here's an easy question then. What is your theory of taste?

[00:43:32] Nick: Ooh, okay. This is a fun question. I have to start with the default social science theory of taste, which we would call the homology thesis: there is a homology, a structural similarity, between class and taste. Fancy people like fancy things, and less fancy people like less fancy things. If you like the opera, or if you like country music, that tells me something about who you are. That's the canonical social scientific argument. In that case, taste is really not the thing most people think it is, just personal preference; it's something determined by your social status. Now, that's a fairly vulgar account of the theory, but I think it's fairly widely shared among lots of people that taste is effectively arbitrary, and that at the end of the day it really just reflects your social position, maybe also your race, but certainly how fancy you are in a class-based system.

My thinking on taste is largely informed by a tradition in sociology usually called the pragmatics of taste, which says: sure, maybe that homology thing happens. But the problem with the homology thesis is that it doesn't tell you how or why fancy people come to like fancy things, or why people in any social group come to acquire the tastes associated with that group. So what these folks do, usually through fairly rich ethnographic observation, which is maybe why I like them, is try to describe all of the conditions by which people come to acquire taste. There's a book by Claudio Benzecry about how opera fans learn to become opera fans, or studies of how people who listen to vinyl records set up their listening stations at home. There's a lot of work people do to instrument their taste, to orchestrate encounters with music in particular. So I'm really invested in that idea of taste as something that you do rather than something that you just have.
And as something that's very much entangled with technology. A favorite example of mine: we have a sense of what it means to have taste right now, like what music you pick on Spotify. But go back fifty years, and what it meant to have taste in music might have to do with what radio stations you listened to or what records you bought at the record store. Records are all the same shape and the same color, and they more or less cost the same, so when you're picking among them, all you're doing is expressing yourself; you're making a cultural claim. What it meant to have taste in that moment was really entangled with technologies: the radio, the LP. Go back a hundred years before that and you don't have recorded music at all. Could anyone have a taste in music then? Certainly not in the way we can now; at the very least, taste would mean something different. So I'm really interested in the idea that what taste even is is totally entangled with the techniques by which we come to acquire and encounter cultural objects. That is a very long-winded way of saying that I think of taste as an emergent thing that people do in particular settings with particular tools, and one of the tools they use nowadays is recommender systems.

[00:46:47] Cody: One of the things I'm interested in along this line is whether our tastes are becoming more monolithic. My colleague Adam Mastroianni has a recent essay on this. He puts together data showing that through the year 2000, about 25% of a year's highest-grossing movies were spinoffs, franchises, or sequels, but closer to 2020 it's somewhere in the neighborhood of 75%, and he has similar data for TV shows, books, and music. So what role do you think recommender systems might be playing in this? In particular, are platforms like Spotify, Netflix, and the like funneling us into genre enclaves, where they find it legitimately difficult to point us toward something that is at the same time both new and something we'll like? What do you make of that, and is it a function of recommender systems as you've come to understand them?

[00:47:51] Nick: Well, it's a great question, because you're pointing at the basic tension at the heart of recommender systems. They're about helping people find music they don't know about yet, so there's an assumption that you like more than you know. But they're based on the idea that you won't like everything, so a recommendation has to have something to do with what you already know. There's a tension between profiling someone, saying, okay, what do you like, and the idea that what you might do with that profile is broaden their horizons. That's a real tension, and it's something I think a lot of critics don't appreciate: there is a commitment to broadening horizons in this field. Whether or not they achieve it is another question. But it's something people in the field are really concerned with and trying to figure out: wait a minute, we're pigeonholing people, but we don't want to pigeonhole them; we want to help them. And forever, people have been saying that recommender systems are about, like we were talking about earlier, cracking you out of given categories to help you find new things.
Or, as they used to say twenty years ago, recommender systems would help you go down the "long tail": they would help you find more obscure things that you would never find otherwise, because there were too many things and you just wouldn't have a way to know about the less popular objects.

Of course, now we have a lot of concern, a continuing rather than a new concern, about monoculture, about a kind of similarity. And algorithms have emerged as one of the entities we might blame for it: you like that, so you get more like that. There's a valorization of the similar in recommender systems that maybe seems like a cause of this problem more globally. I think it's certainly part of an overall apparatus of cultural production that is very risk-averse now. In this context of every movie occurring within the Marvel Cinematic Universe or whatever, you can't really say a recommender system did that, because a recommender system didn't get to decide what was happening there. But you do have industries organized around trying to maximize their successes, and clearly finding success in doing what Mastroianni calls the oligopoly of production. So one thing this points us to is the importance of looking at the overall system: recommender systems are a more and more prominent part of cultural circulation now, but they're not everything, so we don't want to say, oh, it was the algorithm.

But it also points to a really interesting philosophical question. You mentioned this idea of genre enclaves, which is a lovely way to put what other people describe as filter bubbles. And one funny thing about recommender systems is that if I know enough to recognize a filter bubble, to recognize similarities such that I can put you into one, that means that I, the recommender system, have enough data to take you out of it. If I know what similar is, I also know what different is. So within that very same system, in theory, I should be able to use the recommender in a different way: not to give you exactly the same thing, but, very much on purpose, to give you something else, something different. That's already entailed in the idea that I know enough to put you in a filter bubble in the first place. So in some sense the problem may not be with the technology itself but with this particular style of implementation. We could be implementing recommender systems that more aggressively spread people away from the similar, and that's something you would do with more or less the same system you have now, just tuned in a slightly different way. Why is it not tuned in a different way? Well, that's not an algorithm thing; that's a business decision. The algorithm could go either way. It doesn't really care.
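To make that tuning point concrete, here is a minimal sketch, assuming a generic embedding-based recommender rather than any particular platform's system. The names (item_vectors, user_profile, diversity) and the random stand-in data are invented for illustration, and the re-ranking rule is a standard maximal-marginal-relevance-style tradeoff.

```python
# Hypothetical sketch: one similarity model, two opposite recommendation policies.
import numpy as np

rng = np.random.default_rng(0)
item_vectors = rng.normal(size=(500, 32))           # stand-in learned item embeddings
item_vectors /= np.linalg.norm(item_vectors, axis=1, keepdims=True)
user_profile = rng.normal(size=32)                  # e.g., an average of items the user played
user_profile /= np.linalg.norm(user_profile)

relevance = item_vectors @ user_profile             # cosine similarity to the user's taste

def recommend(k: int, diversity: float) -> list[int]:
    """Greedy re-ranking. diversity=0.0 is the 'filter bubble' policy (pure
    similarity); higher values trade relevance for distance from items
    already picked, steering the same system away from the similar."""
    picked: list[int] = []
    candidates = set(range(len(item_vectors)))
    while len(picked) < k:
        best, best_score = None, -np.inf
        for i in candidates:
            redundancy = max(item_vectors[i] @ item_vectors[j] for j in picked) if picked else 0.0
            score = (1 - diversity) * relevance[i] - diversity * redundancy
            if score > best_score:
                best, best_score = i, score
        picked.append(best)
        candidates.remove(best)
    return picked

print("more of the same:", recommend(5, diversity=0.0))
print("same data, tuned for difference:", recommend(5, diversity=0.7))
```

Even at toy scale, the point survives: knowing what is similar is the same knowledge as knowing what is different, so which list a listener sees is a tuning choice, not a property of the algorithm.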
[00:51:34] Cody: That seems like it comes back to the distinction your engineering interviewee was drawing with the pyramid. At the bottom are the least engaged listeners: they want to, as he says, lean back, put the music on, and then not really have to do anything, not make any decisions, find new stuff, or skip songs. Then you have the lean-in musical savants and more engaged listeners. Clearly the vast majority of listeners and viewers are going to be in that bottom chunk of the pyramid, and you have the highest probability of reaching the largest number of people by catering to that listener or viewer as your default option, rather than saying: I'm going to try to shape the musical tastes of the youth in a way that exposes them to the meritorious histories of jazz and the unexpected sides of hip hop and all that sort of stuff. So it seems to me like that's a big current in all that's happening here.

[00:52:38] Nick: Yeah. One of the stories that emerges over the course of my whole book is the transformation of music recommendation from the first contemporary recommender systems, named as such in the mid-1990s, to the present. In the beginning, those early recommender systems were designed around the idea that the user was a really enthusiastic, avid listener. You were really into music; you were going to put in some effort, open up a recommender system, and use it specifically to find new stuff. You were almost by definition a kind of crate digger in that context, because it was more work to use a recommender system than to just turn on the radio; you already had a way to not put in a lot of effort. In contemporary industry terms, you were a lean-forward listener, someone enthusiastically pursuing new music.

Then, over time, just what you described happened. The default assumption of what a user of these systems should be like became something different: the lean-back listener, the person who might not listen to music at all, so we need to find some way to entice them into doing it, and a recommender system was maybe a way of doing that. You open up your Spotify or whatever, and as long as you see something that makes you think, sure, I'll listen to that, it catches the person who otherwise may not have listened at all.

That's a big change, and it comes alongside a change in data practices, to loop back to the seeing-like-an-infrastructure question. What data did those early recommender systems have? Data that you proactively gave them about what you liked: you had to go in and explicitly rate artists, or, if it was movies, give five stars on Netflix or whatever. Over time, those explicit ratings mostly get replaced by what they call implicit ratings: the idea that listening to a song means you like it a little bit, and listening to it a lot is a stronger sign that you like it. This is the kind of logic we're very familiar with now, in the big data moment; this is what big data is all about. The idea is that these behavioral traces are more real, and they're easier to produce: I don't have to explicitly rate anything for you to know, or think you know, what I like on the basis of what I'm doing. And you might suggest that's a better account of what I like. I might go on Netflix and give five stars to all of the fancy, classy-people movies, but never watch them. And if you kept recommending them to me, I wouldn't really use Netflix as much, because what I really want is 1990s action movies; if you saw what I actually watched, you would know that. That's a common argument they'll make.
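To see what that shift amounts to mechanically, here is a minimal sketch with an invented event log and invented weights, in the spirit of the confidence-weighting idea common in the implicit-feedback literature: nobody rates anything, and a "rating" is manufactured from behavioral traces like plays and skips.

```python
# Hypothetical sketch: manufacturing an "implicit rating" from behavior.
# The event log, threshold, and weights are invented for illustration.
from collections import defaultdict

# (user, track, seconds_played, track_length_seconds)
events = [
    ("ana", "track_a", 210, 215),   # played nearly to the end
    ("ana", "track_a", 215, 215),   # played again: a stronger signal
    ("ana", "track_b", 12, 240),    # skipped early: a weak negative signal
]

def implicit_scores(events, skip_threshold=0.3):
    """No explicit stars anywhere: preference is inferred from traces.
    Each sufficiently complete play adds confidence; an early skip subtracts."""
    scores = defaultdict(float)
    for user, track, played, length in events:
        completion = played / length
        if completion >= skip_threshold:
            scores[(user, track)] += completion   # repeated listens accumulate
        else:
            scores[(user, track)] -= 0.5          # skipping reads as dislike
    return dict(scores)

print(implicit_scores(events))
# track_a scores roughly 1.98 (two near-complete plays); track_b scores -0.5
```

The user never said anything; the system decided what counts as liking. That definitional move, plays as preference, is where the lean-back user model and the data collection apparatus reinforce each other.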
So we have that transition in three different things at the same time. There's the change in the kind of data available to recommender systems, this trace data of user behavior. There's the change in the economics of the online media industry, where everything becomes streaming: Netflix used to be a DVD rental company and then becomes something else, where they want you to spend more time on it, and that feeds back into getting more data. And the third thing is a change in how the people building these systems know things about their users. These are all entangled in the emergence of the modern data collection apparatus, and they're all mutually reinforcing cycles. So that's a really big change in the way those systems work.

If people are looking for ways out of it, I think one way an anthropology of this can be useful is to really foreground and describe what exactly the situation is that we're in. One thing I tend to argue is that if we want to get out of this really aggressive data collection situation, which obviously happens in domains beyond music, and in many other domains where it's much more significant, then one thing we might want to think about is how to intervene in these imaginations of users: the vision of the user as someone who doesn't really want to get involved, whom we sort of trick into listening, and about whom we therefore have to capture as much data as possible, because they're not going to give the data to us on purpose. If we change that model, if we change the way we think about people, then I think we've changed a key part of the overall edifice of data collection and of why data is seen as so valuable now.

[00:57:08] Cody: I see that as tying into what we were talking about earlier: the model of the individual the engineers are using is based on the psychology of habits, so data are most valuable for understanding how to exploit habitual systems and, to go back to your metaphor, how to use products as traps for habits and attention, whatever attention may be. So it seems like part of what you're saying, or an implication of it, is that the more effort we put in to achieve that higher level of avidity, to engage in a direct and meaningful and thoughtful way with whatever content we're consuming, the less we rely on habit, the less we can be exploited by an understanding of what we habitually do, and the more we can be liberated from the cycle of collecting data, exploiting it, and going further down the rabbit hole of social media and digital content consuming our attention and our lifestyles.

[00:58:31] Nick: Yeah. And just to loop back to what we talked about earlier, this is one reason why I think having a cultural understanding of the logics behind these systems and how people think is really useful: a lot of the critiques of these systems we've seen are couched in the same habit-science, behaviorist framework as the systems they're criticizing.
People who say Facebook is a slot machine or whatever really believe that the best way to model human behavior is still that same behaviorist habit model, that same press-a-lever, get-a-treat, rat-in-a-cage kind of model. I think that model is really constraining in what kinds of futures we can imagine for what humans are going to do, and it limits us to a certain narrow set of technical interventions. So by trying to name that, by trying to step back and ask what model of the human is involved in these systems, I want to try, and this is something I'm doing in my newer work on attention, to think about the arbitrariness of those models, and about how, if we want to imagine different futures, we might need to think about some of these foundational assumptions differently as well. I'm not sure that we're going to lever-press our way out of the behaviorist hellscape we find ourselves in now.

[00:59:54] Cody: Nick, it's been a great pleasure to talk, and I appreciate your perspective on all these things. I could probably go on asking you questions about this space of topics for the next two hours, but you've been really generous with your time. So thanks for taking the time to talk.

[01:00:09] Nick: Thanks so much. It was a pleasure.

[01:00:11] Cody: That was my conversation with Nick Seaver. I hope you enjoyed it. One of the topics we didn't get around to is the connection between avidity and anthropological fieldwork itself. It's a topic I know Nick has thought about in his work on attention, and it's also one of the things I personally most admire about anthropology. My own field, psychology, suffers from a historical lack of attention dedicated toward non-Western people: we study American college students and assume that whatever we find there will apply to the rest of the world. The field has started to correct this in recent years, but I believe the assumption is built into the psychological worldview in ways that are important and difficult to eradicate.

The premise of the field of anthropology, starting with historical figures like Tylor and Malinowski, is that attending to what other people are up to is actually a lot of work. It's not enough to be vaguely interested in what other people are doing, especially faraway people; you actively have to search out the best possible vantage from which to observe and make sense of their behavior. To me, that's an application of the basic idea of attention as effort. So in this case, avidity, the amount of effort we're willing to put in to acquire new information or seek new experiences, is not only crucial to the kind of content we consume but also crucial to our ability to understand people with different perspectives.

This nods toward one of the foundations of our polarized society. We tend to be, especially as Americans, intuitive psychologists: we assume that the minds of people far away from us mostly look like the minds of people in our immediate vicinity, and then we're shocked to find that people who don't occupy our cultural milieu think in ways that are totally foreign to us. Maybe we need to operate less in our default mode as intuitive psychologists and instead explore what it might mean to operate as intuitive anthropologists.

I'd love to know what you thought of this episode. If you want to give me some feedback, you can go to survey.againsthabit.com.
If you'd like to subscribe to my Substack newsletter for more content, you can go straight to againsthabit.com. This episode was edited and produced by Emily Chen. I'm Cody Kommers, and thanks for listening to Against Habit. This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit codykommers.substack.com/subscribe
Algorithms run our lives these days, from Netflix binges to predictive policing. And that includes algorithmic recommenders––like Spotify's Discover Weekly and Pandora––that shape how we consume music. How does algorithmic music recommendation work and, perhaps more importantly, who makes it work? After all, algorithms are made and tweaked by people, who work at tech companies and have their own ideas and values. An interview with anthropologist Nick Seaver, who has conducted years of ethnographic fieldwork to understand who creates algorithmic recommenders, and why they do what they do.Nick Seaver is assistant professor of anthropology at Tufts University.Show notes and more over at soundexpertise.org!Questions? Thoughts? Share them with Will on Twitter @seatedovation
Algorithms and artificial intelligence are on the menu for our 36th adventure in anthropology! In this episode, we present two conversations with two great Science and Technology Studies scholars: Dr Nick Seaver and Dr Thao Phan. Dr Seaver, an Assistant Professor of Anthropology at Tufts University, examines themes of taste and attention, drawing on his ethnographic research with US-based developers of algorithmic music recommender systems. Dr Phan is a Postdoctoral Research Fellow at Deakin University, where her research focuses on gender, AI, and algorithmic cultures. -- For more on our sparkling guests, see: https://twitter.com/npseaver Seaver, Nick. "What should an anthropology of algorithms do?" Cultural Anthropology 33.3 (2018): 375-385. https://journal.culanth.org/index.php/ca/article/download/ca33.3.04/90 https://twitter.com/thao_pow Phan, Thao. "Amazon Echo and the aesthetics of whiteness." Catalyst: Feminism, Theory, Technoscience 5.1 (2019): 1-38. https://catalystjournal.org/index.php/catalyst/article/download/29586/24800 -- Conversations in Anthropology is a podcast about life, the universe, and anthropology produced by David Boarder Giles, Timothy Neale, Cameo Dalley, Mythily Meher and Matt Barlow. This podcast is made in partnership with the American Anthropological Association and supported by the Faculty of Arts & Education at Deakin University. Find us at conversationsinanthropology.wordpress.com or on Twitter at @AnthroConvo
Christopher Oscarson, Brigham Young University, on St. Lucia. Tommy Noonan, founder of ReviewMeta, on fake reviews. Nick Seaver from Tufts University on the value and price of our attention. Leslee Thorne-Murphy, BYU, on a spooky Christmas. Tina Seelig is a Professor at Stanford University and author of “What I Wish I Knew When I Was 20.”
What really is an algorithm? This episode dives a bit deeper into a topic we've talked about here before, one that relates to much of our research on emerging technology. I wanted to know, in ordinary terms, not only what an algorithm is, but how algorithms work and what the prevalence of artificial intelligence and algorithms means for consumer and cooperative finance. Filene's Research Director, Taylor Nelms, brings us his friend Dr. Nick Seaver, Assistant Professor of Anthropology at Tufts University, who has written a book about the making of algorithmic music recommendations, to provide us some clarity on this cloudy topic. In this episode, we explore the distinction between an algorithm, a series of steps to follow to turn an input into an output, and an algorithmic system, a more complex machine learning process that learns from source data and uses it to make future predictions. Nick also teaches us why you can't just "set it and forget it" with these kinds of algorithmic systems, and what can go wrong if you do. The coolest part is when we explore how amazing it would be for credit unions to be the ones to make data portable for their members, letting members decide whether to participate in the data's use and giving them power over how they are represented in the data… Just don't be caught calling an algorithmic system an algorithm… Visit filene.org/trusttech for more information on the "Future of Trust" event mentioned in this episode.
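For readers who want the episode's distinction in code, here is a minimal sketch with invented names and data (monthly_payment, MeanRiskModel). The first function is an algorithm in the plain sense, a fixed series of steps from input to output, here a standard loan amortization formula to fit the credit union setting; the class stands in for an algorithmic system, whose behavior comes from the data it was trained on, which is why "set it and forget it" fails.

```python
# Illustrative sketch (invented names and data), contrasting the two senses.

# An *algorithm*: a fixed series of steps that turns an input into an output.
def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
    """Standard amortization formula: the same input always gives the same output."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)

# An *algorithmic system*: its behavior is learned from source data, so the
# output depends on what it has seen and goes stale as the world changes.
class MeanRiskModel:
    def __init__(self) -> None:
        self.rate = 0.0

    def fit(self, outcomes: list[int]) -> None:
        """Learn from observed outcomes (1 = loan defaulted, 0 = repaid)."""
        self.rate = sum(outcomes) / len(outcomes)

    def predict(self) -> float:
        """Predicted default rate; misleading unless refit as new data arrives."""
        return self.rate

print(round(monthly_payment(10_000, 0.06, 36), 2))  # 304.22, today and always
model = MeanRiskModel()
model.fit([0, 0, 1, 0])
print(model.predict())  # 0.25 now, but only until the next batch of data
```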
On the heels of the day’s graduate program information session, join us for our annual colloquium featuring alumni of CMS, discussing their lives from MIT to their careers today. Nick Seaver, Assistant Professor of Anthropology at Tufts University and a 2010 graduate of Comparative Media Studies, is an anthropologist of technology, whose research focuses on the circulation, reproduction, and interpretation of sound. He holds a Ph.D. from the University of California, Irvine. His dissertation research examined the development of algorithmic music recommendation, and at CMS, he wrote a thesis on the history of the player piano. Colleen Kaman is a user experience strategist at IBM Interactive Experience, skilled in storytelling, user research, learning design, and persuasive technologies. Her expertise is in developing products, services, and campaigns that help users make better decisions and accomplish tasks more effectively and efficiently. Sean Flynn is the Program Director for the Points North Institute, a Maine-based organization supporting nonfiction storytellers through artist development initiatives and, most prominently, the Camden International Film Festival and Points North Forum. He received his master’s degree in Comparative Media Studies in 2015 and worked as a researcher at the MIT Open Documentary Lab. Sean began his filmmaking career as a producer and cinematographer working on two feature-length documentaries, both of which had their premiere at the Tribeca Film Festival and aired on national television.
Brigitte Madrian of Harvard Univ looks back on the financial crisis ten years later. Robert Dillon, University City School District, discusses good classroom design. BYU's John Kauwe figures out why Shawn Bradley is so tall. The Apple Seed host Sam Payne shares a Mark Twain story. Lorraine Johnson of LymeDisease.org explains why Lyme Disease still lacks a vaccine. Tufts Univ's Nick Seaver explains how tech companies battle for our attention.
Corina and Angel talk to Nick Seaver about his research with music recommender systems and understanding the cultures, tastes, and relationships created through and with those systems, looking at what taste means and why it is important to the design of algorithms for music recommender systems. Mentioned in podcast: Seaver, Nick. 2015. “The nice thing about context is that everyone has it.” Media, Culture & Society 37(7): 1101–1109. Maffesoli, Michel. 1996. The Time of the Tribes: The Decline of Individualism in Mass Society. SAGE Publications Ltd. Nick's work: Seaver, Nick. 2017. “Algorithms as Culture: Some Tactics for the Ethnography of Algorithmic Systems.” In “Algorithms in Culture,” edited by Morgan Ames and Massimo Mazzotti, special issue, Big Data & Society. Seaver, Nick. 2017. “Arrival.” In “Correspondences: Proficiency,” edited by Andrés García Molina and Franziska Weidle. Cultural Anthropology website, June 27, 2017. Follow his work at: https://ase.tufts.edu/anthropology/people/seaver.htm http://nickseaver.net/ @npseaver on Twitter
From Google search to Facebook news, algorithms shape our online experience. But like us, algorithms are flawed. Programmers write cultural biases into code, whether they realize it or not. Author Luke Dormehl explores the impact of algorithms, on and offline. Staci Burns and James Bridle investigate the human cost when YouTube recommendations are abused. Anthropologist Nick Seaver talks about the danger of automating the status quo. Safiya Noble looks at preventing racial bias from seeping into code. And Allegheny County’s Department of Children and Family Services shows us how a well-built algorithm can help save lives. Algorithms aren’t neutral. They’re really just recipes: expressions of human intent. That means it’s up to us to build the algorithms we want. Read more on how we can make algorithms more accountable. IRL is an original podcast from Mozilla. For more on the series go to irlpodcast.org. Leave a rating or review in Apple Podcasts so we know what you think.
The algorithmic infrastructures of the internet are made by a weird cast of characters: rock stars, gurus, ninjas, wizards, alchemists, park rangers, gardeners, plumbers, and janitors can all be found sitting at computers in otherwise unremarkable offices, typing. These job titles, sometimes official, sometimes informal, are a striking feature of internet industries. They mark jobs as novel or hip, contrasting starkly with the sedentary screenwork of programming. But is that all they do? In this talk, drawing on several years of fieldwork with the developers of algorithmic music recommenders, Seaver describes how these terms help people make sense of new kinds of jobs and their positions within new infrastructures. They draw analogies that fit into existing prestige hierarchies (rock stars and janitors) or relationships to craft and technique (gardeners and alchemists). They aspire to particular imaginations of mastery (gurus and ninjas). Critics of big data have drawn attention to the importance of metaphors in framing public and commercial understandings of data, its biases and origins. The metaphorical borrowings of role terms serve a similar function, highlighting some features at the expense of others and shaping emerging professions in their image. If we want to make sense of new algorithmic industries, we’ll need to understand how they make sense of themselves. Nick Seaver is assistant professor of anthropology at Tufts University. His current research examines the cultural life of algorithms for understanding and recommending music. He received a master's degree from MIT's Comparative Media Studies program in 2010 for research on the history of the player piano.
Conversations from the November 2013 American Anthropological Association meeting in Chicago. Tomomi Yamaguchi talks about right-wing activists in Japan. Nick Seaver explains the cultural importance of algorithms. Walter Callaghan shares his personal journey to studying PTSD in Canadian soldiers. And Shan-Estelle Brown discusses the aesthetic experiences some drug users have with their opioid replacement therapy.
Today's teaching is presented by Michelle Seaver, guest teacher and student of Alan Wallace, while Alan is out of town for 24 hours. Alan will return to his regular teaching schedule Saturday morning. Michelle Seaver attended one of the first 100-day shamatha retreats with Alan, continued on in a private 18-month retreat, and is now responsible for bringing mindfulness-based education to the Phuket International Academy School (currently K-8, with plans to be PreK-12). The guided meditation is on compassion at the deepest level - the aspiration to be free of suffering caused by grasping. The meditation begins at 1:00 in the recording. Following the guided meditation, Michelle and Nick Seaver, her husband and CEO of the Phuket International Academy who also attended one of Alan's first 100-day shamatha retreats and followed it with an 18-month retreat, field these questions from the group: 1. Can you tell us about your experience on a long-term shamatha retreat? 2. About the meditation on awareness of awareness: when trying to focus on awareness, I realize that there is another awareness that is focusing on awareness, so this is no longer awareness. Can you help? 3. What is the mission of the mind centre? 4. Is debate, logic, and thinking included in your mindfulness based program in the school?