Most people know that the business model behind social media is payment with data. You get a free service but pay with your data, which makes you the product. Few may realize, however, that the state has adopted the same logic in digitizing the welfare society. Here, too, we are 100 percent transparent, and here, too, we have no control over the use of our own data.

New EU rules meant to better protect citizens are being introduced right now, but the question is whether the approach actually trips up privacy. At the same time, the Danish government wants to expand the powers of PET (the Danish Security and Intelligence Service) to surveil all Danish citizens. In other words, privacy in the digital society is under severe pressure in 2025.

In her book "The Age of Surveillance Capitalism", the American scholar Shoshana Zuboff describes how tech companies exploit big data to predict and alter human behavior. Data harvested from smartphones, cameras, and online activity is used to build detailed profiles of individuals. Zuboff introduces the concept of the "Big Other" for a system that manipulates people's choices for profit. According to Zuboff, this practice threatens democratic values and individual autonomy.

Zuboff argues that the principles of surveillance capitalism carry over into politics through behavioral psychology, which holds that all human behavior is a product of measurable external stimuli that can be understood and controlled through study and experiment: the school better known as behaviorism, popularized by the behavioral psychologist B.F. Skinner. Social media platforms are the latest expression of the behaviorist ambition to steer society through scientific observation of the mind, via a complete information loop that tests products on people, gathers feedback, and redesigns the model. The assumption is that mathematical laws of data apply to people as well as to machines. The question is whether the same mindset has entered the digitization of the democratic welfare state.

The guest is Stephan Engberg of Priway, who works on protecting citizens' privacy.

Link: Priway priway.eu
Hello Interactors,

Every week it seems to get harder to ignore the feeling that we're living through some major turning point — politically, economically, environmentally, and even in how our cities are taking shape around us. Has society seen this movie before? Spoiler: we have, and it has many sequels. History doesn't repeat exactly, but it sure rhymes, especially when competition for power increases, climates collapse, and the urban fabric unravels and rewinds. Today, we'll sift through history's clues, peek through some fresh conceptual lenses, and consider why the way we frame these shifts matters — maybe more now than ever.

PRESSURE POINTS AT URBAN JOINTS

Let's ground where we all might be, historically speaking. Clues from long-term historical patterns suggest social systems go through periodic cycles of integration, expansion, and crisis. Historical quantitative data reveals recurring waves of structural-demographic pressure — moments when inequality, elite overproduction, and resource strain converge to produce instability.

By quantitative historian Peter Turchin's account, we are currently drifting through some kind of inflection point. His 2010 essay in Nature anticipated the early 2020s as a period of peak instability, the culmination of trends that began around 1970. From that decade onward, the number of people earning advanced degrees and entering law, finance, media, and politics skyrocketed, while the number of elite positions (like Senate seats, Supreme Court clerkships, and high-level corporate posts) remained fixed or even shrank. This created decades of rising income inequality, elite competition, and declining public trust, conditions that set the stage for events like the rise of Trump, polarization, and institutional gridlock.

The symptoms are familiar to us now, and they are markers that echo previous systemic ruptures in U.S. history.

In the 1770s, colonial grievances and elite competition led to a historic revolutionary realignment, coinciding with poor harvests and food insecurity that amplified unrest. The 1860s brought civil war driven by slavery and sectional conflict, again during a period of climate volatility and crop failures. The early 20th century saw the Gilded Age unravel into labor unrest and the Great Depression, alongside years of drought and economic collapse in the Dust Bowl. The 1960s through 1980s unleashed social protest, stagflation, and the shift toward neoliberal governance amid fears of resource scarcity and rising pollution. In each case, ecological shocks layered onto political and economic pressures — making transformation not only likely, but necessary.

Spatial patterns shifted alongside these political ruptures — from rail hubs and company towns to low-slung suburban rings and high-rise financialized skylines. Cities can be both staging grounds creating these shifts and mirrors reflecting them. As material and symbolic anchors of society, they reflect where systems are strained — and where new forms may soon take root.

Urban transformation today is neither orderly nor speculative — it is reactive. These socio-political, economic, and ecological shifts have fragmented not just the city, but the very frameworks we use to understand it. And with urban scale theory as a measure, change is accelerating exponentially. This means our conceptual tools for understanding these shifts must adapt just as quickly.

Let's dip into the academic world of contemporary urban studies to gauge how scholars are considering these shifts.
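Before we do, here is a toy numerical illustration of the elite-overproduction dynamic described above. This is a hypothetical sketch with made-up numbers, not Turchin's published model: aspirants for elite positions grow each decade while the number of seats stays roughly fixed, so competition per seat climbs.

```python
# Toy sketch of "elite overproduction" (a simplification, not Turchin's
# published Political Stress Index). Aspirants for elite positions grow
# each decade while the number of positions stays fixed, so competition
# per seat -- a rough proxy for intra-elite pressure -- keeps climbing.

def elite_competition(aspirants0: float, growth: float, seats: float, decades: int):
    """Return the aspirants-per-seat ratio for each decade."""
    ratios = []
    aspirants = aspirants0
    for _ in range(decades):
        ratios.append(aspirants / seats)
        aspirants *= 1 + growth  # aspirant pool compounds each decade
    return ratios

# Hypothetical numbers for illustration only: 1 million aspirants in 1970,
# 40% growth per decade, 100,000 elite positions, through the 2020s.
for decade, ratio in zip(range(1970, 2030, 10),
                         elite_competition(1_000_000, 0.40, 100_000, 6)):
    print(decade, round(ratio, 1))
```

The point of the toy model is only that a compounding numerator over a fixed denominator guarantees rising competition, whatever the exact parameters.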
Here are three lenses that seem well-suited to our current landscape… or perhaps simply the ones my own biases are drawn to.

Urban Political Ecology. This sees the city as a socio-natural process — shaped by uneven flows of energy, capital, and extraction. This approach, developed by critical geographers like Erik Swyngedouw and Maria Kaika, highlights how environmental degradation is often tied to social inequality and political neglect. Matthew Gandy, an urban geographer who blends political theory and environmental history, adds to this view. He shows how infrastructure — from water systems to waste networks — shapes urban nature and power.

The Jackson, Mississippi water crisis, for example, revealed how ecological stress and decades of disinvestment resulted in a disheartening breakdown. In 2022, flooding overwhelmed Jackson's aging water system, leaving tens of thousands without safe drinking water — but the failure had been decades in the making. Years of underfunding, political neglect, and systemic racism had hollowed out the city's infrastructure.

Or take Musk's AI data center, Colossus, in Memphis, Tennessee. Sited adjacent to historically Black neighborhoods, it runs 35 methane gas-powered turbines that emit nitrogen oxides (NOx) and other harmful pollutants. It is reported to be operating without proper permits, adding to air quality problems these communities have long experienced. These crises are vivid cases of what urban political ecologists warn about: marginalization and disinvestment manifesting physically in infrastructure failure, disproportionately affecting already vulnerable populations.

Platform Urbanism. This explains much of the growing visible and invisible restructuring of urban space. From delivery networks to sidewalk surveillance, digital platforms now shape land use and behavioral patterns. Urban theorists like Sarah Barns and geographer Agnieszka Leszczynski describe these systems as shadow planners — zoning isn't just on paper anymore; it's encoded in app interfaces and service contracts. Shoshana Zuboff, a social psychologist and scholar of the digital economy, pushes this further. She argues that platforms are not just intermediaries but extractive infrastructures, designed to shape behavior and monetize it at scale. As platforms replace institutions, their spatial footprint expands. Amazon, for example, has redefined regional land use by building vast fulfillment centers and reshaping delivery logistics across suburbs and exurbs. Or look at Uber and Lyft: they've altered curbside usage and traffic patterns in major cities without ever appearing on official planning documents. These changes demonstrate how digital infrastructure now directs physical development — often faster than public institutions can respond.

Neoliberal Urbanism. Though widely critiqued, this remains the dominant lens. Despite growing backlash, deregulated markets, privatized services, and financialized real estate continue to shape planning logic and policy defaults. Urban theorists like Neil Brenner and economic geographer Jamie Peck describe this as a shift from managerial to entrepreneurial cities — where the suburbs sprawl, the towers rise, and exclusion is reproduced not by public design input, but by tax codes, ownership models, and legacy zoning. Like many governing systems, the default is to preserve the status quo.
Institutions, once entrenched, tend to perpetuate existing frameworks — even in the face of mounting social or ecological stress.

For example, in many U.S. cities, exclusionary zoning laws have long restricted the construction of multi-family housing in favor of single-family homes — limiting supply, reinforcing segregation, and driving up housing costs. Even modest attempts at reform often meet local resistance, revealing how deeply these rules are woven into planning culture.

These lenses aren't just theoretical — they are descriptively powerful. They reflect what is, not what could be. But describing the present is only the first step.

NEW NOTIONS OF URBAN MOTIONS

It's worth considering alternative conceptual lenses rising in relevance. These are not yet changing the shape of cities at scale, but they are shaping how we think about our urban futures. Historically, new conceptual lenses have often emerged in the wake of the kind of major social and spatial disruptions already covered.

Consider the upheavals of the 19th century: rapid industrialization, urban crowding, and public health crises gave rise to modern, industrial-era city planning. The crises of the mid-20th century helped institutionalize zoning and modernist design, while the neoliberal turn of the late 20th century elevated market-driven planning models.

The emerging conceptual lenses of the 21st century are grounded in complexity, care, informality, and computation. They are responses to the fragmented plurality of our planetary plight — the current calamity of our many overlapping crises, or polycrisis. These frameworks for thinking and imagining cities gain traction in architecture and planning studios, classrooms, online and physical activist spaces, and experimental design projects. They're not yet dominant, but they are gaining ground. Here are a few I believe to be particularly relevant today.

Assemblage Urbanism. This lens views cities not as coherent wholes, but as contingent networks that are always in the making. The term "assemblage" comes from philosophy and anthropology; it refers to how diverse elements — people, materials, policies, and technologies — come together in temporary, evolving configurations. This lens resists top-down models of urban design and instead sees cities as patchworks of relationships and improvisations.

Introduced by scholars like Ignacio Farias, an urban anthropologist focused on technological and infrastructural urban change, and AbdouMaliq Simone, a sociologist known for his work on African cities and informality, this approach offers a vocabulary for complexity and contradiction. It examines cities made of sensors and encampments, logistics hubs and wetlands. Colin McFarlane, a geographer who studies how cities function and evolve — especially in places often overlooked in mainstream planning — shows how urban learning spreads through networks that cross places and scales. As the built environment becomes more fragmented and multi-scalar, this lens offers a way to map the friction and fluidity of emergent urban life.

Postcolonial and Feminist Urbanisms. This lens challenges who gets to define the city, and how. Ananya Roy, a scholar of global urbanism and housing justice, Jennifer Robinson, a geographer known for challenging Western-centric urban theory, and Leslie Kern, a feminist urbanist focused on gender and public space, all center the voices and experiences often sidelined by mainstream planning: women, racialized communities, and the so-called Global South.
These are regions, not always in the Southern Hemisphere, that have historically been colonized, exploited, or marginalized by dominant empires of the so-called Global North. These frameworks put care, informality, and embodied experience in the foreground — not as soft supplements to be 'considered', but as central to urban survival. They ask: whose knowledge counts, and whose mobility is prioritized? In a world of precarity and patchwork governance, these lenses offer both critique and fairer, more balanced paths forward.

Typological and Morphological Studies. These older, traditional lenses are reemerging through new tools. Once associated with the static physical form of cities, these traditions are finding renewed relevance through machine learning and spatial data. The approaches originate in architectural history and geography, where typology refers to recurring building patterns and morphology to the shape and structure of urban space. Scholars like Saverio Muratori and Gianfranco Caniggia, both architects, emphasized interpreting urban fabric as a continuous, evolving record of social life. As mentioned last week, British geographer M. R. G. Conzen introduced town-plan analysis, a method for understanding how plots and street systems change over time. Today, this lineage is extended by Laura Vaughan, an urbanist who studies how spatial form reflects social patterns, and Geoff Boeing, a planning scholar (also mentioned last week) who uses computational tools to analyze and visualize urban form. AI models now interpret urban imagery, using historical patterns to predict future trends — an approach evolving into a kind of algorithmic archaeology. Left unchecked, however, it could reinforce existing spatial norms instead of challenging them. This stresses the importance of reflection, ethics, and debate about the implications and outcomes of these models… and who benefits most.

While these lenses don't yet dominate design codes or capital flows, they do shape how we think and talk about our cities. And isn't that where all transformation begins?

CHOOSING PATHS IN AFTERMATHS

Concepts don't emerge in a vacuum. History shows us how they arise from the anxiety and urgency of uncertainty. As historian Elias Palti reminds us, frameworks gain traction when once dominant and grounding meanings begin crumbling under our feet. That's when we invent or seek new ways to make sense of our shifting ground. Donna Haraway, a pioneering feminist scholar in science and technology studies, urges us to 'stay with the trouble' — to resist closure, dwell in complexity, and imagine alternatives from within the uncertainty.

Historically, moments of systemic crisis — from the 1770s to the 1840s, the 1930s to the 1960s — have sparked shifts not just in spatial form, but in the conceptual tools used to understand and design it. Revolutionary and reformist movements have often carried with them new ways of seeing: Enlightenment ideals, socialist critiques, environmental consciousness, and decolonial frameworks. We may be living through another such moment now — one where the cracks in the old invite us to rethink the categories that built it.

In 1960, five years before I was born, British Prime Minister Harold Macmillan gave a speech called "Wind of Change". It was a public acknowledgement of the decline of the British empire and the rise of anti-colonial nationalism around the globe.
Delivered in apartheid South Africa, it was a rare moment of elite recognition that a global shift in political and spatial order was already underway. Britain's imperial dominance was fading just as American dominance was solidifying.

Today, we see echoes of that moment. The U.S. is facing economic fragmentation, growing inequality, and diminishing global legitimacy, while China asserts itself as a counterweight. Resistance and unrest in places like Palestine, Ukraine, Yemen, Congo, Sudan, and Kashmir (among many others) mirror the turbulence of previous historic transitions. Once again, the global "winds of change" are shifting, strengthening, and swirling unpredictably. It can be disorienting. But the frameworks I've outlined above are more than cold attempts at neutral academic observation; they can serve as lenses of orientation. They help guide what we see, what we measure, and what we ignore. And in doing so, they shape what futures become possible.

Some frameworks are widely used but lack ethical depth. Others are less common but full of imagination and ethical reconfiguration. The lenses we prioritize in public policy, early education, design, and discussion will shape whether our future systems perpetuate existing inequalities or purge them.

This is not just an academic choice. It's a civic one.

While macro forces of capital or climate are beyond our control, we can shape the narratives that guide our responses. The question remains whether space should continue to be optimized for logistics and financial speculation, or whether it can be reoriented toward ecological repair, historical redress, and spatial justice.

What we think now will shape what gets built next. The most impactful decision in urban design may come down to all of us being more intentional about the concepts that guide us forward.
mixtape by MARIE DAVIDSON | Campus Club, radio show

The Canadian DJ-producer will release her sixth studio album, City Of Clowns, next month, made in collaboration with Soulwax and Pierre Guerineau. City Of Clowns marks a return to the club — but not the one you'd expect. The techno beats and plain-spokenness of Working Class Woman resurface at moments, but you can also hear the pop structures and melodic sensibilities of Renegade Breakdown. It's a "strange" sonic blend, even by the artist's own standards. "It clearly refers back to what I was doing before the pandemic, but with a certain evolution," she says. "I didn't want to just do the same thing again."

The album's musical and cerebral identity is also shaped by the fact that the Canadian artist has a new antagonist. This time it isn't club culture threatening her identity, but Big Tech. In the summer of 2022, deep in the making of the album, Marie Davidson began reading The Age of Surveillance Capitalism by Shoshana Zuboff. That audacious account describes technology as a new form of economic oppression that has infiltrated every aspect of our lives, wielding unprecedented power and escaping all oversight. The book left a deep mark on her, provoking both great worry and strong inspiration. "Lately, and since I read that book, I've realized people have become more aware of the scale of the phenomenon, but many of them have also given up," she says. "It changes the way we live. It literally changes the human species — the way we interact with each other, and the way we interact with ourselves."

https://www.instagram.com/mariedavidson.official/ https://www.marie-davidson.com https://soundcloud.com/mariedavidson_official @mariedavidson_official

-----------------------------------------------------

CAMPUS CLUB, the show: Tracking the electronic cultures shaping today's music in France and internationally, the Radio Campus France network gives carte blanche to the artists and labels scouting out new talent. Airing regularly on more than 30 stations and as a podcast, CAMPUS CLUB brings you an exclusive weekly mix from a DJ or producer on the French or international scene. All mixtapes: www.radiocampus.fr/emission/campus-club-mixtapes

------------------------------------------------------

RADIO CAMPUS FRANCE: Radio Campus France is the network of independent, community, student, and local radio stations, federating 30 stations across France. NOUS SUIVRE | FOLLOW US: www.radiocampus.fr
Montreal-based electronic artist Marie Davidson is concerned about the rise of big tech and its intrusion on privacy. In this episode, host Emily Fox talks with Davidson about her latest album City of Clowns, how these themes show up on the record, and the inspiration it took from Shoshana Zuboff's book The Age of Surveillance Capitalism. "I would say that I don't believe yet that we are doomed," Davidson says. "I believe we're in a very tricky position right now, and that there's a lot of uncertainty, and there's still a lot of ignorance around everything that's happening with technology and information privacy. But we all have a choice… to keep nurturing critical thinking."

Support the show: https://www.kexp.org/sound/
Dance is Marie Davidson's vehicle for expressing modern anxieties. Her new album, 'City of Clowns', takes her into more techno territory as she reflects on technology's impact on today's society. Inspired by the book 'The Age of Surveillance Capitalism' by sociologist Shoshana Zuboff, Davidson draws on Zuboff's thesis questioning the big tech corporations, which, Zuboff argues, exploit users' private data to establish a new world economic order that endangers democracy.

Playlist:
Black Country, New Road - Happy Birthday
Panda Bear - Just as Well
Destroyer - Hydroplaning Off the Edge of the World
Sharon Van Etten - Southern Life (What It Must Be Like)
Andy Bell, Dot Allison, Michael Rother - I'm in Love
Avalon Emerson - Don't Be Seen With Me
Confidence Man, Eliza Rose - I HEART YOU
Marie Davidson - Sexy Clown
Kaitlyn Aurelia Smith - What's Between Us
Whatever The Weather - 12ºC
Claudio Montana, Ultralágrima - El paseo
TRISTÁN! - Life Is A Movie
Baba Stiltz, Okay Kaya - I Believe In Love
Hope Tala - I Can't Even Cry
Saya Gray - LIE DOWN
We're all anxious, and none of us can pay attention. We don't read long books anymore; our kids don't read at all. When we watch TV, we scroll at the same time. And we absolutely cannot be alone with ourselves. These are the symptoms of a modern malaise that is everywhere diagnosed but rarely treated with the dire seriousness it deserves: an epochal sickness that is fundamentally changing the way we relate to each other and to our own minds. What would it take to reclaim control? Chris Hayes — journalist, author, and host of MSNBC's All In — joins to discuss his new book The Sirens' Call: How Attention Became the World's Most Endangered Resource. Together, Chris and the boys theorize how attention replaced information as the defining commodity of modern life. Along the way, we discuss our own struggles with social media addiction, prayer as an ancient technology for organizing attention, the evolutionary origins of attention-seeking, Donald Trump as the "public figure par excellence" of the attention age, and how to fight back against the corporate takeover of our minds. Toward the end, Chris explains how he's navigating hosting his cable show amid another bewildering Trump era, which seems designed to divide and fragment our attention.

Further Reading:
Chris Hayes, The Sirens' Call: How Attention Became the World's Most Endangered Resource (2025)
Simone Weil, Gravity and Grace (1952)
Adam Phillips, Attention Seeking (2022)
Karl Marx, Economic and Philosophic Manuscripts of 1844 (1844)
Kyle Chayka, Filterworld: How Algorithms Flattened Culture (2024)
Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (2019)
Daniel Immerwahr, "What if the Attention Crisis Is All a Distraction?", The New Yorker, Jan 20, 2025

…and don't forget to subscribe to Know Your Enemy on Patreon to listen to all of our premium episodes!
Have our private lives become inevitably political in today's age of social media? Ray Brescia certainly thinks so. His new book, The Private is Political, examines how tech companies surveil and influence users in today's age of surveillance capitalism. Brescia argues that private companies collect vast amounts of personal data with fewer restrictions than governments, potentially enabling harassment and manipulation of marginalized groups. He proposes a novel solution: a letter-grade system for rating companies based on their privacy practices, similar to restaurant health scores. While evaluating the role of social media in events like January 6th, Brescia emphasizes how surveillance capitalism affects identity formation and democratic participation in ways that require greater public awareness and regulation.

Here are the 5 KEEN ON takeaways from the conversation with Ray Brescia:

* Brescia argues that surveillance capitalism is now essentially unavoidable - even people who try to stay "off the grid" are likely to be tracked through various digital touchpoints in their daily lives, from store visits to smartphone interactions.
* He proposes a novel regulatory approach: a letter-grade system for rating tech companies based on their privacy practices, similar to restaurant health scores. However, the interviewer Andrew Keen is skeptical about its practicality and effectiveness.
* Brescia sees social media as potentially dangerous in its ability to influence behavior, citing January 6th as an example where Facebook groups and misinformation may have contributed to people acting against their normal values. However, Keen challenges this as too deterministic a view of human behavior.
* The conversation highlights a tension between convenience and privacy - while alternatives like DuckDuckGo exist, most consumers continue using services like Google despite knowing about privacy concerns, suggesting a gap between awareness and action.
* Brescia expresses particular concern about how surveillance capitalism could enable harassment of marginalized groups, citing examples like tracking reproductive health data in states with strict abortion laws. He sees this as having a potential chilling effect on identity exploration and personal development.

The Private is Political: Full Transcript
Interview by Andrew Keen

KEEN: About 6 or 7 years ago, I hosted one of my most popular shows featuring Shoshana Zuboff talking about surveillance capitalism. She wrote "The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power" — a book I actually blurbed. Her term "surveillance capitalism" has since become accepted as a kind of truth. Our guest today, Ray Brescia, a distinguished professor of law at Albany Law School in New York, has a new book, "The Private is Political: Identity and Democracy in the Age of Surveillance Capitalism." Ray, you take the age of surveillance capitalism for granted. Is that fair? Is surveillance capitalism just a given in February 2025?

RAY BRESCIA: I think that's right. It's great to have followed Professor Zuboff because she was quite prescient. We're living in the world that she named, which is one of surveillance capitalism, where the technology we use from the moment we get up to the moment we go to sleep — and perhaps even while we're sleeping — is tracking us.
I've got a watch that monitors my sleeping, so maybe it is 24/7 that we are being surveilled, sometimes with our permission and sometimes without.

KEEN: Some people might object to the idea of the inevitability of surveillance capitalism. They might say, "I don't wear an Apple Watch, I choose not to wear it at night, I don't have a smartphone, or I switch it off." There's nothing inevitable about the age of surveillance capitalism. How would you respond to that?

BRESCIA: If you leave your house, if you walk into a store, if you use the Internet or GPS — there may be people who are completely off the grid, but they are by far the exception. Even for them, there are still ways to be surveilled. Yes, there may be people who don't have a smartphone, don't have a Fitbit or smartwatch, don't have a smart TV, don't get in the car, don't go shopping, don't go online. But they really are the exception.

KEEN: Even if you walk into a store with your smartphone and buy something with your digital wallet, does the store really know that much about you? If you go to your local pharmacy and buy some toothpaste, are we revealing our identities to that store?

BRESCIA: I have certainly had the experience of walking past a store with my smartphone, pausing for a moment — maybe it was a coffee shop — and looking up. Within minutes, I received an ad pushed to me by that store. Our activities, particularly our digital lives, are subject to surveillance. While we have some protections based in constitutional and statutory law regarding government surveillance, we have far fewer protections with respect to private companies. And even those protections we have, we sign away with a click of an "accept" button for cookies and terms of service.

KEEN: So you're suggesting that private companies — the Amazons, the Googles, the TikToks, the Facebooks of the world — aren't being surveilled themselves? It's only us, the individual, the citizen?

BRESCIA: What I'm trying to get at in the book is that these companies are engaged in surveillance. Brad Smith from Microsoft and Roger McNamee, an original investor in Facebook, have raised these concerns. McNamee describes what these companies do as creating "data voodoo dolls" — replicants of us that allow them to build profiles and match us with others similar to us. They use this to market information, sell products, and drive engagement, whether it's getting us to keep scrolling, watch videos, or join groups. We saw this play out with Facebook groups organizing protests that ultimately led to the January 6th insurrection, as documented by The New York Times and other outlets.

KEEN: You live up in Hastings-on-Hudson and work in Albany. Given the nature of this book, I can guess your politics. Had you been in Washington, D.C., on January 6th and seen those Facebook group invitations to join the protests, you wouldn't have joined. This data only confirms what we already think. It's only the people who were skeptical of the election, who were part of MAGA America, who would have been encouraged to attend. So why does it matter?

BRESCIA: I don't think that's necessarily the case. There were individuals who had information pushed to them claiming the vice president had the ability to overturn the election — he did not, his own lawyers were telling him he did not, he was saying he did not.
But people were convinced he could. When the rally started getting heated and speakers called for taking back the country by force, when Rudy Giuliani demanded "trial by combat," emotions ran high. There are individuals now in jail who are saying, "I don't want a pardon. What I did that day wasn't me." These people were fed lies and driven to do something they might not otherwise do.

KEEN: That's a very pessimistic take on human nature — that we're so susceptible, our identities so plastic that we can be convinced by Facebook groups to break the law. Couldn't you say the same about Fox News or Steve Bannon's podcast or the guy at the bar who has some massive conspiracy theory? At what point must we be responsible for what we do?

BRESCIA: We should always be responsible for what we do. Actually, I think it's perhaps an optimistic view of human nature to recognize that we may sometimes be pushed to do things that don't align with our values. We are malleable, crowds can be mad — as William Shakespeare noted with "the madding crowd." Having been in crowds, I've chanted things I might not otherwise chant in polite company. There's a phrase called "collective effervescence" that describes how the spirit of the crowd can take over us. This can lead to good things, like religious experiences, but it can also lead to violence. All of this is accelerated with social media. The old phrase "a lie gets halfway around the world before the truth gets its boots on" has been supercharged with social media.

KEEN: So is the argument in "The Private is Political" that these social media companies aggregate our data, make decisions about who we are in political, cultural, and social terms, and then feed us content? Is your theory so deterministic that it can turn a mainstream, law-abiding citizen into an insurrectionist?

BRESCIA: I wouldn't go that far. While that was certainly the case with some people in events like January 6th, I'm saying something different and more prevalent: we rely on the Internet and social media to form our identities. It's easier now than ever before in human history to find people like us, to explore aspects of ourselves — whether it's learning macramé, advocating in the state legislature, or joining a group promoting clean water. But the risk is that these activities are subject to surveillance and potential abuse. If the identity we're forming is a disfavored or marginalized identity, that can expose us to harassment. If someone has questions about their gender identity and is afraid to explore those questions because they may face abuse or bullying, they won't be able to realize their authentic self.

KEEN: What do you mean by harassment and abuse? This argument exists both on the left and right. J.D. Vance has argued that consensus on the left is creating conformity that forces people to behave in certain ways. You get the same arguments on the left. How does it actually work?

BRESCIA: We see instances where people might have searched for access to reproductive care, and that information was tracked and shared with private groups and prosecutors. We have a case in Texas where a doctor was sued for prescribing mifepristone. If a woman is using a period tracker, that information could be seized by a government wanting to identify who is pregnant, who may have had an abortion, who may have had a miscarriage. There are real, serious risks of abuse and harassment, both legal and extralegal.

KEEN: We had Margaret Atwood on the show a few years ago.
Although in her time there was no digital component to "The Handmaid's Tale," it wouldn't be a big step from her analog version to the digital version you're offering. Are you suggesting there need to be laws to protect users of social media from these companies and their ability to pass data on to governments?

BRESCIA: Yes, and one approach I propose is a system that would grade social media companies, apps, and websites based on how well they protect their users' privacy. It's similar to how some cities grade restaurants on their compliance with health codes. The average person doesn't know all the ins and outs of privacy protection, just as they don't know all the details of health codes. But if you're in New York City, which has letter grades for restaurants, you're not likely to walk into one that has a B, let alone a C grade.

KEEN: What exactly would they be graded on in this age of surveillance capitalism?

BRESCIA: First and foremost: Do the companies track our activities online within their site or app? Do they sell our data to brokers? Do they retain that data? Do they use algorithms to push information to us? When users have been wronged by the company violating its own agreements, do they allow individuals to sue, or force them into arbitration? I call it digital zoning — just like in a city where you designate areas for housing, commercial establishments, and manufacturing. Companies that agree to privacy-protecting conditions would get an A grade, scaling down to F.

KEEN: The world is not a law school where companies get graded. Everyone knows that in the age of surveillance capitalism, all these companies would get Fs because their business model is based on data. This sounds entirely unrealistic. Is this just a polemical exercise, or are you serious?

BRESCIA: I'm dead serious. And I don't think it's the heavy hand of the state. In fact, it's quite the opposite — it's a menu that companies can choose from. Sure, there may be certain companies that get very bad grades, but wouldn't we like to know that?

KEEN: Who would get the good grades? We know Facebook and Google would get bad grades. Are there social media platforms that would avoid the F grades?

BRESCIA: Apple is one that does less of this. Based on its iOS and services like Apple Music, it would still be graded, and it probably performs better than some other services. The social media industry as a whole is probably worse than the average company or app. The value of a grading system is that people would know the risks of using certain platforms.

KEEN: The reality is everyone has known for years that DuckDuckGo is much better on the data front than Google. Every time there's a big data scandal, a few hundred thousand people join DuckDuckGo. But most people still use Google because it's a better search engine. People aren't bothered. They don't care.

BRESCIA: That may be the case. I use DuckDuckGo, but I think people aren't as aware as you're assuming of the extent to which their private data is being harvested and sold. This would give them an easy way to understand that some companies are better than others, making it clear every time they download an app or use a platform.

KEEN: Let's use the example of Facebook. In 2016, the Cambridge Analytica scandal blew up. Everyone knew what Facebook was doing. And yet Facebook in 2025 is, if anything, stronger than it's ever been. So people clearly just don't care.

BRESCIA: I don't know that they don't care. There are a lot of things to worry about in the world right now.
Brad Smith called Cambridge Analytica "privacy's Three Mile Island."

KEEN: And he was wrong.

BRESCIA: Yes, you're right. Unlike Three Mile Island, after which we clamped down on nuclear power, we did almost nothing to protect consumer privacy. That's something we should be exploring in a more robust fashion.

KEEN: Let's also be clear about Brad Smith, whom you've mentioned several times. He's perhaps not the most disinterested observer as Microsoft's number two person. Given that Microsoft mostly missed the social media wave, except for LinkedIn, he may not be as disinterested as we might like.

BRESCIA: That may be the case. We also saw in the week of January 6th, 2021, many of these companies saying they would not contribute to elected officials who didn't certify the election, and that they would remove the then-president from their platforms. Now we're back in a world where that is not the case.

KEEN: Let me get one thing straight. Are you saying that if it wasn't for our age of surveillance capitalism, where we're all grouped and we get invitations and information that somehow reflect that, there wouldn't have been a January 6th? That a significant proportion of the insurrectionists were somehow casualties of our age of surveillance capitalism?

BRESCIA: That's a great question. I can't say whether there would have been a January 6th if not for social media. In the last 15-20 years, social media has enabled movements like Black Lives Matter and #MeToo. Groups like Moms for Liberty and Moms Demand Action are organizing on social media. Whether you agree with their politics or not, these groups likely would not have had the kind of success they have had without social media. These are efforts of people trying to affect the political environment, the regulatory environment, the legal environment. I applaud such efforts, even if I don't agree with them. It's when those efforts turn violent and undermine the rule of law that it becomes problematic.

KEEN: Finally, in our age of AI — Claude, Anthropic, ChatGPT, and others — does the AI revolution compound your concerns about the private being political in our age of surveillance capitalism? Is it the problem or the solution?

BRESCIA: There is a real risk that what we already see on social media — bots amplifying messages, creating campaigns — is only going to accelerate. The AI companies — OpenAI, Anthropic, Google, Meta — should absolutely be graded in the same way as social media companies. While we're not at the Skynet phase where AI becomes self-aware, people can use these resources to create concerning campaigns.

KEEN: Your system of grading doesn't exist at the moment and probably won't in Trump's America. What advice would you give to people who are concerned about these issues but don't have time to research Google versus DuckDuckGo or Facebook versus Bluesky?

BRESCIA: There are a few simple things folks can do. Look at the privacy settings on your phone. Use browsers that don't harvest your data. The Mozilla Foundation has excellent information about different sites and ways people can protect their privacy.

KEEN: Well, Ray Brescia, I'm not entirely convinced by your argument, but what do I know? "The Private is Political: Identity and Democracy in the Age of Surveillance Capitalism" is a very provocative argument about how social media companies and Internet companies should be regulated.
Thank you so much, and best of luck with the book.

BRESCIA: Thanks, it's been a pleasure to have this conversation.

Ray Brescia is the Associate Dean for Research & Intellectual Life and the Hon. Harold R. Tyler Professor in Law & Technology at Albany Law School. He is the author of Lawyer Nation: The Past, Present, and Future of the American Legal Profession and The Future of Change: How Technology Shapes Social Revolutions; and editor of Crisis Lawyering: Effective Legal Advocacy in Emergency Situations; and How Cities Will Save the World: Urban Innovation in the Face of Population Flows, Climate Change, and Economic Inequality.

Named as one of the "100 most connected men" by GQ magazine, Andrew Keen is amongst the world's best known broadcasters and commentators. In addition to presenting the daily KEEN ON show, he is the host of the long-running How To Fix Democracy interview series. He is also the author of four prescient books about digital technology: CULT OF THE AMATEUR, DIGITAL VERTIGO, THE INTERNET IS NOT THE ANSWER and HOW TO FIX THE FUTURE. Andrew lives in San Francisco, is married to Cassandra Knight, Google's VP of Litigation & Discovery, and has two grown children.
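To make Brescia's letter-grade rubric concrete, here is a hypothetical Python sketch. The criteria names and the score-to-grade mapping are illustrative assumptions loosely drawn from his list in the interview (tracking, selling to brokers, retention, algorithmic pushing, forced arbitration), not a system defined in the book.

```python
# Hypothetical sketch of a privacy letter-grade rubric in the spirit of
# Brescia's "digital zoning" proposal. Criteria and weighting are
# illustrative assumptions, not the book's actual methodology.

CRITERIA = [
    "tracks_user_activity",      # tracks activity within the site or app
    "sells_data_to_brokers",     # sells personal data to data brokers
    "retains_user_data",         # retains user data long-term
    "pushes_algorithmic_content",  # uses algorithms to push content
    "forces_arbitration",        # forces wronged users into arbitration
]

def privacy_grade(practices: dict) -> str:
    """Map a company's privacy practices to a restaurant-style letter grade."""
    # One point for each invasive practice the company does NOT engage in.
    score = sum(not practices[c] for c in CRITERIA)
    return {5: "A", 4: "B", 3: "C", 2: "D"}.get(score, "F")

# Hypothetical example: a platform engaging in all five invasive practices.
platform = {c: True for c in CRITERIA}
print(privacy_grade(platform))  # -> F
```

The design mirrors the restaurant-grade analogy: the consumer never sees the checklist, only a single letter that compresses it.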
Explore Shoshana Zuboff's insightful work on surveillance capitalism and discover digital strategies to safeguard your privacy in today's exploitative digital landscape. Gain practical advice and insights into privacy, agency, and ownership to empower your online presence.

0:00: Introduction and sponsor acknowledgment
0:48: Discussion on privacy, agency, and ownership
6:10: Exploration of The Age of Surveillance Capitalism
13:25: Current state of surveillance and digital exploitation
24:06: Strategies and guide to fight back against digital exploitation
43:40: Information on Patreon support
In this episode I take up the concept of surveillance capitalism, popularized by Shoshana Zuboff, through its ethical and social implications, using the Cambridge Analytica case and the use of data to train AI tools as examples.

Understand the political data-use scandal that knocked down Facebook's value and put it in the authorities' crosshairs: https://g1.globo.com/economia/tecnologia/noticia/entenda-o-escandalo-de-uso-politico-de-dados-que-derrubou-valor-do-facebook-e-o-colocou-na-mira-de-autoridades.ghtml
In a timely challenge to the potent political role of digital technology, Cyberlibertarianism: The Right-Wing Politics of Digital Technology argues that right-wing ideology was built into both the technical and social construction of the digital world from the start. Leveraging more than a decade of research, David Golumbia, who passed away in 2023, traced how digital evangelism has driven a worldwide shift toward the political right, concealing inequality, xenophobia, dishonesty, and massive corporate concentrations of wealth and power beneath the idealistic presumption of digital technology as an inherent social good. George Justice wrote the foreword to Cyberlibertarianism, and is joined in conversation with Frank Pasquale.

George Justice is professor of English literature and provost at the University of Tulsa.

Frank Pasquale is professor of law at Cornell Tech and Cornell Law School.

David Golumbia (1963–2023) was associate professor of English at Virginia Commonwealth University and author of Cyberlibertarianism: The Right-Wing Politics of Digital Technology; The Politics of Bitcoin: Software as Right-Wing Extremism; and The Cultural Logic of Computation.

EPISODE REFERENCES:
Tim Wu
Lawrence Lessig
Wikileaks
David E. Pozen / Transparency's Ideological Drift: https://openyls.law.yale.edu/handle/20.500.13051/10354
Stefanos Geroulanos / Transparency in Postwar France
#CreateDontScrape
David Golumbia / ChatGPT Should Not Exist (article)
M. T. Anderson / Feed
Jonathan Crary / Scorched Earth

"If you want to understand the origins of our information hellscape with its vast new inequalities, corrupt information, algorithmic control, population-scale behavioral manipulation, and wholesale destruction of privacy, then begin here." —Shoshana Zuboff, author of The Age of Surveillance Capitalism

"Cyberlibertarianism is essential for understanding the contemporary moment and the recent past that got us here. It stands as a monumental magnum opus from a meticulous thinker and sharp social critic who is sorely missed." —Sarah T. Roberts, director, Center for Critical Internet Inquiry, UCLA

Cyberlibertarianism: The Right-Wing Politics of Digital Technology is available from University of Minnesota Press.
We dive into Shoshana Zuboff's book The Age of Surveillance Capitalism. Full of amazing insights and predictions, you can scan almost any page and read something fascinating. You don't need the book to follow today's discussion. We start by watching Apple's new iPad ad before we dive into the book, and I highly recommend that you watch it as well (link in the show notes). It's a good tie-in to the surveillance capitalism discussion, and I think you will enjoy our commentary about it.

References:
https://www.theverge.com/2024/5/8/24152236/apple-ipad-pro-commercial-artists-ai
https://www.amazon.com/gp/product/1541758005/ref=ppx_yo_dt_b_search_asin_title

Transcript: https://otter.ai/u/Y-bm0QL3Vnfjcy4hgmTIiqkUpNU?utm_source=copy_url
Duration: 00:03:28 - Un monde connecté - by François Saltiel - In her book "L'âge du capitalisme de surveillance" (The Age of Surveillance Capitalism), Shoshana Zuboff denounces surveillance capitalism, shining a light on the GAFAM companies' extraction of personal data for targeted advertising and underlining its impact on privacy and democracy.
On this episode of Rehash, we're speaking with Zoe Weinberg, Founder and Managing Partner at ex/ante, the first venture fund dedicated to agentic tech.

We start our conversation by getting a little insight into Zoe's background, which is unusual but lends itself well to the work she's doing today in privacy and human agency. She shares how her past humanitarian work in conflict zones and developing nations opened her eyes to issues around surveillance capitalism, and how she first realized crypto could change individuals' lives in meaningful ways when she met a group of Bitcoin miners in Iraq during the conflict in which the city of Mosul was retaken from the Islamic State.

We then dive into some big topics around agentic tech, including user control, consent, privacy, and online (and onchain) data sharing. Zoe envisions a world where humans have full agency over how their information and data are used, and we talk about what it might take for us to get there.

COLLECT THIS EPISODE
https://www.rehashweb3.xyz/

FOLLOW US
Rehash: https://twitter.com/rehashweb3
Diana: https://twitter.com/ddwchen
Zoe: https://twitter.com/zweinberg

LINKS
The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power by Shoshana Zuboff: https://www.amazon.com/Age-Surveillance-Capitalism-Future-Frontier/dp/1610395697
Privacy Party (prev. Block Party): https://www.blockpartyapp.com/#privacyparty/
ex/ante Substack: https://buildexante.substack.com/

TIMESTAMPS
0:00 Intro
2:35 Zoe's background
5:52 When crypto saves lives
10:24 How surveillance capitalism has developed over time
15:31 State vs corporate surveillance capitalism
18:02 Will online privacy regulations improve over time?
23:07 What is agentic tech?
29:14 What impact can agentic tech have in our lives?
34:40 How do user control and consent fit into a public blockchain?
36:46 Examples of agentic tech solutions
45:14 Ideal end state if agentic tech succeeds
48:00 Can You Not
49:40 Follow Zoe and ex/ante

DISCLAIMER: The information in this video is the opinion of the speaker(s) only and is for informational purposes only. You should not construe it as investment advice, tax advice, or legal advice; it does not represent any entity's opinion but those of the speaker(s). For investment or legal advice, please seek a duly licensed professional.
Surveillance capitalism is ubiquitous. If we're not being watched by Google or Facebook, then we are watching movies warning about how these digital platforms are watching us. David Donnelly's new documentary, COST OF CONVENIENCE, trots out all the familiar charges that we've heard over the years from KEEN ON guests like Shoshana Zuboff, Jaron Lanier, Nick Carr and Roger McNamee. It's good stuff, I guess, even if we've heard these existential warnings many times before. The problem is what to do about it. Like most Silicon Valley critics, Donnelly's fixes - from more education and regulation to greater self-control - aren't very realistic. Ultimately, I guess, we'll find something else to worry about. The real question, however, is if we forget about the screen, will the screen forget about us?

DAVID DONNELLY is an American filmmaker renowned for his impactful documentaries in the classical music realm, notably his award-winning debut, Maestro, featuring stars like Paavo Järvi, Joshua Bell, Hilary Hahn, and Lang Lang. The film, translated into multiple languages and broadcast worldwide, is highly regarded as an educational tool in music education. Following Maestro, Donnelly directed Nordic Pulse and Forte, completing a trilogy offering an unparalleled glimpse into classical music. His work, newly relevant amid the invasion of Ukraine, includes narratives on Estonia's Singing Revolution, showcasing the depth of his storytelling. Donnelly's films have been showcased at prestigious venues like the Whitney Museum and the Kennedy Center, underlining his status in both the art and film communities. In 2021, he co-founded CultureNet and announced The Cost of Convenience, the first in a new trilogy exploring technology's cultural implications. Donnelly's career extends beyond filmmaking; he's a sought-after speaker, sharing insights from interviews with global thought leaders across more than 30 countries.
Benjamin Bratton writes about world-spanning intelligences, grinding geopolitical tectonics, "accidental megastructures" of geotechnical cruft, the millennia-long terraforming project through which humans rendered an earth into a world, and the question of what global-scale order means in the twilight of the Westphalian nation-state.

Candidly, if either of us were to recommend a book to help you understand the present state of 'politics' or 'technology', we'd probably start with Bratton's The Stack — written 10 years ago, but still very much descriptive of our world and illuminative of its futures.

If the first 10 minutes are too "tech industry" for you — just skip ahead. The whole conversation is seriously fire, and it spikes hit after hit of takes on privacy, bias, alignment, subjectivity, the primacy of the individual… all almost entirely unrepresented within the Discourse.

Some references:

We briefly talk about EdgeML, which essentially means the execution of ML models on small computers installed in a field location.

Benjamin mentions his collaboration with renowned computer scientist and thinker Blaise Agüera y Arcas, whose work on federated learning is relevant to this stage of the conversation. Federated learning is a distributed training approach in which a model is updated by field devices that transmit only their changes to the model, so local training data never leaves their own environments (see the sketch at the end of these notes). Also, here's a link to their collaboration, "The Model is the Message."

Benjamin calls himself a bit of an "eliminative materialist" "in the Churchland mode," meaning someone who believes that "folk psychologies" or "folk ontologies" (theories of how the mind works from metaphysics, psychoanalysis, or generalized psychology) will be replaced by frameworks from cognitive science or neuroscience.

Benjamin calls out a collaboration with Chen Qiufan. Check out Waste Tide — it's excellent sci-fi.

The collaboration with Anna Greenspan and Bogna Konior discussed in the pod is called "Machine Decision is Not Final," out on Urbanomic.

Shoshana Zuboff is a theorist who coined the term "surveillance capitalism," referring to capital accumulation through a process of 'dispossession by surveillance.' The implicit critique of "surveillance capitalism" in this episode hinges on its overemphasis on individual sovereignty.

"Tay" was the infamous AI Twitter chatbot Microsoft rolled out for 16 hours before pulling it back for its controversial content.

Antihumanism refers to a rejection of the ontological primacy and universalization of the human afforded to it through the philosophical stance of "humanism." An "antihumanist" is someone who challenges the stability of the concept of the "human," or at the very least its salience in cosmic affairs.

Check out Benjamin's new piece on Tank Mag (Tank.tv), it's fire. And check out Anna Kornbluh's AWESOME "Immediacy, or The Style of Too Late Capitalism" on Verso.
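For the technically curious, here is a minimal sketch of the federated-averaging idea behind the federated learning mentioned above. It is an illustrative toy (a linear model trained with plain gradient steps), not Agüera y Arcas's actual system; real deployments add secure aggregation, client sampling, and much more.

```python
import numpy as np

# Minimal sketch of federated averaging: each client computes an update
# against its own private data and transmits only the model delta; the
# server averages the deltas into the shared model, so raw training data
# never leaves the edge device.

def local_update(weights, X, y, lr=0.1):
    """One gradient step of least-squares regression on a client's private data."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return -lr * grad  # only this delta is sent to the server

rng = np.random.default_rng(0)
weights = np.zeros(3)  # shared global model
# Five clients, each holding 20 private examples that never leave the client.
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(5)]

for _ in range(50):  # communication rounds
    deltas = [local_update(weights, X, y) for X, y in clients]
    weights += np.mean(deltas, axis=0)  # server aggregates updates only

print(weights)
```

The privacy-relevant design choice is in what crosses the network: model deltas rather than data, which is exactly the inversion of the surveillance-capitalist pattern discussed in the episode.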
The product. Secret hat time with Sgoti. Source: What is a "product"? noun; Something produced by human or mechanical effort or by a natural process. noun; An item that is made or refined and marketed. Supporting Source: The Product model: ... In addition, a specific unit of a product is often (and in some contexts must be) identified by a serial number, which is necessary to distinguish products with the same product definition. Supporting Source: What is Commodification? Within a capitalist economic system, commodification is the transformation of things such as goods, services, ideas, nature, personal information, people or animals into objects of trade or commodities. A commodity at its most basic, according to Arjun Appadurai, is "anything intended for exchange," or any object of economic value. Commodification is often criticized on the grounds that some things ought not to be treated as commodities - for example, water, education, data, information, knowledge, human life, and animal life. Supporting Source: What is the Attention economy? Attention economics is an approach to the management of information that treats human attention as a scarce commodity and applies economic theory to solve various information management problems. According to Matthew Crawford, "Attention is a resource - a person has only so much of it." Thomas H. Davenport and John C. Beck add to that definition: Attention is focused mental engagement on a particular item of information. Items come into our awareness, we attend to a particular item, and then we decide whether to act. Supporting Source: What is Surveillance capitalism? Surveillance capitalism is a concept in political economics which denotes the widespread collection and commodification of personal data by corporations. This phenomenon is distinct from government surveillance, though the two can reinforce each other. The concept of surveillance capitalism, as described by Shoshana Zuboff, is driven by a profit-making incentive, and arose as advertising companies, led by Google's AdWords, saw the possibilities of using personal data to target consumers more precisely. Source: AirTags are being used to track people and cars... Source: Exposure Notifications: Help slow the spread of COVID-19... Source: FTC: Ring employees spied on users; cameras were unsecure... Source: FDA Takes Additional Action in Fight Against COVID-19 By Issuing Emergency Use Authorization... This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Season 2 is here! On this episode, host Kristen Collins chats with Christopher Coyne about the history of the surveillance state from the early 20th century to now, and about surveillance capitalism, in which user data is sold or used for advertising targeting. They also discuss foreign intervention, interdisciplinary work on surveillance, his book Tyranny Comes Home: The Domestic Fate of U.S. Militarism, and more. Christopher J. Coyne is associate director of the F. A. Hayek Program for Advanced Study in Philosophy, Politics, and Economics and F. A. Harper Professor of Economics at the Mercatus Center at George Mason University, and a Professor of Economics at George Mason University. Read more work from Kristen Collins. Works mentioned include: Mary Dudziak's War Time: An Idea, Its History, Its Consequences; Eric A. Posner and Adrian Vermeule's Terror in the Balance: Security, Liberty, and the Courts; Shoshana Zuboff's The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power; and Kenneth Boulding's The Image: Knowledge in Life and Society. If you like the show, please subscribe, leave a 5-star review, and tell others about the show! We're available on Apple Podcasts, Spotify, Amazon Music, and wherever you get your podcasts. Follow the Hayek Program on Twitter: @HayekProgram. Learn more about Academic & Student Programs. Follow the Mercatus Center on Twitter: @mercatus
Join the MEMBERSHIP package and conquer the Spider's Den (Động Nhện) today: https://b.link/spiderum-membership Join the group Tiền ở đâu - Đầu ở đó to share and learn useful knowledge about economics and finance: https://b.link/yt-tien-o-dau __ There are already quite a few articles on measures to reduce the negative impact of social media on our personal lives. However, I believe those articles only treat the symptoms; we have never truly reached the root of the problem, which is why the proposed remedies tend to be temporary and not genuinely effective. The idea for this piece comes from chapter 16 of "The Age of Surveillance Capitalism" by the Harvard professor Shoshana Zuboff. Through this article by the author Andy, published on the Spiderum website, let's explore the results of that research. __ Spiderum's impressive bookshelf: https://shope.ee/6KbpEZS9D2 Books you may be interested in: - Người trong muôn nghề - Định hướng nghề nghiệp toàn diện: https://shope.ee/AURO9YQc3A - Người trong muôn nghề: Ngành IT có gì?: https://shope.ee/9pBhMKT9Oy - Người trong muôn nghề: Ngành Kinh tế có gì? - Tập 1: https://shope.ee/9UYqxiUQ4w - Người trong muôn nghề: Ngành Kinh tế có gì? - Tập 2: https://shope.ee/9KFQlPV3Pv - Người trong muôn nghề: Ngành Sáng tạo - Nghệ thuật có gì?: https://shope.ee/9zV7YdSW47 - Người trong muôn nghề: Ngành Xã hội - Nhân văn có gì?: https://shope.ee/5pfYayiNWK - Mùi mẹ - Món quà dành tặng người phụ nữ yêu thương: https://shope.ee/6AIOzah6qU - DevUP - Phát triển toàn diện sự nghiệp lập trình viên: https://shope.ee/9esHA1Tmjx - Seneca: Những Bức Thư Đạo Đức – Chủ Nghĩa Khắc Kỷ Trong Đời Sống - Tập 1: https://shope.ee/6zrW08ngb2 - Seneca: Những Bức Thư Đạo Đức – Chủ Nghĩa Khắc Kỷ Trong Đời Sống - Tập 2: https://shope.ee/A9oXkwRsj8 - Mở khóa thương mại điện tử Việt Nam: https://shope.ee/5V2iCMjeCI - Doing Good Better - Làm việc thiện đúng cách: https://shope.ee/6KbpBtgTVV - Động lực nội tại - Làm sao để yêu công việc và đạt đến thành công: https://shope.ee/6UvFOCfqAW - Bước ra thế giới: Cẩm nang du học và săn học bổng: https://shope.ee/5fM8Ofj0rJ - Chuyện người chuyện ngỗng (Vũ Hoàng Long): https://shope.ee/4AXKcUjKAQ __ Catch engaging, knowledge-packed conversations on the Talk Sâu channel: https://b.link/talksau Listen to stories from the world of work on the Người Trong Muôn Nghề podcast: https://b.link/NTMN-Podcast ______________ Article written by: Andy Luong Article link: https://spiderum.com/bai-dang/Mang-xa... ______________ Narration: Nguyễn Lê Minh Thi Editor: YOSHY ______________ Video copyright: Spiderum Music copyright: YouTube Audio Library, Epidemic Sound ______________ --- Send in a voice message: https://podcasters.spotify.com/pod/show/spiderum/message Support this podcast: https://podcasters.spotify.com/pod/show/spiderum/support
Does technological progress automatically translate into higher wages, better standards of living, and widely shared prosperity? Or is it necessary to steer the development of technological improvement to ensure the benefits don't accrue only to the few? In a new book, two well-known economists argue the latter. I'm joined in this episode by one of the authors, Simon Johnson. Simon is the Kurtz Professor of Entrepreneurship at MIT. He and Daron Acemoglu are authors of the new book Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity. Simon is also co-author with Jonathan Gruber of 2019's Jump-Starting America, now out in a new paperback.
In This Episode
* Is America too optimistic about technology? (1:24)
* Ensuring progress is widely shared (11:10)
* What about Big Tech? (15:22)
* Can we really nudge transformational technology? (19:54)
* Evaluating the Biden administration's science policy (24:14)
Below is an edited transcript of our conversation.
Is America too optimistic about technology?
James Pethokoukis: Let me start with a sentence or two from the prologue: "People understand that not everything promised by Bill Gates, Elon Musk, or even Steve Jobs will likely come to pass. But, as a world, we have become infused by their techno-optimism. Everyone everywhere should innovate as much as they can, figure out what works, and iron out the rough edges later." Later, you write that we are living in a "blindly optimistic" age. And yet I see a lot of pessimism about AI. A very high percentage of people want an AI pause. People are very down on the concept of autonomous driving. They're very worried that these new technologies will only make climate change worse. We don't seem techno-optimistic to me, and we certainly don't see it in our media. So let me start with this: why do you think we're techno-optimistic right now, outside of Silicon Valley?
Simon Johnson: Well, Silicon Valley is a very influential culture, as you know, nationally and internationally. So I think there's a deep-running techno-optimistic trend, Jim. But I also think you put your finger on something very important. Since we finished the book and turned in the final version in November, the advance of ChatGPT and our increased awareness that this is not science fiction — this is actual, this is real, and the people who are developing this stuff have no idea how it works, for example — has produced, I wouldn't call it pessimism, but a moment of hesitation and concern. So good, let's have the discussion now about what we're inventing, and why, and whether we can put it on a better path.
When I think about the past periods where it seemed like there was a lot of tech progress that was reflected in our economic statistics, whether it's productivity growth or economic growth more broadly, those were also periods where we saw very rapid wage growth that people think fondly about. I would love to have a repeat of 1995-2000. If we had technologies that could manage that kind of impact on the economy, what would be the downside? It seems like that would be great.
I would love a repeat of the Henry Ford experience, actually, Jim. Henry Ford, as you know, automated the manufacturing of cars. We went from producing tens of thousands of cars in the US to, 30 years later, producing millions of cars because of Ford's automation.
But at the same time, Ford and all the people around him — a lot of entrepreneurs, of course, working with Ford and rivals to Ford — created a lot of new jobs, new tasks. And that's the key balance. When you automate, when you have a big phase of automation (and we did have another one during and after World War II), we also created a lot of new tasks, new jobs. Demand for labor was very strong. And I think it's that balance we need. A lot of the concerns, the justified concerns about AI you were mentioning a moment ago, are about losing jobs very quickly, faster than we can create other tasks, jobs, and demand for labor in other, non-automating parts of the economy.
Your book is a book of deep economic history. It's the kind of book I absolutely love. I wonder if you could give us a bit of a flavor of the history in this book: what's interesting about those two subjects and how they interact?
We tried to go back as far as possible in economic and human history, recorded history, to understand technological transformations. Big ones. And it turns out you can go back about 1,000 years with quite reliable information. There are some things you can say about earlier periods, though that's a little more speculative, to be honest. But 1,000 years is a very interesting time period, Jim, because, as you know, that's pretty much the rise-of-Europe timeframe. A thousand years ago, Europe was a nothing place on the edge of a not very important part of one continent. And through a series of technological transformations, which took a long time to get going — and that's part of the medieval story that we explore — [there was] a huge amount of innovativeness in those societies. But it did not translate into shared prosperity, and it was very stop-start. I'm talking about over a period of centuries.
Then, eventually, we get the Industrial Revolution, initially in Britain, in England, but shared fairly quickly around northwest Europe: individual entrepreneurship, private capital, private ownership, markets as a dominating part of how you organize the economy. And eventually, not immediately, but eventually, that becomes the basis for shared prosperity. And of course, that becomes the basis for American society. And the Americans, by the 1850s to 1880s, depending on how you want to cut it, had actually figured out industrial technology and boosted the demand for labor more than the Europeans ever imagined. Then the Americans were in the lead, and we had a very good 20th century combining private capital and private innovation with some, I would say, selective public interventions where private initiative didn't work. And this actually carried a lot of countries, including countries in that European tradition, through to around 1980. Since 1980, it's become much more bumpy. We've had a widening of income inequality and much more questioning of the economic and political model.
Going back into the history: oftentimes people treat the period before the steam engine and the loom as a period of no innovation. But there was innovation. It just didn't have the same impact, and it wasn't sustained. We were doing things as a society before the Industrial Revolution. There was progress.
There was technological progress, technological change. Absolutely.
The compass, the printing press, gunpowder — these are advances.
Right. The Europeans, of course, were sort of the magpies of the world at that point. A lot of those innovations began in China. Some of them began in the Arab world.
But the Europeans got their hands on them and used them, sometimes for military purposes. They figured out civilian uses as well. But they were very innovative. Some people got rich in those societies, but only a very few people, mostly the kings and their hangers-on and the church. Broad shared prosperity did not come through, because it was mostly forced labor. People did not own their labor. There was some private property, but there weren't individual rights of the kind that we regard as absolutely central to prosperity in the United States. They're in the Constitution for a reason: our ancestors were coming out of feudalism and the remains of that feudal system, and they were escaping from it. So they said, "Let's enumerate those rights and make sure we don't lose them." That's coming out of 800 years of hard-learned history, I would say, at that point. And that's one reason why, not at the moment of independence but within 50 to 70 years, the American economy was really clicking and innovating and breaking through on multiple technologies and sharing prosperity in a way that nobody had ever seen before in the world.
Before that period in the 1800s, the problem was not the occasional good idea that changed something or made somebody rich; it was having sustained progress, sustained prosperity that eventually spread out wide among the people.
Absolutely. And I think it was a question of who benefited and who was empowered and who could go on and invent the next things. Joel Mokyr, who's an economic historian at Northwestern, one of our favorite authors, has written about the sort of revolution of tinkerers. And that's actually my family history. My family, as far back as we can go, were carpenters out of Chesterfield in the north of England. They made screws for a hundred years, starting in the mid-19th century, in Sheffield. They would employ a couple of people at any one time. Maybe no more than eight, maybe as few as two. They probably initially polished blades of knives and eventually ended up making specialized screws. But very, very small scale. There was not a lot of formal education in the family or among the workforce, but there were all kinds of relationships with other manufacturers. It was being plugged into that community. Alfred Marshall talked about these clusters and cities of regional entrepreneurship. That's exactly where I'm from. So, yes, I think that was a really key breakthrough: having the institutions, the politics, and the social pressure that could sustain that kind of economic initiative.
In the middle of the Industrial Revolution, late 1800s, what were the changes that made sure the gains from this economic progress were widely shared?
If we're talking about the United States, of course, the key moment is the mechanization of agriculture, particularly across the West. People left their farms in Nebraska or somewhere and moved to Chicago to work for McCormick, making the reapers that allowed more people to leave their farms. So you needed a couple of things for that. One was, of course, better sanitation and basic infrastructure in the big cities. Chicago grew from nothing to one of the largest cities in the world in a period of about a decade and a half. That requires infrastructure that comes from local government. And then there's the key piece, Jim, which is education. There was what's known as the "high school movement." Again, very local.
I don't think the national government knew much about it until it was upon them. [It was] pushing to educate more people in basic literacy and numeracy and to be better workers. At the same time, we did have from the national government, particularly in the context of the Civil War, the land-grant universities — of which MIT is very proudly one, by the way, and one of only two that became private, for various reasons. But we were initially founded to support the manufacturing arts in Massachusetts. That was a state initiative, but it was made possible by a funding arrangement, a land swap, actually, with the federal government.
Ensuring progress is widely shared
The kinds of interventions you've already mentioned — education and infrastructure — seem like very non-controversial, public-good kinds of things. How do those kinds of interventions translate into the 2020s and 2030s in advanced countries, including the United States? Do we need to do something different from those?
Well, I think we should do those, particularly education, better and more, and update them really quickly. I think people are going to agree on that in principle; there may be argument about how exactly you do it. I do think there are three things that should be on the table for potential serious discussion and even potential bipartisan agreement. The first is what Jaron Lanier calls "data dignity," which is basically [the idea that] you and I should own the data that we produce. This is an extension of private property rights from the right of the political spectrum; the left would probably have other terminology for it. But what's basically happening, and where the value is being created in these large language models, is that those models are taking data they find for free (actually, it's not really free, but it's not well protected on the internet) and using it to train these very large models. And it's that training process that's generating, already, and will generate even more, huge value and potential monopoly power for incumbents. So Jaron's point is: that's not right. Let's have a proper organization and recognition of property rights, and you can pay for the data. That also gives consumers the ability to bargain with these large monopolies, potentially, to get them to develop some technologies rather than others.
The second thing is surveillance. I think everyone on the right and the left should be very uncomfortable with where we are on surveillance, Jim, where we've already slipped to on surveillance, and also where AI is going to take us. Shoshana Zuboff has a great book, The Age of Surveillance Capitalism, on exactly this, going through where we are in the workplace and where we are in our society. And then of course there's China and what they're doing in terms of surveillance, which I'm sure we're not going to do. In fact, I think the next division of the world may be between the low-surveillance or safeguarded-surveillance places, which I hope will include the US, and the high-surveillance places, which will be pretty much the authoritarian places, I would suggest. That's a really different approach to the technology of how you interact with workers, citizens, everybody in all their various roles in life.
The third one we're probably not going to agree on right away, but I do want us to have some serious discussion about it, is corporate taxation.
Kim Clausing from UCLA, a former senior Treasury person, points out that we do have a graduated corporate tax system in the US, but bigger companies pay less. Smaller companies' effective tax rate is higher than bigger companies', because the big ones move their profits around the globe. That's not fair and that's not right. And she proposes that we tax mega-profits, above $10 billion, for example, at a higher rate than we tax smaller profits, to give the big companies that are very successful and very profitable an incentive to make themselves smaller. The reason I like Kim's proposal is that I want competition, not just between companies directly in terms of what they're offering, but also between business models and mental models. And I think what we're getting too much of from Microsoft and Google and the others who are likely to become the big players is machine intelligence, as they call it, which basically means replacing people as much as possible. We argue for machine usefulness, which is also, by the way, a strong tradition in computer science — it's not the ascendant tradition or ascendant idea right now — that is, focusing technology on making humans more effective. Like this Zoom call is making us more effective: we didn't have to get ourselves into the same room, we are able to leverage our time, we're able to organize our lives differently.
Find those kinds of opportunities, particularly for lower-income workers. We are not getting that right now because we lack competition, I think, in the development of these models; too much is concentrated, Jim. You joked at the beginning that Silicon Valley has the only optimists. Maybe that's true, but they're the optimists that matter, because they're the ones who control the development of the technology. Almost all those strings are in their hands right now, and you need to give them an incentive to give up some of that. I'm sure we can agree that having the government, or the courts, break things up is going to be a big mess and not where we want to go.
What about Big Tech?
Doesn't it suggest caution, as far as worrying about corporate size or breaking up these companies, that these big advances, which could revolutionize the economy, are coming from the very companies you're worried about and are interested in breaking up? Doesn't it argue that they're doing something right, if they're the source of this great innovation, which may be one of the biggest innovations of our lives?
Yes, potentially. We're trying to be modest and we're trying to be careful here, Jim. We're saying: if you make these really big profits, you pay the higher tax rate. And then you have a conversation with your shareholders about whether you really need to be so big. When Standard Oil was broken up before World War I, it was broken into 25 or 26 pieces, and Rockefeller became richer. That created value for shareholders. More competition was also good; I think we can say safely at this distance that it was good for consumers. Competition for consumers is something I think we should always attempt to pursue, but competition in mental models, competition for ideas, getting more plurality of ideas out there in the tech sphere — I think that's really important, Jim. I believe this can be — and we wrote the book in part because we believe it is — a very big moment in the technological choices that we humans have made and will continue to make. This is a big one.
But if it's all in the hands of a few people, we're less likely to get better outcomes than if it's in the hands of hundreds of people or thousands of people. More competition for ideas, more competition to develop ways to make machines and algorithms useful to people. That's our focus.
You have OpenAI, a company Microsoft has invested in, and Google/Alphabet is working on their version. And I think now you have Facebook and Amazon devoting more resources. Elon Musk is talking about creating his own version. Plus you have a lot of companies taking those models and doing things with them. It seems like there are a lot of things going on, a lot of ferment. It doesn't seem to me like the kind of staid business environment where you have one or two companies doing something. It seems like a fairly vibrant innovation ecology right now.
Of course, if you're right, Jim, then nobody is going to make mega excess profits, and then we don't have to worry about the tax rate proposal that I made. My proposal, or Kim's proposal, would have bite only if there are a couple of very big winners that make hundreds of billions of dollars. I'm not a computer scientist, I'm an economist, but it seems…
Right, but it seems like those mega profits might be competed away, so I'd be careful about breaking up Google right now into eight Googlettes.
Fine. I'm not trying to break them up. I'm saying give them a tax system so they confront that incentive and can discuss it with their shareholders. The people who follow this closely, my computer science colleagues at MIT, for example, feel that Microsoft and OpenAI are in the lead by some distance. Google, which is working very closely with Anthropic, the company that broke away from OpenAI, is probably either a close second or a slightly distant second. It's sort of like Manchester City versus the rest of the Premier League right now. But the others you mentioned, Facebook, Amazon, are some years behind. And years are a big deal here. Elon Musk, of course, proposed a pause in AI development and then suggested he get to launch his own AI business — I suppose to take advantage of the pause.
That's a little suspicious.
There's not going to be a pause. And there's not going to be a pause in part because we know that China is developing AI capabilities. While I am not arguing for confrontation with China over this or other things necessarily, we do have to be cognizant that there's a major national security dimension to this technology. And it is not in the interest of the United States to fall behind anyone. And I'm sure the Chinese are having the same discussion. That's going to keep us going pretty much full speed. And I think it is also the case that many corporate executives can see this is a potential winner-take-all market. On the applications side, the thinking is that we're going to be talking very soon about a sort of supply chain, where you have these fundamental large language models, the [General-Purpose Technology] type, at the bottom, and then people can build applications on top of them. Which would make a lot of sense, right? You can focus on healthcare, you can focus on finance, but you'll be choosing between, right now it looks like, one or two of the large language models.
Which does suggest really big upstream profits for those fundamental suppliers, just like how Microsoft has been making money since the mid-1980s, really.
Can we really nudge transformational technology?
With an important technology that will evolve in directions we can't predict, can we really nudge it with a little bit of tax policy, equalizing tax rates on capital and labor? Can we really nudge it in the kind of direction that we might want? If generative AI, or machine learning more broadly, is as significant as some people say, including folks at MIT and Stanford, I just wonder if we're really operating at the margins here. The technology is going to be what the technology is. Maybe you make sure we can retrain people, and we can change education, and maybe we need to worry a bit about taxing this profit away if you're worried about corporate power. But as far as how the technology interacts with the workplace and the tasks people do, can we really influence it that much?
I think that's the big question of the day, Jim. Absolutely. This is a book, not a policy memo, because we feel that the bigger issue is to have the discussion. To confront the question, as you pose it, and to discuss: what do we as a society want? How do we develop the technology that we need? Are we solving the problems that we really want to solve? Historically, of course, we didn't have many of those conversations. But we weren't as rich then as we are now. Hopefully we're more aware of our history now and more aware of the impact of these choice points. So it's exactly to have that discussion and to say: if this is as big as people say, how are we going to move it in various directions?
I like, as you know, to propose specific policy. I do think, particularly in Washington, it's the specifics that people want to seize on. What do we mean by surveillance? What do we mean by safeguards over surveillance? How could you operationalize protections against excessive surveillance? By whom? By employers, by the police, by companies from whom you buy stuff? By your local government? That conversation still needs to be had. And it's a very big, broad conversation. So let's have it quickly, because the technology is moving very quickly.
What lessons should we draw from the more recent history of concerns about technology? I think of nuclear technology, where there were lots of concerns and we passed lots of rules. We basically paused that technology. And now we're sitting here in the 2020s worried about climate change. That, to me, is a recent, powerful example of the dangers of trying to slow or delay a technology that may evolve in ways you don't understand, but that can also solve problems we don't yet understand. To me, the history of technology in the United States over the past half century has been one of being overly cautious, not pedal-to-the-metal gung-ho, you know, let's just keep going as fast as possible.
As I think you may remember, Jim, I'm a big advocate for more science spending and more innovation in some fundamental sense across the whole economy, because I think that generates prosperity and jobs. In my previous book, Jump-Starting America, we went through the nuclear history, as you flagged. And I think the key thing there is that at the beginning of that industry, right after World War II, there was over-optimism on the part of the engineers.
The Atomic Energy Commission chair famously promised free electricity, and there was very little discussion about safety. People who raised the issues of safety were shunted to one side, with the result that Three Mile Island, a little bit, and Chernobyl, a lot, came as big shocks to public consciousness about the technology. I'm in favor of more innovation…
I wonder if we've overlearned that lesson, you know? I think we may have overlearned it.
Yes, I think that's quite possibly right. And we are not calling for an end to innovation on AI just because somebody made a movie in which AI takes over the world. Not at all. What we're saying is there are choices: you can go either more towards replacing people (that's automation) or more towards new task creation (that's machine usefulness). And that's not a new thing. That's a very old, thousand-year or maybe longer tension in the history of innovations and how we manage them. And we have an opportunity now, because we're a more conscious, aware, and richer society, to try to pull ourselves through various means — and it might not be tax policy, I'll grant you that, but through various means — towards what we want. And I think what we want is more good jobs. We always want more good jobs, Jim. And we always want to produce useful things. We don't want just to replace people for the sake of replacement.
Evaluating the Biden administration's science policy
Since you brought it up, I'm going to take the opportunity to ask you a final question about some of your other work on trying to create technology hubs across America. It seems those ideas have to some degree made their way into policy during the Biden administration. What do you think of its efforts to spend more on R&D, spread that spending across America, and make sure it's not just Austin and Boston and New York and San Francisco and LA that are areas of great innovation?
In the CHIPS and Science Act, there are two parts: chips and science. The part that we were really advocating for is the science part. And it's exactly what you said, Jim: you spend more on science and spread it around the country. There are a lot of people in this country who are innovative or want to be innovative. There are some really good resources, private sector but also public sector, public universities, for example, in almost every state, where you could have more innovation in a basic knowledge-creation sense. And that can become commercialized, that can become private initiative, that can generate jobs. That's what we are supporting. And I think the Science Act absolutely did internalize that, in part because people learned some hard lessons during COVID, for example.
The CHIPS Act is not what we were advocating for. And it's going to be really interesting to see how that plays out. That's more, I would say, conventional, somewhat old-fashioned industrial policy: pick a sector, back a sector, invest in the sector from the public sector perspective. Chips are of course a really important sector, and the discussion of AI is absolutely part of that. And of course we're also worried, in part because of COVID but also because of the rise of China, about the security of supply chains, including chips produced in, let's say, parts of Asia. I think there are some grounds for that. There are also some issues: how much does it cost to build a state-of-the-art fab and operate it in the US versus Taiwan or South Korea, or even China for that matter?
Those issues need to be confronted and measured. I think it's good that we're having a go. I'm a big believer in more science, more science spending, more responsible deployment of it, and more discussion of how to do that. The chips industrial policy, we'll see. I hope something like it works. It would be quite interesting to pursue further, but we have had some bumps in that road before. This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit fasterplease.substack.com/subscribe
Humanity is poised on a threshold that could be described as a global dark night of the soul. This global crisis is not asking us simply to "make things better" or invent new ways of living. It's demanding that we surrender to a transformation so radical that we become a new variety of the human species. Our assignment is to become sacred activists. Carolyn Baker, Ph.D., was a psychotherapist in private practice and a college professor of psychology and history. Now, through her webinars, podcasts, live workshops, books, and articles, as well as one-on-one life coaching, Carolyn is touching the lives of thousands, assisting them in deeply adapting and becoming resilient in the face of the unprecedented changes confronting humanity. She works closely with Andrew Harvey of the Institute for Sacred Activism. She is the author of Undaunted: Living Fiercely into Climate Meltdown in an Authoritarian World (Apocryphile Press, 2022) and Radical Regeneration: Sacred Activism and the Renewal of the World (co-authored with Andrew Harvey) (Inner Traditions, 2022). Interview Date: 2/17/2023 Tags: MP3, Carolyn Baker, Stephen Jenkinson, H.H. the Dalai Lama, sacred activism, sacred activists, Margaret Wheatley, caterpillar liquefying, rite-of-passage, shelter in place, pandemic, Willis Harman, pessimism, optimism, Paul Levy, infinite possibilities, AI, artificial intelligence, Shoshana Zuboff, Jonathan Harari, fascism, democracy, sacredness, reverence, joy, isolation, eldership, Social Change/Politics, Personal Transformation, Spirituality
Paris Marx is joined by Emily M. Bender to discuss what it means to say that ChatGPT is a "stochastic parrot," why Elon Musk is calling to pause AI development, and how the tech industry uses language to trick us into buying its narratives about technology. Emily M. Bender is a professor in the Department of Linguistics at the University of Washington and the Faculty Director of the Computational Linguistics Master's Program. She's also the director of the Computational Linguistics Laboratory. Follow Emily on Twitter at @emilymbender or on Mastodon at @emilymbender@dair-community.social. Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Follow the podcast (@techwontsaveus) and host Paris Marx (@parismarx) on Twitter, and support the show on Patreon. The podcast is produced by Eric Wickham and part of the Harbinger Media Network. Also mentioned in this episode: Emily was one of the co-authors on the "On the Dangers of Stochastic Parrots" paper and co-wrote the "Octopus Paper" with Alexander Koller. She was also recently profiled in New York Magazine and has written about why policymakers shouldn't fall for the AI hype. The Future of Life Institute put out the "Pause Giant AI Experiments" letter, and the authors of the "Stochastic Parrots" paper responded through the DAIR Institute. Zachary Loeb has written about Joseph Weizenbaum and the ELIZA chatbot. Leslie Kay Jones has researched how Black women use and experience social media. As generative AI is rolled out, many tech companies are firing their AI ethics teams. Emily points to the Algorithmic Justice League and the AI Incident Database. Deborah Raji wrote about data and systemic racism for MIT Tech Review. Books mentioned: Weapons of Math Destruction by Cathy O'Neil, Algorithms of Oppression by Safiya Noble, The Age of Surveillance Capitalism by Shoshana Zuboff, Race After Technology by Ruha Benjamin, Ghost Work by Mary L. Gray & Siddharth Suri, Artificial Unintelligence by Meredith Broussard, Design Justice by Sasha Costanza-Chock, Data Conscience: Algorithmic S1ege on our Hum4n1ty by Brandeis Marshall. Support the show
China fills the headlines everywhere right now. But not with its plans for peace in Ukraine or its charm offensives around Europe - it's TikTok bans, balloons, and possible weapons deliveries to Russia. Why is China's attempt to position itself as a responsible great power going so wrong right now? The originator of the concept of surveillance capitalism, Harvard professor Shoshana Zuboff, has studied the surveillance society since the late 1970s. She is, to put it mildly, critical of the rapidly advancing artificial intelligence that has most recently sent us into the arms of chatbots. Together with Shoshana Zuboff, Udsyn examines how we should respond to AI, which is developing at breakneck speed. Production: Asta Handberg, Tine Linde and Morten Narvedsen. Host: Kirstine Dons Christensen. Sound design: Jonas Johs Andersen. Editor: Tine Møller Sørensen.
If surveillance capitalism permeates all of modern society, how on earth can we step back to think critically about what it may be doing to us? In this episode we think through more of the implications of living in a non-private digital village in the 21st century, but is privacy even a Christian virtue in the first place? We also ponder the implications of the more deceptive and destructive aspects of addictive digital technologies, and think through some initial efforts believers have made to carve out space for family time and spirituality in our disembodied, always-on world. Some extra reading: Surveillance capitalism: the hidden costs of the digital revolution, Jonathan Ebsworth, Samuel Johns, Michael Dodson, Cambridge Papers, June 2021; The Question of Surveillance Capitalism, Nathan Mladin and Stephen Williams, in The Robot Will See You Now: Artificial Intelligence and the Christian Faith, ed. John Wyatt and Stephen Williams, SPCK, 2021; The Age of Surveillance Capitalism, Shoshana Zuboff, Profile Books, 2019; Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence, Kate Crawford, Yale University Press, 2021; Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, Adam Alter, Penguin, 2017; Hooked: How to Build Habit-Forming Products, Nir Eyal, Penguin, 2019; Weapons of Math Destruction, Cathy O'Neil, Penguin, 2017. Subscribe to the Matters of Life and Death podcast: https://pod.link/1509923173 If you want to go deeper into some of the topics we discuss, visit John's website: http://www.johnwyatt.com For more resources to help you explore faith and the big questions, visit: http://www.premierunbelievable.com
The Existential Threat of Big Tech's Predatory Business Models
Every tap, swipe and click we make on our phones, tablets and laptops is being recorded by big tech firms. This is often called surveillance capitalism – a network of products and services we use every day that sucks up large quantities of data about us and then sells it on to advertisers at huge profits. It is garnering increasing concern from citizens and regulators around the world, but should we care as Christians? What impact is this system having on once-flourishing industries such as journalism and bookselling, let alone on us as human beings? And why have tech companies made their products so addictively hard to put down, so hard to stop tapping, swiping and clicking? Some extra reading... Surveillance capitalism: the hidden costs of the digital revolution, Jonathan Ebsworth, Samuel Johns, Michael Dodson, Cambridge Papers, June 2021; The Question of Surveillance Capitalism, Nathan Mladin and Stephen Williams, in The Robot Will See You Now: Artificial Intelligence and the Christian Faith, ed. John Wyatt and Stephen Williams, SPCK, 2021; The Age of Surveillance Capitalism, Shoshana Zuboff, Profile Books, 2019; Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence, Kate Crawford, Yale University Press, 2021; Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, Adam Alter, Penguin, 2017; Hooked: How to Build Habit-Forming Products, Nir Eyal, Penguin, 2019; Weapons of Math Destruction, Cathy O'Neil, Penguin, 2017. Subscribe to the Matters of Life and Death podcast: https://pod.link/1509923173 If you want to go deeper into some of the topics we discuss, visit John's website: http://www.johnwyatt.com For more resources to help you explore faith and the big questions, visit: http://www.premierunbelievable.com
In this episode, Sarah returns to the podcast to help Rich dive deep into a review of VPRO's landmark documentary, Shoshana Zuboff on Surveillance Capitalism. Surveillance capitalism is the core of the business model for today's Big Tech companies like Google and Facebook. It's not about your email address, phone number or Social Security number. Surveillance capitalism is all about collecting voluminous amounts of rich behavioral data on platform users to develop models that predict future behavior. These predictions can in turn be sold to advertisers looking to reach niche markets. Predictive models developed by Google and Facebook are so sophisticated that they can practically guarantee advertiser success. The key here is that Big Tech is selling predictions, not data. In fact, the models are so advanced that they can even be used to manipulate users' behavior. Shoshana Zuboff is a Harvard professor and author of the groundbreaking book The Age of Surveillance Capitalism. She has been researching surveillance capitalism for years and asserts that in order for it to work, it must happen through stealth and obfuscation. In other words, users of platforms like Google Search and Facebook must be so hooked on these products that they don't see how the monetization model works. The more engaged users are, the more behavioral data can be collected, and the more precise those predictive models become. We hope this episode will spur some thinking about what type of personal data is truly valuable to Big Tech and get you to reevaluate your own use of various web-based products. And don't forget, as it has been said many times: "If the product is free, you are the product!" Link to watch the documentary: https://www.youtube.com/watch?v=hIXhnWUmMvw OUR SPONSORS: Anonyome Labs - Makers of MySudo and Sudo Platform. Take back control of your personal data. www.anonyome.com MySudo - The world's only all-in-one privacy app. Communicate and transact securely and privately. Talk, text, email, browse, shop and pay, all from one app. Stay private. www.mysudo.com Sudo Platform - The cloud-based platform companies turn to for seamlessly integrating privacy solutions into their software. Easy-to-use SDKs and APIs for building out your own branded customer apps like password managers, virtual cards, private browsing, identity wallets (decentralized identity), and secure, encrypted communications (e.g., encrypted voice, video, email and messaging). www.sudoplatform.com
Ron Wakkary is a professor of design at Simon Fraser University's School of Interactive Arts and Technology in Canada. He is also a professor, holding the Chair of Design for More Than Human-Centered Worlds, in the industrial design department at Eindhoven University of Technology in the Netherlands. Ron is the founder of the design research studio Everyday Design Studio (EDS). At EDS, he works with Will Odom and an evolving cast of students to produce multi-disciplinary design research that is highly engaged with the practice and craft of design. For UX designers and industrial designers looking for ideas and inspiration from social sciences, humanities, and philosophy executed in design artifacts, the work from EDS is a fantastic resource. Ron recently published the book Things We Could Design: For More Than Human-Centered Worlds via MIT Press. The book packages his research focused on "post-humanist design" rather than human-centered design, bringing non-human stakeholders like nature, climate, and biological diversity into the focus of design methodology. Transcript: https://designdisciplin.com/ron :: Related Links+ Book: Design Research through Practice by Koskinen et al.: https://geni.us/design-research-thr+ Book: Discipline & Punish by Michel Foucault: https://geni.us/discipline-and-punish+ Everyday Design Studio: https://eds.siat.sfu.ca/+ Book: In Praise of Shadows by Junichiro Tanizaki: https://geni.us/in-praise-of-shadows+ Book: Reinventing Organizations by Frederic Laloux: https://geni.us/reinventing-org+ Book: Staying with the Trouble by Donna Haraway: https://geni.us/staying-with-the-troub+ Book: The Age of Surveillance Capitalism by Shoshana Zuboff: https://geni.us/age-of-surveillance+ Book: The Overstory by Richard Powers: https://geni.us/the-overstory+ Book: The Spell of the Sensuous by David Abram: https://geni.us/spell-of-the-sensuous+ Book: Things We Could Design by Ron Wakkary: https://geni.us/things-we-could-design+ Book: Vibrant Matter by Jane Bennett: https://geni.us/vibrant-matter+ Book: What Things Do by Peter-Paul Verbeek: https://geni.us/what-things-doFull list of related links: https://designdisciplin.com/ron :: Connect with Design Disciplin+ Website: http://designdisciplin.com+ Podcast: http://podcast.designdisciplin.com+ Instagram: http://instagram.com/designdisciplin/+ Twitter: http://twitter.com/designdisciplin/+ YouTube: http://youtube.com/designdisciplin:: Connect with Ron+ Twitter: https://twitter.com/ronwakkary+ Everyday Design Studio: http://eds.siat.sfu.ca/:: Episode Bookmarks00:00:00 Intro00:01:26 Ron's Story00:13:35 Research through Design00:18:54 Ron's Practice00:22:26 The Core Message in Ron's Book00:27:30 How To Put the Book in Practice00:34:45 "Designer as Biography / Force / Speaking Subject / Intensities and Origins"00:51:57 The Scope of Design vs. Other Disciplines00:58:50 "Nomadic Practice"01:21:55 Book Recommendations 01:27:00 What's Next for Ron01:33:00 Closing
In this episode, we spoke to Prof Galit Shmueli, Tsing Hua Distinguished Professor at the Institute of Service Science, and Institute Director at the College of Technology Management, National Tsing Hua University. Galit talked with us about the multi-disciplinary work she has done over the years, as well as the differences between statistical models intended for prediction and those intended for explanation. We also discussed causal inference and how it can be used to estimate behaviour modification by the tech giants. We went on to talk about ethics and the complexity of that landscape. Galit's recommended books: 1. The Age of Surveillance Capitalism, Shoshana Zuboff 2. Books on causality: • The Book of Why, Dana Mackenzie and Judea Pearl • Causal Inference in Statistics: A Primer, Judea Pearl, Madelyn Glymour, and Nicholas P. Jewell • Causality, Judea Pearl 3. Mostly Harmless Econometrics: An Empiricist's Companion, Joshua D. Angrist, Jörn-Steffen Pischke
We conclude our discussion of Zuboff's "In the Age of the Smart Machine: The Future of Work and Power" by projecting her conclusions onto the present day. On the one hand, many of her findings about the creative ways that management reasserts its authority are still relevant today; on the other, she also offered strategies for integrating new technologies in ways that would improve both work performance and worker commitment and satisfaction. Would such strategies work today?
Coming soon! We will discuss Shoshana Zuboff's ethnographic study of how work changed with the introduction of information technologies in the 1980s. "In the Age of the Smart Machine" examines how computers changed the meaning of work for both front-line industrial workers and their managers, telling a rich cautionary tale about how these technologies upset the balance of power in the workplace and what managers did about it.
This month, we discuss Shoshana Zuboff's "In the Age of the Smart Machine: The Future of Work and Power," which examines several cases of organizations that introduced information technologies in the workplace hoping to improve organizational performance, transparency, and collaboration, but that instead dehumanized the workplace and ushered in new forms of managerial surveillance. In Part 1, we discuss the major themes of the book, her telling of the histories of both blue- and white-collar work, and her incredible case studies.
In this episode, Eric Hsu and Louis Everuss have an in-depth chat about Shoshana Zuboff's theory of 'surveillance capitalism', which postulates the existence of a new variant of capitalism that significantly involves the digital monitoring of people's behaviours. Eric and Louis mainly base their discussion on Zuboff's 2015 article in the Journal of Information Technology, which explains how capitalism in the contemporary era may be based, in some respects, on a new logic of accumulation. Because there is so much ground to cover, Eric only manages to slip one of his celebrity impersonations into the episode. He tries to do a brief impression of George Takei, leading many listeners to think, 'oh my!'. Music and sound effects for this episode come from various sources and are licensed under the Creative Commons 0 License/the Creative Commons Attribution License 3.0, or are covered by a SFX (Multi-Use) License. Tracks include: https://freesound.org/people/Tuben/sounds/272044/ https://freesound.org/people/funnyman850/sounds/194812/ https://freesound.org/people/JPMusic82/sounds/415511/ The opinions expressed in the Sociology of Everything podcast are those of the hosts and/or guest speakers. They do not reflect the opinions of anyone else at UniSA or the institution at large. The Sociology of Everything podcast | www.sociologypodcast.com
Artist and songwriter Tove Styrke has over the past decade become one of our most celebrated pop artists, both internationally and at home. In 2009 she placed third in Idol, and since then she has made hit after hit. She has toured as the opening act for both Lorde and Katy Perry, her music has been streamed over 350 million times, and she is now out with the album Hard. We talk about her upbringing in Umeå, how her mother sent in her application to Idol, and how that came to take her out into the world. We get into why it is so important to find meaning in what you do, the importance of art becoming communication, the contrast between introversion and extroversion, and much more. Thank you so much for listening! Visit Framgångsakademin. Order "Mitt Framgångsår". Alexander Pärleros's Instagram. The best tips from the episode in the Newsletter. In collaboration with Convendum. Visit Tove Styrke's website. Tove Styrke's Instagram. Shoshana Zuboff - Övervakningskapitalismen. Hosted on Acast. See acast.com/privacy for more information.
Surveillance capitalism has been adopted by the tech giants, by health insurers, and even by car dealerships. And according to Shoshana Zuboff, it is "the economic institution that dominates our times." But what does that model look like? The acclaimed author explains.
From The CIA To Cybersecurity & Hacking Expert. Mr. Elliott is CEO of Comar Cyber, Inc., a Washington, DC-based company that specializes in government and corporate training in cybersecurity. He created the award-winning "Learn by Hacking"™ online cybersecurity course, with distribution deals in the US and Japan. Mr. Elliott served for almost two decades at the CIA, where he worked in the Directorate of Operations as a Case Officer and ops leader in field assignments. He has extensive experience at the intersection of HUMINT operations and technology. As a manager at CIA Headquarters, he worked with companies, investors, and other elements of government to identify, purchase, and create technologies for the CIA's operational use. He used his training and experience in assignments in Europe, South Asia, and Africa, as well as at HQS, to identify and counter nation-state cyber threats and to protect enterprise and operational systems. Mr. Elliott, in coordination with various stakeholders, created the Agency's requirements for operational technology in the foreign field. Interview Questions #1 - What do you see as some of the new and biggest cybersecurity threats at the present moment? #2 - WordPress and open-source software are not as secure as proprietary software; what are your general views on this? #3 - What are some critical things website developers and designers need to understand to make their websites more secure? #4 - We seem to be in an age of surveillance capitalism, as popularized by Shoshana Zuboff. What are your own views on this? #5 - There has been a lot of talk about TikTok being a significant security threat; what are your own views on this? #6 - What are your personal views on Edward Snowden and his revelations connected to the NSA?
In today's episode, Roanne meets with Araz Najarian, who helps leaders, teams, and organisations be purposeful and relevant: both those that are scaling up and need support in creating focus and pacing their growth, and those stalling in their growth and needing to re-frame and re-ignite their innovative spirit. How does she do that, asks Roanne? The short answer is: by being curious, and by sparking people's creativity - yes, even in those who insist they are not at all creative. Araz lives in the Netherlands with her husband and their beagle, and while she loves her work, she also leaves enough space in her life to go on new adventures and learn new skills. This is a conversation about invoking inspiration, daring to connect, daring to ask, and daring to grow. Enjoy! Links mentioned during the podcast: Zen Mind, Beginner's Mind: Informal Talks on Zen Meditation and Practice, Shunryu Suzuki: "In the beginner's mind there are many possibilities, but in the expert's there are few.": www.goodreads.com/book/show/402843…_Beginner_s_Mind Getting Curious with Jonathan Van Ness: podcasts.apple.com/us/podcast/gett…ss/id1068563276 Note to Self with Manoush Zomorodi The Creative Habit: Learn It and Use It for Life, Twyla Tharp: www.goodreads.com/book/show/254799…e_Creative_Habit The Age of Surveillance Capitalism, Shoshana Zuboff: https://en.wikipedia.org/wiki/The_Age_of_Surveillance_Capitalism Solid Project, Tim Berners-Lee: solidproject.org/ And if anyone is interested in some of the insights Araz mentioned from her ELP Network projects and research, they can find them here: www.elpnetwork.com/en/insights www.linkedin.com/in/araznajarian/
Kristen and Kyla are joined by activist Robert Miller to discuss Cory Doctorow's book "How to Destroy Surveillance Capitalism". For extra credit, Kyla and Kristen also tried to read Shoshana Zuboff's "The Age of Surveillance Capitalism", as Doctorow wrote his piece in large part as a response to Zuboff's. Topics: what is surveillance capitalism; what are the harms of surveillance capitalism; what would a world without surveillance capitalism look like. Robbie encourages listeners to support BSA (Black Socialists in America) - https://blacksocialists.us/dual-power-map Robbie would also like everyone to read more Murray Bookchin - https://theanarchistlibrary.org/category/author/murray-bookchin Leave us a voicemail! https://podinbox.com/pullback Website: https://www.pullback.org/episode-notes/episode83 Harbinger Media Network: https://harbingermedianetwork.com/ Twitter: https://twitter.com/PullbackPodcast Instagram: https://www.instagram.com/pullbackpodcast/?igshid=i57wwo16tjko Facebook: https://www.facebook.com/PullbackPodcast/ Read "How to Destroy Surveillance Capitalism" here: https://onezero.medium.com/how-to-destroy-surveillance-capitalism-8135e6744d59 Pullback is produced and hosted by Kristen Pue and Kyla Hewson. Logo by Rachel Beyer and Evan Vrinten.
Elon Musk's bid to buy Twitter has sparked heated discussion, and concerns over what that might mean for the power held by big tech companies, and their impact on democracy. We talk to William D. Cohan, bestselling author of a number of books on high finance intrigue; and Shoshana Zuboff, a retired Harvard Business School professor.
In Episode 4 of Series 7 of The Rights Track, Todd is in conversation with Sam Gilbert, an entrepreneur and affiliated researcher at the Bennett Institute for Public Policy at the University of Cambridge. Sam works on the intersection of politics and technology. His recent book – Good Data: An Optimist's Guide to Our Future – explores the different ways data helps us, suggesting that "the data revolution could be the best thing that ever happened to us".

Transcript

Todd Landman 0:01
Welcome to The Rights Track podcast, which gets the hard facts about the human rights challenges facing us today. In Series 7, we're discussing human rights in a digital world. I'm Todd Landman, and in the fourth episode of this series I'm delighted to be joined by Sam Gilbert. Sam is an entrepreneur and affiliated researcher at the Bennett Institute for Public Policy at the University of Cambridge, working on the intersection of politics and technology. His recent book, Good Data: An Optimist's Guide to Our Future, explores the different ways data helps us, suggesting the data revolution could be the best thing that ever happened to us. And today, we're asking him: what makes data good? So Sam, welcome to this episode of The Rights Track.

Sam Gilbert 0:41
Todd, thanks so much for having me on.

Todd Landman 0:44
So I want to start really with the book around Good Data. And I'm going to start, I suppose, with the negative perception first, and then you can make the argument for a more optimistic assessment. And this is the opening set of passages you have in the book around surveillance capitalism. Could you explain to us what surveillance capitalism is and what it means?

Sam Gilbert 1:01
Sure. So surveillance capitalism is a concept that's been popularised by the Harvard Business School professor Shoshana Zuboff. And essentially, it's a critique of the power that big tech companies like Google and Facebook have. And what it says is that that power is based on data about us that they accumulate as we live our lives online and, by doing that, produce data, which they collect, analyse, and then sell to advertisers. And for proponents of surveillance capitalism theory, there's something sort of fundamentally illegitimate about that, in terms of the way that it, as they would see it, appropriates data from individuals for private gain on the part of tech companies. I think they would also say that it infringes individuals' rights in a more fundamental way, by subjecting them to surveillance. So that, I would say, is surveillance capitalism in a nutshell.

Todd Landman 2:07
Okay. So to give you a concrete example: if I'm searching for a flannel shirt from Cotton Traders on Google, the next day I open up my Facebook and I start to see ads for Cotton Traders on my Facebook feed; or if I go on to CNN, suddenly I see an ad for another product that I might have been searching for on Google. Is that the sort of thing she's talking about in this concept?

Sam Gilbert 2:29
Yes, that's certainly one dimension to it. So that example that you just gave is an example of something that's called behavioural retargeting. So this is when data about things you've searched for, or places you've visited on the internet, are used to remind you about products or services that you've browsed. So I guess this is probably the most straightforward type of what surveillance capitalism theorists would call surveillance advertising.

Todd Landman 2:57
Yeah, I understand that, Sam, but you know, when I'm searching for things inside Amazon.
And they say, you bought this; other people who bought this might like that; have you thought about, you know, getting this as well? But this is actually between platforms. I might do a Google search one day, and then on Facebook or another platform I see that same product being suggested to me. So how does the data cross platforms? Are they selling data to each other? Is that how that works?

Sam Gilbert 3:22
So there's a variety of different technical mechanisms. So without wanting to get too much into the jargon of the ad-tech world, there are all kinds of platforms which put together data from different sources and then, in a programmatic or automated way, allow advertisers the opportunity to bid in an auction for the right to target people who the data suggests are interested in particular products. So it's quite a complex ecosystem. I think maybe one of the things that gets lost a little bit in the discussion is some of the differences between the ways in which big tech companies like Facebook and Google and Amazon use data inside their own platforms, and the ways in which data flows out from those platforms and into the wider digital ecosystem. I guess maybe just to add one more thing about that: I think probably many people would have a hard time thinking of something as straightforward as being retargeted with a product that they've already browsed for as surveillance, or seeing it as particularly problematic. I think where it gets a bit more controversial is where this enormous volume of data can have machine learning algorithms applied to it, in order to make predictions about products or services that people might be interested in as consumers that they themselves haven't even really considered. I think that's where critics of what they would call surveillance capitalism have a bigger problem with what's going on.

Todd Landman 4:58
No, I understand, that's a great explanation. Thank you. And I guess, just to round out this set of questions, it sounds to me like there's a tendency for accumulated value and expenditure here that is really creating monopolies and cartels. To what degree is the language of monopoly and cartel being used? Because we rattle off the main platforms we use, but we use those because they have become so very big. And being a new platform, how does a new platform cut into that ecosystem? Because it feels like it's dominated by some really big players.

Sam Gilbert 5:32
Yes. So I think this is a very important and quite complicated area. So it is certainly the case that a lot of Silicon Valley tech companies have deliberately pursued a strategy of trying to gain a monopoly. In fact, it might even be said that that's sort of inherent to the venture-capital-driven start-up business model: to try and dominate a particular market space. But I suppose the sense in which some of these companies, let's take Facebook as an example, are monopolies is really not so related to the way in which they monetise data or to their business model. So Facebook might reasonably be said to be a monopolist of encrypted messaging, because literally billions of people use Facebook's platform to communicate with each other. But it isn't really a monopolist of advertising space, because there are so many other alternatives available to advertisers who want to promote their products.
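To make Gilbert's description of programmatic advertising concrete, here is a minimal sketch of a second-price ad auction driven by behavioural segments. Everything in it is hypothetical - the segment names, campaigns, and prices are invented for illustration, and no real ad exchange's API is used:

```python
# Minimal sketch of a programmatic ad auction of the kind described above.
# Hypothetical names and data throughout -- no real ad-tech API is used.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str
    amount: float  # bid in dollars per impression

def run_second_price_auction(user_segments: set[str],
                             campaigns: dict[str, tuple[set[str], float]]):
    """Each campaign bids only if the user's behavioural segments overlap its
    targets. The highest bidder wins but pays the second-highest bid, a common
    auction design in ad exchanges."""
    bids = [
        Bid(name, amount)
        for name, (targets, amount) in campaigns.items()
        if user_segments & targets  # segment overlap = "the data suggests interest"
    ]
    if not bids:
        return None
    bids.sort(key=lambda b: b.amount, reverse=True)
    winner = bids[0]
    price = bids[1].amount if len(bids) > 1 else winner.amount
    return winner.advertiser, price

# A user whose browsing history put them in two hypothetical segments:
user = {"browsed:flannel_shirts", "interest:outdoor"}
campaigns = {
    "CottonTradersRetarget": ({"browsed:flannel_shirts"}, 2.50),
    "GenericApparel": ({"interest:outdoor"}, 1.10),
    "CarInsurance": ({"interest:finance"}, 3.00),  # no overlap, never bids
}
print(run_second_price_auction(user, campaigns))  # ('CottonTradersRetarget', 1.1)
```

The design point the sketch illustrates is the one Gilbert makes: a campaign only bids when the user's accumulated data suggests interest, so the behavioural profile, not the page being viewed, decides which ads compete.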
I guess another dimension to this is the fact that although there are unquestionably concentrations of power with the big tech companies, they also provide somewhat of a useful service to the wider market, in that they allow smaller businesses to acquire customers much more effectively. So that actually militates against monopoly. Because now, in the current digital-advertising-powered world, not every business has to be so big and so rich in terms of capital that it can afford to do things like TV advertising. The platforms that Facebook and Google provide are also really helpful to small businesses that want to grow and compete with bigger players.

Todd Landman 7:15
Yeah, now I hear you shifting into the positive turn here. So I'm going to push you on this. So what is good data? And why are you an optimist about the good data elements to the work you've been doing?

Sam Gilbert 7:27
Well, for me, when I talk about good data, what I'm really talking about is the positive public and social potential of data. And that really comes from my own professional experience. Because although at the moment I spend most of my time researching and writing about these issues of data and digital technology, actually my background is in the commercial sector. So I spent 18 years working in product and strategy and marketing roles, particularly in financial services: at the data company Experian, and in a venture-backed fintech business called Bought By Many. And I learnt a lot about the ways in which data can be used to make businesses successful. And I learned a lot of techniques that, in general, at the moment, are only really put to use to achieve quite banal goals: for example, to sell people more trainers, or to encourage them to buy more insurance products. And so one of the things that I'm really interested in is how some of those techniques and technologies can move across from the commercial sector into the public sector and the third sector, and be put to work in ways that are more socially beneficial. So maybe just to give one example: a type of data that I think contains huge potential for public good is search data. So this is the data set that is produced by all of us using Google and Bing and other search engines on a daily basis. Now, ordinarily, when this data is used, it is to do banal things like target shoes more effectively. But there is also this emerging discipline called infodemiology, where academic researchers use search data in response to public health challenges. So one great example of that at the moment has been work by Bill Lampos at University College London and his team, where they've built a predictive model around COVID symptoms using search data. And that model actually predicts new outbreaks 17 days faster than conventional modes of epidemiological surveillance. So that's just one example of the sort of good I believe data can bring.

Todd Landman 9:50
So it's a really interesting example of an early warning system, and it could work not only for public health emergencies, but for other emerging emergencies, whether they be conflict, natural disasters, or any topic that people are searching for. Is that correct?

Sam Gilbert 10:05
Yes, that's right. I mean, it's not just in the public health field that researchers have used this. You just put me in mind, actually, Todd, of a really interesting paper written by some scholars in Japan who are looking at citizens' decision-making in response to natural disaster warnings.
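The infodemiology idea Gilbert describes - using search behaviour as a leading indicator of outbreaks - can be sketched in a few lines. This is emphatically not the UCL team's actual model; it is a toy regression on made-up data, with the 17-day lead time borrowed from Gilbert's description as an assumption:

```python
# Toy illustration of the infodemiology idea described above: using lagged
# symptom-search volumes to anticipate case counts. NOT the UCL team's
# actual model -- just a minimal sketch with synthetic data.
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(0)
days = 120
lag = 17  # assumption: searches today foreshadow cases ~17 days out

# Hypothetical daily search volume for symptom terms, plus noise:
searches = 100 + 30 * np.sin(np.arange(days) / 9.0) + rng.normal(0, 5, days)
# Hypothetical case counts that follow the search signal with a lag:
cases = 0.8 * np.roll(searches, lag) + rng.normal(0, 4, days)

# Fit cases[t] ~ a * searches[t - lag] + b on the first 100 days:
X = np.column_stack([searches[:100 - lag], np.ones(100 - lag)])
y = cases[lag:100]
(coef, intercept), *_ = lstsq(X, y, rcond=None)

# Today's searches now give an early-warning estimate of cases 17 days ahead:
forecast = coef * searches[100] + intercept
print(f"forecast for day {100 + lag}: {forecast:.1f} "
      f"vs actual {cases[100 + lag]:.1f}")
```

The point of the sketch is the structure, not the statistics: because the search signal leads the case signal, a model fitted on past pairs can be queried with today's searches to warn about outbreaks before conventional surveillance sees them.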
So floods and earthquakes, and the migration patterns they trigger, I guess, would be the way of summarising it. Those are things that can also be detected using search data.

Todd Landman 10:31
Well, that's absolutely fascinating. So let's go back to public health then. I was just reading a new book out called Pandemocracy in Europe: Power, Parliaments and People in Times of COVID, edited by Matthias Kettemann and Konrad Lachmayer. And there's a really fascinating chapter in this book that transcends the nation state, if you will. It talks about platforms and pandemics, and one section of the chapter starts to analyse Facebook, Twitter, YouTube, and Telegram on the degree to which they were able to control and/or filter information versus disinformation or misinformation. And the scale of some of this stuff is quite fascinating. So, you know, Facebook has 2.7 billion daily users, it's probably a bigger number now, and 22.3% of their investigated Facebook posts contained misinformation about COVID-19. And they found that the scale of misinformation was so large that they had to move to AI solutions, with some human supervision of those AI solutions. But what's your take on the role of these big companies we've been talking about, Facebook, Twitter, YouTube, Telegram, and their ability to control the narrative and at least provide safe sources of information, let's say in times of COVID, but there may be other issues of public interest where they have a role to play?

Sam Gilbert 11:57
Yes, I think this is such an important question. It's very interesting that you use the phrase "control the narrative", because of course that is something that big tech companies have traditionally been extremely reluctant to do. And one of the things I explore a bit in my book is the extent to which this can really be traced back to some unexamined normative assumptions on the part of tech company executives, where they think that American norms of free speech, and the free speech protections of the First Amendment, are sort of universal laws that are applicable everywhere, rather than things which are culturally and historically contingent. And for that reason, they have been extremely reluctant to do any controlling of the narrative, and have tended to champion free speech over the alternative course of action that they might take, which is to be much more proactive in combating harms, including but not limited to misinformation. I think this probably also speaks to another problem that I'm very interested in in the book, which is what we are concerned about when we say we're concerned about big tech companies' power. Because I think ordinarily the discussion about big tech companies' power tends to focus on their concentrations of market power, or, in the case of surveillance capitalism theory, it concentrates on the theoretical power that algorithms have over individuals and their decision-making. And what gets lost a bit in that is the extent to which tech companies, by providing these platforms and these technologies, actually empower other people to do things that weren't possible before. So in some work I've been doing with Amanda Greene, who's a philosopher at University College London, we've been thinking about that concept of empowering power, as we call it. And as far as we're concerned, that's actually a much more morally concerning aspect of the power of big tech companies than their market position.

Todd Landman 14:11
Yeah.
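The moderation set-up Landman summarises - AI screening at scale, with human supervision - amounts to a confidence-threshold routing pipeline. A minimal sketch, with entirely hypothetical thresholds and scores standing in for a trained classifier:

```python
# Minimal sketch of human-in-the-loop moderation as described above:
# a classifier screens posts at scale, and only low-confidence cases
# are routed to human reviewers. Thresholds and scores are hypothetical.

def route_post(text: str, misinfo_probability: float) -> str:
    """Decide what to do with a post given a classifier's score.

    In a real system misinfo_probability would come from a trained model;
    here it is passed in so the routing logic stays self-contained."""
    if misinfo_probability >= 0.95:
        return "auto-label"    # confident enough to act without a human
    if misinfo_probability >= 0.60:
        return "human-review"  # uncertain band: escalate to a person
    return "allow"             # classifier sees no problem

queue = [
    ("Vaccines contain microchips", 0.98),
    ("New study questions mask guidance", 0.72),
    ("Local clinic opens Saturday", 0.03),
]
for text, score in queue:
    print(f"{route_post(text, score):>12}: {text}")
```

The uncertain middle band is where the human supervision Landman mentions comes in: the classifier absorbs the sheer volume, and people adjudicate only the cases it cannot call confidently.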
So I like it that you cite the First Amendment of the American Constitution, but interestingly, the international framework for the protection and promotion of human rights also has very strong articles around protection of free speech, free assembly, and free association, which of course the tech companies will be interested in looking at and reviewing. But what it raises, I believe, is really a question around the public regulation of private actors. Because these are private actors; they're not subject to international human rights law in the way that states are. And yet they're having an impact on mass publics. They're having an impact on politics. They're having an impact on debate. So perhaps I misspoke by saying "control the narrative". What I'm really interested in is that we seem to have lost mediation. We have unmediated access to information. And it seems to me that it's incumbent upon these organisations to provide some kind of mediation of content, because not all things are true just because they're said. So it gets back to that question: where's the boundary for them? When will they step in and say, this is actually causing harm? Is there some sort of big-tech Hippocratic oath about doing no harm that needs to be developed, so that there is at least some kind of attempt to draw a boundary around what is shared and what is not shared?

Sam Gilbert 15:34
Yes, so the idea of a Hippocratic oath for tech workers is definitely out there; the writer who has explored it more than I have is James Williams in his book Stand Out of Our Light. I think that is certainly something that would help. I also think that it is beneficial that at the moment we're having more discussion about data ethics and the ethics of artificial intelligence, and that that is permeating some of the tech companies. So I think more ethical reflection on the part of tech executives and tech workers is to be welcomed. I don't think that's sufficient, though, and I do think it's important that we have stronger regulation of the tech sector. And I suppose from my perspective, the thing that needs to be regulated, much more than anything to do with how data is collected or how data is used in advertising, is what is sometimes referred to as online safety, or other times as online harms. So that is anything that gives rise to individuals being at risk of being harmed as they live their lives online. There's actually legislation coming through in the UK at the moment called the Online Safety Bill, which is far from perfect legislation, but in my opinion it's directionally right, because it is more concerned with preventing harm, and giving tech companies a responsibility for playing their part in that, than it is with trying to regulate data or advertising.

Todd Landman 17:13
Yeah, so it's really the results of activity that it's trying to address, rather than the data that drives the activity, if I could put it that way. So if we think about this do-no-harm element, the mediating function that's required at least to get trusted information available to users: I wonder if we could pivot a little bit to the current crisis in Ukraine, because I've noticed on social media platforms a number of sites have popped up saying, we're a trusted source for reporting on the current conflict, and they get a sort of kite mark or a tick for that. I've also seen users saying, don't believe everything you see being tweeted out from Ukraine.
So where does this take us, not only with COVID, but with something as real-time, active, and horrific as conflict in a country? We can talk about Ukraine, or other conflicts, and the sharing of information on social media platforms.

Sam Gilbert 18:08
Yes, well, this is a very difficult question, and unfortunately I don't have the answer for you today. I guess what I would point to is something you touched on there, Todd, which is the idea of mediation. And we have been through this period with social media where the organisations, the institutions that we traditionally relied on to tell us what was true and what was false, and to sort fact from fiction, have been disintermediated. Or, in some cases, they have found themselves trying to compete in this very different information environment, one that is much more dynamic in a way that actually ends up undermining the journalistic quality that we would otherwise expect from them. So this is not a very satisfactory answer, because I don't know what can be done about it, except that it is a very serious problem. I suppose just to make one final point that I've been reminded of, reading stories on this topic in relation to the Ukraine crisis: there is a duality to this power that tech companies and technology have given to ordinary users in the era of social media over the last 15 years or so. If we were to rewind the clock to 2010 or 2011, the role of Twitter and Facebook and other technology platforms in enabling protest and resistance against repressive regimes was being celebrated. If we then roll forward a few years and look at a terrible case like the ethnic cleansing of the Rohingya people in Myanmar, we are at the complete opposite end of the spectrum, where the empowerment of users with technology has had disastrous consequences. And I guess if we then roll forward again to the Ukraine crisis, it's still not really clear whether the technology is having a beneficial or detrimental effect. So this is really just to say, once again: when we think about the power of tech companies, these are the questions I think we need to be grappling with, rather than questions to do with data.

Todd Landman 20:31
Sure. There was a great book years ago called The Logic of Connective Action, and it was really looking at the way in which these emerging platforms lowered collective action costs, whether for protest movements or anti-authoritarian movements. We did a piece of work years ago with someone from the German Development Institute on the role of Facebook in opposition to the Ben Ali regime in Tunisia: Facebook allowed people to make a judgement as to whether they should go to a protest or not, based on the number of people who said they were going, and so it lowered the cost of participation, or at least the calculated cost of participating in those things. But as you say, we're now seeing this technology being used on a daily basis. I watch drone footage every day of tanks being blown up, of buildings being destroyed. And part of my mind wonders, is it real, what I'm watching? And then another part of my mind thinks about, what's the impact of this? Does this have an impact on the morale of the people involved in the conflict?
Does it change the narrative, if you will, about the progress, and/or lack of progress, in the conflict? And then, of course, there's the multiple reporting of whether there are going to be peace talks, humanitarian corridors, and all this other stuff. So it does raise very serious questions about authenticity, veracity, and the ways in which technology could verify what we're seeing. And of course, you have time-date stamps, metadata, and other things that tell you whether something was definitely geolocated. So are these companies doing that kind of work? Are they going in and digging into the metadata? I noticed that Maxar Technologies, for example, is being used for its satellite data extensively, looking at the build-up of forces and the movement of troops and that sort of thing. But again, that's a private company making things available in the public sphere for people to reach judgements, and for media companies to use. It's an incredible ecosystem of information, and it seems a bit like a wild west to me, in terms of what we believe, what we don't believe, and the uses that can be made of this imagery and commentary.

Sam Gilbert 22:32
Yes, so there is, as in all things, this super-proliferation of data. And what is still missing is the intermediation layer to both make sense of that and tell stories around it that have some kind of journalistic integrity. What you put me in mind of there, Todd, was the open-source intelligence community, and some of the work that organisations, including human rights organisations, do to leverage these different data sources to validate and investigate human rights abuses taking place in different parts of the world. To me this seems like very important work, but also work that is rather underfunded. I might make the same comment about fact-checking organisations, which seem to do very important work in the context of disinformation, but don't seem to be resourced in the way that perhaps they should be. Maybe just one final comment on this topic would relate to the media literacy, the social media literacy, of individuals. And I wonder whether that is something that is maybe going to help us in trying to get out of this impasse. Because I think, over time, people are becoming more aware that information they see on the internet may not be reliable. And while I think there's still a tendency for people to get caught up in the moment and retweet or otherwise amplify these types of messages, I think that some of the small changes the technology companies have made to encourage people to be more mindful when they're engaging with and amplifying content might just help build on top of that increase in media literacy, and take us to a slightly better place in the future.

Todd Landman 24:26
Yeah, the whole thing around media literacy is really important. And I also want to make a small plea for data literacy: just understanding and appreciating what data and statistics can tell us, without having to be an absolute epidemiologist, statistician, or quantitative analyst. But I wanted to hark back to your idea around human rights investigations. We will have a future episode with a group that does just that; it's about maintaining the chain of evidence, corroborating evidence, and using digital evidence in ways that help human rights investigations. And, you know, if and when this conflict in Ukraine finishes, there will be some sort of human rights investigatory process.
We're not sure which body is going to do that yet; there have been calls for, you know, a Nuremberg-style trial, there have been calls for the ICC to be involved, and for many other stakeholders to be involved, but that digital evidence is going to be very much part of the record. But I wonder, just to... yeah, go ahead, Sam.

Sam Gilbert 25:26
Sorry, I'm just going to add one thing on that, which I touched on a little bit in my book. I think there's a real risk, actually, that open-source intelligence investigations become collateral damage in the tech companies' pivot towards privacy. So what some investigators are finding is that material they rely on to be able to do their investigations is being unilaterally removed by tech companies, either because it's YouTube, and they don't want to be accused of promoting terrorist content, or because it's Google or Facebook, and they don't want to be accused of infringing individuals' privacy. So while this is not straightforward, I just think it's worth bearing in mind that sometimes pushing very hard for values like data privacy can have these unintended consequences in terms of open-source intelligence.

Todd Landman 26:24
Yes, it's an age-old chestnut about the unintended consequences of purposive social action; I think it was Robert Merton who said that at one point. But I guess, in closing, I have a final question for you, because you are an optimist. You're a data optimist, and you've written a book called Good Data. So what is there to be optimistic about for the future?

Sam Gilbert 26:42
Well, I suppose I should say something about what type of optimist I am first. So to do that, I'll reach for Paul Romer's distinction between blind optimism and conditional optimism. Blind optimism is the optimism of a child hoping that her parents are going to build her a tree house. Conditional optimism is the optimism of a child who thinks: well, if I can get the tools, and if I can get a few friends together, and if we can find the right tree, I think we can build a really incredible tree house together. So I'm very much in the second camp, the camp of conditional optimism. And I guess the basis for that probably goes back to some of the things we've touched on already, where I just see enormous amounts of untapped potential in using data in ways that are socially useful. So perhaps just to bring in one more example: Opportunity Insights, the group at Harvard run by Raj Chetty, has produced some incredibly useful insights into social mobility and economic inequality in America by using de-identified tax record data to understand, over a long period of time, the differences in people's incomes. And I really think that that type of work is just the tip of the iceberg when it comes to this enormous proliferation of data that is out there. So I think, if the data can be made available to researchers, and also to private organisations, in a way that as far as possible mitigates the risks that do exist to people's privacy, there's no knowing quite how many scientific breakthroughs, or advances in human and social understanding, we might be able to get to.

Todd Landman 28:52
Amazing. And I guess, to your conditional optimism, I would add my own category, which is cautious optimism; that's what I am.
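Gilbert's Opportunity Insights example turns on analysing de-identified records in aggregate. A toy sketch of that idea follows, assuming invented data and a simple small-cell suppression rule; this is not the group's actual pipeline:

```python
# Toy sketch of de-identified, aggregated analysis in the spirit of the
# Opportunity Insights example above (not their actual pipeline). Records
# carry no names, and small groups are suppressed so that no individual
# can be singled out. All data here is made up.
from collections import defaultdict
from statistics import mean

MIN_CELL_SIZE = 3  # suppress any group smaller than this

records = [  # (region, parent_income_bracket, child_income)
    ("North", "low", 31_000), ("North", "low", 28_500), ("North", "low", 35_200),
    ("North", "high", 74_000), ("North", "high", 81_300), ("North", "high", 69_800),
    ("South", "low", 27_900), ("South", "low", 30_100), ("South", "low", 26_400),
    ("South", "high", 66_000),  # only one record: will be suppressed
]

groups = defaultdict(list)
for region, bracket, income in records:
    groups[(region, bracket)].append(income)

for key, incomes in sorted(groups.items()):
    if len(incomes) < MIN_CELL_SIZE:
        print(f"{key}: suppressed (n < {MIN_CELL_SIZE})")
    else:
        print(f"{key}: n={len(incomes)}, mean child income {mean(incomes):,.0f}")
```

The macro patterns survive the aggregation (the income gap between brackets is visible in the group means), which is exactly the trade-off Gilbert points to: useful social insight with the individual-level detail withheld.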
But talking to you today really does provide deep insight into the many, many different and complex issues here. And that last point you made about de-identified data used for good purposes, shining a light on things that are characterising our society with a view to being able to do something about them: you see things that you wouldn't see before, and that's one of the virtues of good data analysis. You end up revealing macro patterns, inconsistencies, inequalities, and other things that can then feed into the policymaking process to try to make the world a better place, and human rights are no exception to that agenda. So for now, Sam, I just want to thank you so much for coming on to this episode and sharing all these incredible insights and the work that you've done. Thank you.

Chris Garrington 29:49
Thanks for listening to this episode of The Rights Track, which was presented by Todd Landman and produced by Chris Garrington of Research Podcasts with funding from 3DI. You can find a detailed transcript on the website at www.RightsTrack.org. And don't forget to subscribe wherever you listen to your podcasts to access future and earlier episodes.

Further reading and resources:
Sam Gilbert (2021) Good Data: An Optimist's Guide to Our Digital Future.
Bill Lampos's COVID infodemiology: Lampos, V., Majumder, M.S., Yom-Tov, E. et al. (2021) "Tracking COVID-19 using online search".
Japan/natural disasters paper: "Predicting Evacuation Decisions using Representations of Individuals' Pre-Disaster Web Search Behavior" (arXiv:1906.07770).
On "empowering power": Greene, Amanda and Gilbert, Samuel J. (2021) "More Data, More Power? Towards a Theory of Digital Legitimacy".
On the Hippocratic oath for tech workers: James Williams (2018) Stand Out of Our Light: Freedom and Resistance in the Attention Economy.
Matthias C. Kettemann and Konrad Lachmayer (eds.) (2022) Pandemocracy in Europe: Power, Parliaments and People in Times of COVID-19.
W. Lance Bennett and Alexandra Segerberg (2013) The Logic of Connective Action: Digital Media and the Personalization of Contentious Politics.
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Book review: The Age of Surveillance Capitalism, published by Richard Ngo on February 14, 2022 on LessWrong.

I recently finished Shoshana Zuboff's book The Age of Surveillance Capitalism. It's received glowing reviews, but left me disappointed. Zuboff spends much of the book outraged at the behaviour of big tech corporations, but often neglects to explain what's actually bad about either the behaviour itself or the outcomes she warns it'll lead to. The result is far more polemical than persuasive. I do believe that there are significant problems with the technology industry - but mostly different problems from the ones she focuses on. And she neglects to account for the benefits of technology, or explain how we should weigh them against the harms.

Her argument proceeds in three stages, which I'll address in turn:
1. Companies like Google and Facebook have an "extraction imperative" to continually "expropriate" more personal data about their users.
2. They use this for "the instrumentation and instrumentalisation of behaviour for the purposes of modification, prediction, monetisation, and control."
3. Ultimately, this will lead to "a form of tyranny" comparable to (but quite different from) totalitarianism, which Zuboff calls instrumentarianism.

On data: I agree that big companies collect a lot of data about their users. That's a well-known fact. In return, those users get access to a wide variety of high-quality software for free. I, for one, would pay thousands of dollars if necessary to continue using the digital products that are currently free because they're funded by advertising. So what makes the collection of my data "extraction", or "appropriation", as opposed to a fair exchange? Why does it "abandon long-standing organic reciprocities with people"? It's hard to say. Here's Zuboff's explanation:

Industrial capitalism transformed nature's raw materials into commodities, and surveillance capitalism lays its claims to the stuff of human nature for a new commodity invention. Now it is human nature that is scraped, torn, and taken for another century's market project. It is obscene to suppose that this harm can be reduced to the obvious fact that users receive no fee for the raw material they supply. That critique is a feat of misdirection that would use a pricing mechanism to institutionalise and therefore legitimate the extraction of human behaviour for manufacturing and sale. It ignores the key point that the essence of the exploitation here is the rendering of our lives as behavioural data for the sake of others' improved control over us. The remarkable questions here concern the facts that our lives are rendered as behavioural data in the first place; that ignorance is a condition of this ubiquitous rendition; that decision rights vanish before one even knows that there is a decision to make; that there are consequences to this diminishment of rights that we can neither see nor tell; that there is no exit, no voice, and no loyalty, only helplessness, resignation, and psychic numbing.

This is fiery prose; but it's not really an argument. In more prosaic terms, websites are using my data to serve me ads which I'm more likely to click on. Often they do so by showing me products which I'm more interested in, which I actively prefer compared with seeing ads that are irrelevant to me.
This form of “prediction and control” is on par with any other business “predicting and controlling” my purchases by offering me better products; there's nothing “intrinsically exploitative” about it. Now, there are other types of prediction and control - such as the proliferation of worryingly addictive newsfeeds and games. But surprisingly, Zuboff talks very little about the harmful consequences of online addiction! Instead she argues that the behaviour of tech companies is wrong for intrin...
Harvard professor Shoshana Zuboff has written a monumental and alarming book about the new economic order. "The Age of Surveillance Capitalism" reveals how the biggest tech companies deal with our data. How do we regain control of our data? What is surveillance capitalism? In this documentary, Zuboff takes the lid off Google and Facebook and reveals a merciless form of capitalism in which not natural resources but the citizen itself serves as the raw material. How can citizens regain control of their data? It is 2000, and the dot-com crisis has caused deep wounds. How will the startup Google survive the bursting of the internet bubble? Founders Larry Page and Sergey Brin no longer know how to turn the tide. By chance, Google discovers that the "residual data" people leave behind in their internet searches is very valuable and tradable.
Harvard professor Shoshana Zuboff explains how Google has secretly invaded your home.
Harvard professor Shoshana Zuboff has written a monumental and alarming book about the new economic order. "The Age of Surveillance Capitalism" reveals how the biggest tech companies deal with our data. How do we regain control of our data? What is surveillance capitalism? In this episode, Zuboff takes the lid off Google and Facebook and reveals a merciless form of capitalism in which not natural resources but the citizen itself serves as the raw material. How can citizens regain control of their data?
Erich begins a short series on Shoshana Zuboff's book, "The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power". In this particular episode Erich explains what Surveillance Capitalism is and why it's important for us to understand. antivisions.com --- Support this podcast: https://anchor.fm/antivisions/support
Lina al-Hathloul, sister of Saudi women's rights activist Loujain al-Hathloul, joins Christiane Amanpour to discuss her sister's release from prison and her views on Crown Prince Mohammed bin Salman. She argues he is not a reformer and that in Saudi Arabia, "activism is considered terrorism". We look at the history behind vaccine hesitancy among minorities with historian David Olusoga. He explains how he's campaigning to get minority communities in the UK to take the vaccine and why that hesitancy exists in Britain and beyond. Then U.S. National Security Adviser Jake Sullivan talks about reviving the Iran deal, the SolarWinds hack, troops in Afghanistan, and relations with Saudi Arabia. Turning to big tech, our Hari Sreenivasan speaks to Shoshana Zuboff, author of "The Age of Surveillance Capitalism," about the information coup the tech companies are waging through data collection. To learn more about how CNN protects listener privacy, visit cnn.com/privacy
Facebook has already been accused of spreading lies and polarizing society. Now, the federal government says it illegally crushed competition. On this week's On the Media, how to roll back a global power that has transformed our economy and warped our democracy. 1. Dina Srinivasan [@DinaSrinivasan], author of the 2019 paper "The Antitrust Case Against Facebook," on digital-age interpretations of the Sherman Antitrust Act. 2. Carole Cadwalladr [@carolecadwalla], journalist for The Guardian and The Observer, on the harms of Facebook unaddressed by both antitrust law and the company's own attempts at self-regulation. 3. Shoshana Zuboff [@shoshanazuboff], professor emeritus at Harvard Business School and author of The Age of Surveillance Capitalism, on the data extraction and human futures markets that comprise much of our economy. Music: Joeira by Kurup; Capernaum by Khaled Mouzanar; Okami by Nicola Cruz; Peer Gynt Suite No. 1 by Edvard Grieg. On the Media is supported by listeners like you. Support OTM by donating today (https://pledge.wnyc.org/support/otm). Follow our show on Instagram, Twitter and Facebook @onthemedia, and share your thoughts with us by emailing onthemedia@wnyc.org.