Podcasts about Algorithmic

  • 901 PODCASTS
  • 1,446 EPISODES
  • 39m AVG DURATION
  • 5 WEEKLY NEW EPISODES
  • Feb 27, 2026 LATEST


Best podcasts about Algorithmic


Latest podcast episodes about Algorithmic

Damn Interesting Week
BONUS Episode #36: Cold Hard Cash

Damn Interesting Week

Play Episode Listen Later Feb 27, 2026 41:44


Margarine money grab, Crowdsourced landlords, Liquidity at the laundromat, Banana bills, Kosovo crackdowns, Expensive password recovery, Algorithmic landlord, Cash comparisons. Jennifer, Angie, Way, and Bradley discuss a variety of curated links from the archives. Please consider supporting this ad-free content on Patreon.

HBS Managing the Future of Work
Executive recruiting: Tom Monahan on algorithmic power brokers and adaptability

HBS Managing the Future of Work

Play Episode Listen Later Feb 25, 2026 34:38


Corner office churn is up as demands multiply. Heidrick & Struggles' CEO explains what CVs leave out and why flexibility and organizational fit matter more as AI and global volatility undercut the predictive power of past performance. Also, AI-enhanced recruiting and lifelong learning.

PodcastDX
Rehabilitation Reimagined: Technology, Therapy and Independence

PodcastDX

Play Episode Listen Later Feb 24, 2026 18:35


The integration of Artificial Intelligence (AI) into post-injury rehabilitation is transforming recovery paradigms by enabling personalized, adaptive, and efficient rehabilitation pathways tailored to individual patient needs. This podcast reviews current advances in AI applications that facilitate assessment, monitoring, and optimization of rehabilitation programs following injuries. Through machine learning algorithms, wearable sensors, and predictive analytics, AI enhances the precision of therapy plans, tracks patient progress in real time, and predicts recovery trajectories. The discussion covers the benefits of AI-driven rehabilitation, including improved functional outcomes, reduced recovery times, and increased patient engagement, and addresses challenges such as data privacy, algorithmic bias, and integration with clinical workflows.

1. Transforming recovery paradigms

Traditional post-injury rehab relies on periodic in-person assessments, therapist intuition, and standardized protocols that only partially account for individual variability. AI is shifting this model toward:

  • Continuous, data-driven care: instead of snapshots in clinic, rehab can be informed by near real-time streams of kinematic, physiological, and behavioral data from wearables, smart devices, and robot interfaces.
  • Dynamic adaptation: therapy intensity, task difficulty, and exercise selection can be automatically adjusted based on ongoing performance, fatigue, and recovery trends, rather than fixed schedules.
  • Precision rehabilitation: algorithms can identify which patients are likely to respond to specific interventions (e.g., constraint-induced movement therapy vs. robotics) and tailor plans accordingly.

This moves rehabilitation from a "one-size-fits-many" paradigm toward precision, context-aware therapy, analogous to precision oncology but focused on function and participation.

2. Assessment, monitoring, and optimization

AI for assessment:
  • Sensor-based movement analysis: machine learning models process accelerometer, IMU, EMG, and pressure data to quantify gait symmetry, joint kinematics, balance, and fine motor control with higher resolution than visual observation alone.
  • Automated scoring: AI can approximate or support standardized scales (e.g., Fugl-Meyer, Berg Balance Scale) by mapping sensor features or video-derived pose estimates to clinical scores, reducing inter-rater variability and saving clinician time.

Continuous monitoring:
  • Home and community tracking: wearable and ambient sensors enable monitoring of daily steps, walking speed, arm use, posture, and adherence to exercises outside the clinic, feeding rich longitudinal datasets into AI models.
  • Real-time alerts: algorithms can detect abnormal patterns (such as increased fall risk, reduced limb use, or signs of over-exertion) and flag the clinician or adjust digital therapy content automatically.

Optimization and decision support:
  • Predictive models: using historical data, AI can forecast functional gains, plateau points, or risk of complications (e.g., falls, readmission), supporting individualized goal-setting and resource allocation.
  • Reinforcement learning and "digital twins": emerging work in neurorehabilitation treats rehab as a sequential decision problem, using model-based reinforcement learning and patient "digital twins" to recommend optimal timing, dosing, and progression of interventions over weeks to months.

3. Technologies: ML, wearables, analytics

  • Machine learning algorithms: supervised ML classifies movement quality (normal vs. compensatory), detects exercise type from sensor streams, and estimates clinical scores. Unsupervised learning clusters patients into phenotypes (e.g., gait patterns after stroke), revealing subgroups that respond differently to certain therapies. Reinforcement learning and contextual bandits explore which therapy adjustments yield the best long-term functional outcomes for a given individual.
  • Wearable sensors and robotics: inertial sensors, EMG, pressure insoles, and exoskeleton sensors capture high-frequency movement and muscle activity data during training. Robotic devices (upper-limb exoskeletons, gait trainers) coupled with AI can modulate assistance, resistance, or task difficulty in real time based on performance and predicted fatigue.
  • Predictive and prescriptive analytics: predictive analytics estimate trajectories (e.g., time to independent walking, expected upper-limb function) to inform shared decisions with patients and families. Prescriptive analytics recommend therapy intensity, modality mix, and scheduling to maximize functional gains under resource constraints.

4. Benefits: outcomes, efficiency, engagement

  • Improved functional outcomes: studies report better motor recovery, gait quality, and ADL performance when AI-assisted training is used, especially when robotics and intelligent feedback are involved.
  • Reduced recovery time and resource use: more precise dosing and earlier identification of non-responders can reduce ineffective sessions, shorten time to key milestones, and support safe earlier discharge with robust remote follow-up.
  • Increased adherence and engagement: AI-driven digital rehab platforms use gamification, adaptive difficulty, and personalized feedback to keep patients engaged in home programs, improving adherence compared to static paper instructions.
  • Support for clinicians: instead of replacing therapists, AI can offload repetitive measurement tasks, highlight concerning trends, and offer data-driven suggestions, allowing clinicians to focus on the relational, motivational, and complex decision-making aspects of care.

5. Challenges and ethical considerations

  • Data privacy and security: rehab AI often relies on continuous collection of sensitive motion, physiological, and sometimes audio/video data, raising questions about consent, storage, secondary use, and breach risk. Approaches like federated learning and on-device processing are being explored to reduce centralization of identifiable data while still enabling model training.
  • Algorithmic bias and fairness: if training data under-represent older adults, women, certain racial/ethnic groups, or people with severe disability, AI models may misestimate performance or risk for those groups, potentially widening disparities in rehab access and outcomes. Ongoing auditing, diverse datasets, and participatory design with patients and clinicians are needed to ensure equitable performance.
  • Integration with clinical workflows: many AI tools are developed in research settings and are not yet seamlessly integrated into EHRs, scheduling systems, or therapist documentation workflows. Poorly integrated tools risk adding documentation burden or "alert fatigue," reducing adoption. Successful implementations co-design interfaces with frontline therapists and physicians.
  • Regulation, liability, and trust: it remains unclear in many jurisdictions how to regulate adaptive rehab algorithms (as medical devices, clinical decision support, or wellness tools) and who is liable when AI-informed plans cause harm. Transparent, explainable models and clear communication to patients about the role of AI are critical for maintaining trust.

6. Case studies and emerging trends

  • Remote and hybrid digital rehabilitation: AI-driven platforms providing home-based stroke, orthopedic, or Parkinson's rehab with clinician dashboards are improving adherence and extending care beyond brick-and-mortar clinics.
  • Collaborative AI for precision neurorehabilitation: frameworks combining patient-clinician goal setting, digital twins, and reinforcement learning exemplify "collaborative AI" that augments rather than replaces therapists.
  • Multimodal personalization: integration of movement data, EMG, heart rate, sleep, and self-reported pain/fatigue is enabling more nuanced adaptation to daily fluctuations in capacity.
  • Conversational AI for education and coaching: early work is assessing tools like ChatGPT as low-risk supports for exercise education and motivation, though they are not yet precise enough to replace professional plan design.

AI is moving rehab toward patient-centered, continuously adapting, and data-rich care, but realizing this promise depends on addressing privacy, bias, workflow, and regulatory challenges in partnership with clinicians and patients.
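The description above mentions reinforcement learning and contextual bandits for choosing therapy adjustments. As a rough illustration only (not code from the episode, and certainly not clinical software), the core select-observe-update loop can be sketched as an epsilon-greedy bandit in Python; the three difficulty levels and their simulated per-session gains are invented for the example.

```python
import random

class EpsilonGreedyBandit:
    """Minimal epsilon-greedy bandit: each arm is a hypothetical therapy
    difficulty level; the reward is a simulated per-session functional gain."""

    def __init__(self, n_arms, epsilon=0.1, seed=0):
        self.rng = random.Random(seed)   # seeded for reproducibility
        self.epsilon = epsilon
        self.counts = [0] * n_arms       # sessions run per arm
        self.values = [0.0] * n_arms     # running mean reward per arm

    def select(self):
        # With probability epsilon, explore a random arm; otherwise exploit
        # the arm with the highest estimated gain so far.
        if self.rng.random() < self.epsilon:
            return self.rng.randrange(len(self.counts))
        return max(range(len(self.counts)), key=lambda a: self.values[a])

    def update(self, arm, reward):
        # Incremental mean update: new_mean = old_mean + (r - old_mean) / n
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Purely illustrative "true" mean gains for easy / moderate / hard sessions
# (invented numbers, not clinical data).
true_gain = {0: 0.2, 1: 0.5, 2: 0.35}

bandit = EpsilonGreedyBandit(n_arms=3, epsilon=0.1, seed=42)
for _ in range(2000):
    arm = bandit.select()
    reward = true_gain[arm] + bandit.rng.gauss(0, 0.1)  # noisy observed gain
    bandit.update(arm, reward)

best = max(range(3), key=lambda a: bandit.values[a])
```

After enough simulated sessions, the loop concentrates on the moderate-difficulty arm, which has the highest mean gain here; real systems described in the literature add patient context (hence "contextual" bandits) and safety constraints on how fast difficulty can change.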

The Sunday Show
How to Become an Algorithmic Problem

The Sunday Show

Play Episode Listen Later Feb 22, 2026 46:34


As AI technologies proliferate, a growing number of people are asking what it means to live in a world dominated by algorithms and automated systems, and what gets lost when those systems optimize human behavior at scale. These questions sit at the intersection of political theory, technology policy, and everyday life, and they are drawing scholars from fields well outside computer science into the conversation. José Marichal is a political scientist at California Lutheran University who has been writing and teaching about technology and politics for more than two decades. Marichal's new book, You Must Become an Algorithmic Problem: Renegotiating the Socio-Technical Contract, considers the age of recommendation systems and large language models. Drawing on political philosophy, he argues that individuals have entered into an implicit bargain with technology companies, trading unpredictability and novelty for the convenience of algorithmically curated experience. The consequences of that bargain, he contends, reach beyond personal preference and into the foundations of liberal democratic citizenship.

1000 Hours Outsides podcast
1KHO 714: Do We Have Free Will in an Algorithmic World? | Kartik Hosanagar, A Human's Guide to Machine Intelligence

1000 Hours Outsides podcast

Play Episode Listen Later Feb 18, 2026 58:37


Kartik Hosanagar was tracking AI long before “ChatGPT” became a household word, and in this conversation with Ginny Yurich, he helps parents see what's already shaping their homes (and possibly their children). AI can affect what we watch, what we buy, what we believe, who we date, and our career paths. Drawing from his book A Human's Guide to Machine Intelligence, Kartik explains how machines stopped simply following “recipe-like” instructions and began learning the way children do: surprising us, improvising, and sometimes operating as black boxes we can't fully interpret. You'll hear the jaw-dropping story of the early chatbots that felt like real friends, why recommendation engines narrow our options without our noticing, and what it looks like to reclaim agency by adding “friction” back into family life. This episode is both a wake-up call and a steadying roadmap for staying human in an algorithmic world. Get a copy of the book: A Human's Guide to Machine Intelligence. Check out Kartik's Substack, Creative Intelligence: https://hosanagar.substack.com Learn more about your ad choices. Visit megaphone.fm/adchoices

New Books Network
Ted Striphas, "Algorithmic Culture Before the Internet" (Columbia UP, 2023)

New Books Network

Play Episode Listen Later Feb 18, 2026 58:49


In this episode, Ted Striphas, Jeffrey Herlihy-Mera and Alex Rivera Cartagena discuss Algorithmic Culture Before the Internet (Columbia University Press, 2023), considering how some pre-digital human systems functioned through repetitive structures and automated processes that have similarities to electronic algorithms. They discuss how cognition has become digitized, dispersed across algorithmic and biological systems, and how digital tools attempt to overtake lived experiences and knowledges. Their conversation traces the history of computation while engaging culture and language as analytical tools. Their dialogue connects analog media, cultural practices, and symbolic systems to reflect on the importance of words in the human experience. Long before digital code, verbal narratives shaped (or attempted to shape) our relationship with knowledge and power; building on that insight, an important analytical point to critique algorithms begins with culture, and that culture begins in language. This episode and the Instituto Nuevos Horizontes are sponsored in part by the Teagle Foundation. Our conversation in Spanish about Algorithmic Culture Before the Internet is available here.

Topics and scholars mentioned in this episode:

  • Héctor José Huyke, Elogio a las cercanías: crítica a la cultura tecnológica actual (Editora Educación Emergente, 2024)
  • The Late Age of Print: Everyday Book Culture from Consumerism to Control (Columbia University Press, 2011)
  • Erik Hoel's notion of a “consciousness winter”
  • Lawrence Grossberg
  • Medium theory
  • Joshua Meyrowitz
  • Janice Radway
  • Scriptocentrism
  • “Things that different forms of media do to us.” -Ted Striphas
  • Scott Kushner, University of Rhode Island: “A turnstile is more persuasive than a person saying 'go this way.'”
  • Alan Turing
  • The Late Age of Print: blog and book
  • “The locus of cultural decision making [has been] shifting in the direction of computer systems and algorithms.” -Ted Striphas
  • “Build different meanings of words so we can build different worlds.” -Ted Striphas
  • “What is culture when human beings are not the only one producing it?” -Ted Striphas
  • Pluriverse: A Post-Development Dictionary (Columbia University Press, 2019), edited by Ashish Kothari, Ariel Salleh, Arturo Escobar, Federico Demaria, and Alberto Acosta

Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/new-books-network

New Books in Science, Technology, and Society
Ted Striphas, "Algorithmic Culture Before the Internet" (Columbia UP, 2023)

New Books in Science, Technology, and Society

Play Episode Listen Later Feb 18, 2026 58:49


In this episode, Ted Striphas, Jeffrey Herlihy-Mera and Alex Rivera Cartagena discuss Algorithmic Culture Before the Internet (Columbia University Press, 2023), considering how some pre-digital human systems functioned through repetitive structures and automated processes that have similarities to electronic algorithms. They discuss how cognition has become digitized, dispersed across algorithmic and biological systems, and how digital tools attempt to overtake lived experiences and knowledges. Their conversation traces the history of computation while engaging culture and language as analytical tools. Their dialogue connects analog media, cultural practices, and symbolic systems to reflect on the importance of words in the human experience. Long before digital code, verbal narratives shaped (or attempted to shape) our relationship with knowledge and power; building on that insight, an important analytical point to critique algorithms begins with culture, and that culture begins in language. This episode and the Instituto Nuevos Horizontes are sponsored in part by the Teagle Foundation. Our conversation in Spanish about Algorithmic Culture Before the Internet is available here.

Topics and scholars mentioned in this episode:

  • Héctor José Huyke, Elogio a las cercanías: crítica a la cultura tecnológica actual (Editora Educación Emergente, 2024)
  • The Late Age of Print: Everyday Book Culture from Consumerism to Control (Columbia University Press, 2011)
  • Erik Hoel's notion of a “consciousness winter”
  • Lawrence Grossberg
  • Medium theory
  • Joshua Meyrowitz
  • Janice Radway
  • Scriptocentrism
  • “Things that different forms of media do to us.” -Ted Striphas
  • Scott Kushner, University of Rhode Island: “A turnstile is more persuasive than a person saying 'go this way.'”
  • Alan Turing
  • The Late Age of Print: blog and book
  • “The locus of cultural decision making [has been] shifting in the direction of computer systems and algorithms.” -Ted Striphas
  • “Build different meanings of words so we can build different worlds.” -Ted Striphas
  • “What is culture when human beings are not the only one producing it?” -Ted Striphas
  • Pluriverse: A Post-Development Dictionary (Columbia University Press, 2019), edited by Ashish Kothari, Ariel Salleh, Arturo Escobar, Federico Demaria, and Alberto Acosta

Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/science-technology-and-society

Off the Page: A Columbia University Press Podcast
Ted Striphas, "Algorithmic Culture Before the Internet" (Columbia UP, 2023)

Off the Page: A Columbia University Press Podcast

Play Episode Listen Later Feb 18, 2026 58:49


In this episode, Ted Striphas, Jeffrey Herlihy-Mera and Alex Rivera Cartagena discuss Algorithmic Culture Before the Internet (Columbia University Press, 2023), considering how some pre-digital human systems functioned through repetitive structures and automated processes that have similarities to electronic algorithms. They discuss how cognition has become digitized, dispersed across algorithmic and biological systems, and how digital tools attempt to overtake lived experiences and knowledges. Their conversation traces the history of computation while engaging culture and language as analytical tools. Their dialogue connects analog media, cultural practices, and symbolic systems to reflect on the importance of words in the human experience. Long before digital code, verbal narratives shaped (or attempted to shape) our relationship with knowledge and power; building on that insight, an important analytical point to critique algorithms begins with culture, and that culture begins in language. This episode and the Instituto Nuevos Horizontes are sponsored in part by the Teagle Foundation. Our conversation in Spanish about Algorithmic Culture Before the Internet is available here.

Topics and scholars mentioned in this episode:

  • Héctor José Huyke, Elogio a las cercanías: crítica a la cultura tecnológica actual (Editora Educación Emergente, 2024)
  • The Late Age of Print: Everyday Book Culture from Consumerism to Control (Columbia University Press, 2011)
  • Erik Hoel's notion of a “consciousness winter”
  • Lawrence Grossberg
  • Medium theory
  • Joshua Meyrowitz
  • Janice Radway
  • Scriptocentrism
  • “Things that different forms of media do to us.” -Ted Striphas
  • Scott Kushner, University of Rhode Island: “A turnstile is more persuasive than a person saying 'go this way.'”
  • Alan Turing
  • The Late Age of Print: blog and book
  • “The locus of cultural decision making [has been] shifting in the direction of computer systems and algorithms.” -Ted Striphas
  • “Build different meanings of words so we can build different worlds.” -Ted Striphas
  • “What is culture when human beings are not the only one producing it?” -Ted Striphas
  • Pluriverse: A Post-Development Dictionary (Columbia University Press, 2019), edited by Ashish Kothari, Ariel Salleh, Arturo Escobar, Federico Demaria, and Alberto Acosta

New Books in Technology
Ted Striphas, "Algorithmic Culture Before the Internet" (Columbia UP, 2023)

New Books in Technology

Play Episode Listen Later Feb 18, 2026 58:49


In this episode, Ted Striphas, Jeffrey Herlihy-Mera and Alex Rivera Cartagena discuss Algorithmic Culture Before the Internet (Columbia University Press, 2023), considering how some pre-digital human systems functioned through repetitive structures and automated processes that have similarities to electronic algorithms. They discuss how cognition has become digitized, dispersed across algorithmic and biological systems, and how digital tools attempt to overtake lived experiences and knowledges. Their conversation traces the history of computation while engaging culture and language as analytical tools. Their dialogue connects analog media, cultural practices, and symbolic systems to reflect on the importance of words in the human experience. Long before digital code, verbal narratives shaped (or attempted to shape) our relationship with knowledge and power; building on that insight, an important analytical point to critique algorithms begins with culture, and that culture begins in language. This episode and the Instituto Nuevos Horizontes are sponsored in part by the Teagle Foundation. Our conversation in Spanish about Algorithmic Culture Before the Internet is available here.

Topics and scholars mentioned in this episode:

  • Héctor José Huyke, Elogio a las cercanías: crítica a la cultura tecnológica actual (Editora Educación Emergente, 2024)
  • The Late Age of Print: Everyday Book Culture from Consumerism to Control (Columbia University Press, 2011)
  • Erik Hoel's notion of a “consciousness winter”
  • Lawrence Grossberg
  • Medium theory
  • Joshua Meyrowitz
  • Janice Radway
  • Scriptocentrism
  • “Things that different forms of media do to us.” -Ted Striphas
  • Scott Kushner, University of Rhode Island: “A turnstile is more persuasive than a person saying 'go this way.'”
  • Alan Turing
  • The Late Age of Print: blog and book
  • “The locus of cultural decision making [has been] shifting in the direction of computer systems and algorithms.” -Ted Striphas
  • “Build different meanings of words so we can build different worlds.” -Ted Striphas
  • “What is culture when human beings are not the only one producing it?” -Ted Striphas
  • Pluriverse: A Post-Development Dictionary (Columbia University Press, 2019), edited by Ashish Kothari, Ariel Salleh, Arturo Escobar, Federico Demaria, and Alberto Acosta

Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/technology

Blue Collar Schmucks
The Algorithmic Abyss

Blue Collar Schmucks

Play Episode Listen Later Feb 16, 2026 85:10


This week the Schmucks ricochet from Valentine's Day chatter into a sprawling marathon of controversial theories, Zionist politics, historical rabbit holes, and social‑media censorship, before veering into everything from celebrity conspiracies to Epstein emails, Biden body‑double rumors, and luxury‑lifestyle sleuthing. Between knife‑blade demos, shotgun window‑shopping, gaming nostalgia, and a Workaholics gag reel, the crew wrestles with information control, political frustration, and the absurdity of the modern media landscape—all with their signature mix of skepticism, humor, and unfiltered curiosity.

The Morning Review with Lester Kiewit Podcast
Algorithmic Chess: Winning Visibility in a Digital World

The Morning Review with Lester Kiewit Podcast

Play Episode Listen Later Feb 16, 2026 20:17 Transcription Available


Views and News with Clarence Ford is the mid-morning show on CapeTalk. This 3-hour programme shares and reflects a broad array of perspectives. It is inspirational, passionate, and positive. Host Clarence Ford's gentle curiosity and dapper demeanour leave listeners feeling motivated and empowered. Known for his love of jazz and golf, Clarrie covers a range of themes including relationships, heritage, and philosophy. Popular segments include Barbs' Wire at 9:30am (Mon-Thurs) and The Naked Scientist at 9:30 on Fridays. Thank you for listening to a podcast from Views & News with Clarence Ford. Listen live on Primedia+ weekdays between 09:00 and 12:00 (SA time) to Views and News with Clarence Ford, broadcast on CapeTalk: https://buff.ly/NnFM3Nk For more from the show go to https://buff.ly/erjiQj2 or find all the catch-up podcasts here: https://buff.ly/BdpaXRn Subscribe to the CapeTalk Daily and Weekly Newsletters: https://buff.ly/sbvVZD5 Follow us on social media: CapeTalk on Facebook: https://www.facebook.com/CapeTalk CapeTalk on TikTok: https://www.tiktok.com/@capetalk CapeTalk on Instagram: https://www.instagram.com/ CapeTalk on X: https://x.com/CapeTalk CapeTalk on YouTube: https://www.youtube.com/@CapeTalk567 See omnystudio.com/listener for privacy information.

Mixed Signals from Semafor Media
Rolling Stone scion on investing in ‘Track Star' and the anti-algorithmic future of music media

Mixed Signals from Semafor Media

Play Episode Listen Later Feb 13, 2026 60:19


Gus Wenner, chairman of Rolling Stone and son of the magazine's legendary founder, has spent years balancing the brand's print heritage with its digital transformation. On this week's Mixed Signals, Ben and Max talk with Wenner about his new holding company, Wenner Media Ventures, and why he thinks algorithms disrespect audiences. They also discuss the venture's investment in the hit video series Track Star, his defense of Rolling Stone's most provocative journalism, and his potential future media investments.

New Books Network en español
Ted Striphas, Algorithmic Culture Before the Internet (2023)

New Books Network en español

Play Episode Listen Later Feb 12, 2026 46:46


In this episode, Jeffrey Herlihy-Mera, Professor of Humanities at the Universidad de Puerto Rico-Mayagüez, and Alex Rivera Cartagena of the Universidad de Puerto Rico-Mayagüez discuss Algorithmic Culture Before the Internet (Columbia University Press, 2023) by Ted Striphas, considering how a pre-digital world operated through repetitive structures and automated processes. They pay special attention to language as a foundational technology: a system of shared rules that makes thought, memory, and communication possible. Words not only describe reality; they also order, limit, and expand it. Through language, cultures transmit patterns, values, and ways of interpreting the human, functioning as living algorithms that rewrite themselves over time. The episode connects analog media, cultural practices, and symbolic systems to reflect on the importance of words in human experience and collective memory. It considers the instructions, grammars, and narratives that shape (or attempt to shape) our relationship with knowledge and power: a space for understanding that algorithms begin in culture, culture begins in language, and language begins with the word. This podcast and the Instituto Nuevos Horizontes are sponsored by the Departamento de Humanidades.

Topics discussed:

  • A moment when “The solution is now the problem”
  • Facebook, emotions, and social capacity
  • Critiquing gender, race, ethnicity, linguistic group, and other categories sanctioned by empire
  • Cultural imperialism
  • “Scriptocentrism”
  • Improvisation and thought
  • Improvisation and teaching
  • “Indoor lives”
  • How infrastructure shapes thought
  • When a scarcity of technology is healthy, in Cuba
  • Carlos Alberto Peón Casas, Universidad de Camagüey, and “the deep nature of his relationship with words”
  • Steve Jobs
  • Martin Luther King
  • Scott Kushner
  • Héctor Huyke and the tertulias at UPR-M
  • David Lorenzo
  • Milena Pi Reyes
  • “The best version of yourself” as a 1990s motto
  • Technological culture
  • The language of the image (image-centrism)
  • Video-centrism
  • The university and educational technology
  • The United States as a recently constructed myth
  • The behavior that algorithms seek to instill in us

Learn more about your ad choices. Visit megaphone.fm/adchoices

Sidecar Sync
From Deep Learning Indaba to Algorithmic Sovereignty with Benjamin Rosman | 121

Sidecar Sync

Play Episode Listen Later Feb 12, 2026 52:44


If you think the future of AI is being decided only in Silicon Valley or Beijing, this conversation will stretch your perspective. Amith Nagarajan and Mallory Mejias sit down with Benjamin “Benji” Rosman, Professor at the University of the Witwatersrand, Founder of the Machine Intelligence and Neural Discovery (MIND) Institute, and recently named to the Time 100 Most Influential People in AI for 2025, to explore how Africa is building its AI ecosystem from the ground up. From the origin story of Deep Learning Indaba (now the largest machine learning summer school in the world) to the concept of algorithmic sovereignty, Benji explains why diversity of thinking fuels scientific breakthroughs, how constraints drive innovation, and why leaders must balance exploration and exploitation in an era of exponential change.

Novedades editoriales en tecnología
Ted Striphas, Algorithmic Culture Before the Internet (2023)

Novedades editoriales en tecnología

Play Episode Listen Later Feb 12, 2026 46:46


In this episode, Jeffrey Herlihy-Mera, Professor of Humanities at the Universidad de Puerto Rico-Mayagüez, and Alex Rivera Cartagena of the Universidad de Puerto Rico-Mayagüez discuss Algorithmic Culture Before the Internet (Columbia University Press, 2023) by Ted Striphas, considering how a pre-digital world operated through repetitive structures and automated processes. They pay special attention to language as a foundational technology: a system of shared rules that makes thought, memory, and communication possible. Words not only describe reality; they also order, limit, and expand it. Through language, cultures transmit patterns, values, and ways of interpreting the human, functioning as living algorithms that rewrite themselves over time. The episode connects analog media, cultural practices, and symbolic systems to reflect on the importance of words in human experience and collective memory. It considers the instructions, grammars, and narratives that shape (or attempt to shape) our relationship with knowledge and power: a space for understanding that algorithms begin in culture, culture begins in language, and language begins with the word. This podcast and the Instituto Nuevos Horizontes are sponsored by the Departamento de Humanidades.

Topics discussed:

  • A moment when “The solution is now the problem”
  • Facebook, emotions, and social capacity
  • Critiquing gender, race, ethnicity, linguistic group, and other categories sanctioned by empire
  • Cultural imperialism
  • “Scriptocentrism”
  • Improvisation and thought
  • Improvisation and teaching
  • “Indoor lives”
  • How infrastructure shapes thought
  • When a scarcity of technology is healthy, in Cuba
  • Carlos Alberto Peón Casas, Universidad de Camagüey, and “the deep nature of his relationship with words”
  • Steve Jobs
  • Martin Luther King
  • Scott Kushner
  • Héctor Huyke and the tertulias at UPR-M
  • David Lorenzo
  • Milena Pi Reyes
  • “The best version of yourself” as a 1990s motto
  • Technological culture
  • The language of the image (image-centrism)
  • Video-centrism
  • The university and educational technology
  • The United States as a recently constructed myth
  • The behavior that algorithms seek to instill in us

Learn more about your ad choices. Visit megaphone.fm/adchoices

Novedades editoriales en literatura y estudios culturales
Ted Striphas, Algorithmic Culture Before the Internet (2023)

Novedades editoriales en literatura y estudios culturales

Play Episode Listen Later Feb 12, 2026 46:46


In this episode, Jeffrey Herlihy-Mera, Professor of Humanities at the University of Puerto Rico-Mayagüez, and Alex Rivera Cartagena of the University of Puerto Rico-Mayagüez, discuss Algorithmic Culture Before the Internet (Columbia U. Press, 2023) by Ted Striphas, considering how a predigital world ran on repetitive structures and automatic processes. We pay special attention to language as a fundamental technology: a system of shared rules that makes thought, memory, and communication possible. Words do not merely describe reality; they also order it, limit it, and expand it. Through language, cultures transmit patterns, values, and ways of interpreting the human, functioning as living algorithms that rewrite themselves over time. This episode connects analog media, cultural practices, and symbolic systems to reflect on the importance of words in human experience and collective memory. It considers the instructions, grammars, and narratives that shape (or attempt to shape) our relationship with knowledge and power. A space for understanding that algorithms begin in culture and, in turn, that culture begins in language, and language begins in the word. This podcast and the Instituto Nuevos Horizontes are sponsored by the Department of Humanities.
Topics discussed:
A moment when “The solution is now the problem.”
Facebook, emotions, and social capacity
Critiquing gender, race, ethnicity, linguistic group, and other empire-approved categories
Cultural imperialism
“Scriptocentrism”
Improvisation and thought
Improvisation and teaching
“Indoor lives”
How infrastructure shapes thought
When a scarcity of technology is healthy, in Cuba
Carlos Alberto Peón Casas, Universidad de Camagüey, and “the profound nature of his relationship with words”
Steve Jobs
Martin Luther King
Scott Kushner
Héctor Huyke and the tertulias at UPR-M
David Lorenzo
Milena Pi Reyes
“The best version of yourself” as a 1990s motto
Technological culture
The language of the image (image-centrism)
Video-centrism
The university and educational technology
The United States as a recently developed myth
The behavior that algorithms want to establish in us
Learn more about your ad choices. Visit megaphone.fm/adchoices

Soul Renovation - With Adeline Atlas
Tech, Archives & Algorithmic Truth (By Adeline Atlas)

Soul Renovation - With Adeline Atlas

Play Episode Listen Later Feb 7, 2026 7:42


Adeline Atlas, 11x published author
Digital Twin: Create Your AI Clone: https://www.soulreno.com/digital-twin
SOS: School of Soul Vault: Full Access ALL SERIES: https://www.soulreno.com/joinus-202f0461-ba1e-4ff8-8111-9dee8c726340
Instagram: https://www.instagram.com/soulrenovation/
Soul Renovation - Books:
Soul Game: https://tinyurl.com/vay2xdcp
Why Play: https://tinyurl.com/2eh584jf
How To Play: https://tinyurl.com/2ad4msf3
Digital Soul: https://tinyurl.com/3hk29s9x
Every Word: http://tiny.cc/ihrs001
Drain Me: https://tinyurl.com/bde5fnf4
The Rabbit Hole: https://tinyurl.com/3swnmxfj
Destiny Swapping: https://tinyurl.com/35dzpvss
Spanish Editions:
Every Word: https://tinyurl.com/ytec7cvc
Drain Me: https://tinyurl.com/3jv4fc5n

Podcast Notes Playlist: Latest Episodes
1280: Cory Doctorow | Why Everything Got Worse and What to Do About It

Podcast Notes Playlist: Latest Episodes

Play Episode Listen Later Feb 6, 2026


Jordan Harbinger Show: Read the notes at podcastnotes.org. Don't forget to subscribe for free to our newsletter, the top 10 ideas of the week, every Monday.
---------
Remember when Facebook was fun and Google actually worked? Cory Doctorow coined a term for what went wrong, and he's here to explain how we fight back. Full show notes and resources can be found here: jordanharbinger.com/1280
What We Discuss with Cory Doctorow:
"Enshittification" is Cory Doctorow's term for how platforms decay. First they're good to users, then they abuse users to serve business customers, then they abuse everyone to claw back value for themselves. Facebook, Amazon, and Google all followed this playbook — and policy makers let it happen.
"Switching costs" are a deliberate policy choice, not an inevitability. Companies jack up the friction of leaving their platforms through design and lobbying, but regulations like phone number portability prove we can legislate friction down when we choose to.
The Digital Millennium Copyright Act criminalizes fixing things you own. Security researchers who expose corporate sabotage — like the Polish train company bricking locomotives to extort customers — face harsher legal consequences than actual pirates.
"Algorithmic wage discrimination" is surveillance capitalism's newest trick. Apps like Uber track how desperate workers are and pay them less accordingly — the more rides you accept, the lower your future offers, turning desperation into a permanent wage ceiling.
You can fight back by supporting interoperability and making strategic choices. Use alternative services (like Kagi for search), follow advocates like the Electronic Frontier Foundation (eff.org), and remember: every time you demand the right to own what you buy, you're pushing back against enshittification.
And much more...
And if you're still game to support us, please leave a review here — even one sentence helps!
Sign up for Six-Minute Networking — our free networking and relationship development mini course — at jordanharbinger.com/course!
Subscribe to our once-a-week Wee Bit Wiser newsletter today and start filling your Wednesdays with wisdom!
Do you even Reddit, bro? Join us at r/JordanHarbinger!
This Episode Is Brought To You By Our Fine Sponsors:
Article: Visit article.com/jordan for $50 off your first purchase of $100 or more
BetterHelp: 10% off first month: betterhelp.com/jordan
Bombas: Go to bombas.com/jordan to get 20% off your first order
ButcherBox: Free protein for a year + $20 off first box: butcherbox.com/jordan
Homes.com: Find your home: homes.com
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

The Current
Are you a target of algorithmic pricing?

The Current

Play Episode Listen Later Feb 5, 2026 12:01


We know that companies are gathering data on us as we go about our lives online, but that information might also be used to create a personalized price for something you're looking at buying. We hear from Jim Balsillie, the co-founder of the Council of Canadian Innovators and the Digital Governance Council, on how algorithmic pricing works and what guardrails need to be put in place.

The Jordan Harbinger Show
1280: Cory Doctorow | Why Everything Got Worse and What to Do About It

The Jordan Harbinger Show

Play Episode Listen Later Feb 3, 2026 93:51


Remember when Facebook was fun and Google actually worked? Cory Doctorow coined a term for what went wrong, and he's here to explain how we fight back. Full show notes and resources can be found here: jordanharbinger.com/1280
What We Discuss with Cory Doctorow:
"Enshittification" is Cory Doctorow's term for how platforms decay. First they're good to users, then they abuse users to serve business customers, then they abuse everyone to claw back value for themselves. Facebook, Amazon, and Google all followed this playbook — and policy makers let it happen.
"Switching costs" are a deliberate policy choice, not an inevitability. Companies jack up the friction of leaving their platforms through design and lobbying, but regulations like phone number portability prove we can legislate friction down when we choose to.
The Digital Millennium Copyright Act criminalizes fixing things you own. Security researchers who expose corporate sabotage — like the Polish train company bricking locomotives to extort customers — face harsher legal consequences than actual pirates.
"Algorithmic wage discrimination" is surveillance capitalism's newest trick. Apps like Uber track how desperate workers are and pay them less accordingly — the more rides you accept, the lower your future offers, turning desperation into a permanent wage ceiling.
You can fight back by supporting interoperability and making strategic choices. Use alternative services (like Kagi for search), follow advocates like the Electronic Frontier Foundation (eff.org), and remember: every time you demand the right to own what you buy, you're pushing back against enshittification.
And much more...
And if you're still game to support us, please leave a review here — even one sentence helps!
Sign up for Six-Minute Networking — our free networking and relationship development mini course — at jordanharbinger.com/course!
Subscribe to our once-a-week Wee Bit Wiser newsletter today and start filling your Wednesdays with wisdom!
Do you even Reddit, bro? Join us at r/JordanHarbinger!
This Episode Is Brought To You By Our Fine Sponsors:
Article: Visit article.com/jordan for $50 off your first purchase of $100 or more
BetterHelp: 10% off first month: betterhelp.com/jordan
Bombas: Go to bombas.com/jordan to get 20% off your first order
ButcherBox: Free protein for a year + $20 off first box: butcherbox.com/jordan
Homes.com: Find your home: homes.com
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
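The "algorithmic wage discrimination" dynamic in the show notes (offers shrinking as a worker's acceptance rate rises) can be sketched as a toy model. This is an illustrative guess at the mechanism, not any company's actual pricing logic; the function name, weights, and floor are invented.

```python
def next_offer(base_fare, acceptance_rate):
    """Offer a lower fare to drivers whose history says they'll accept anyway."""
    # A driver who accepts 90% of rides gets squeezed harder than one who
    # accepts 50%; the floor keeps offers from collapsing entirely.
    squeeze = 0.3 * acceptance_rate
    return round(base_fare * max(1.0 - squeeze, 0.6), 2)

print(next_offer(20.0, 0.5))  # 17.0: a picky driver keeps a higher offer
print(next_offer(20.0, 0.9))  # 14.6: a desperate driver is paid less
```

The point of the sketch is that desperation itself becomes an input: the same ride is priced differently depending on how reliably the worker says yes.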

Aliveness: Earth Medicine and Deep Inner Work to Connect us With Who We Are
Kelly Stonelake on Power, Child Safety, and Telling the Truth at Meta

Aliveness: Earth Medicine and Deep Inner Work to Connect us With Who We Are

Play Episode Listen Later Feb 2, 2026 82:44


I am so excited to share my conversation with Kelly Stonelake!
Kelly Stonelake is a business operator and technologist with nearly 15 years of experience at Meta, where she held senior roles in product marketing and organizational leadership. During her tenure, she raised concerns about child safety and internal misconduct and faced retaliation after senior leadership chose to conceal risks rather than address them. The gap between Meta's stated values and her lived experience led to a severe loss of functioning and a fight for her life.
Following her departure, Kelly became a whistleblower and advocate for stronger protections for children online, greater tech accountability, awareness of autistic burnout and suicide risk, and safer workplaces for marginalized people. Her work focuses on challenging the concentration of power in technology and advancing equity, accountability, and systemic reform.
Kelly recently testified before the Washington State Senate in support of SB 5708 on addictive feeds and SB 5784 on AI companions, sharing first-hand accounts of child exposure, data collection without parental consent, and corporate retaliation against those who spoke up.
Please visit Kelly Stonelake on Substack at Overturned by Kelly Stonelake.
02:16 – 10:40 | From early idealism to the inside of Facebook
Kelly traces her early interests in journalism, constitutional debate, and technology, leading to nearly 15 years at Facebook. She describes the early years before the algorithmic feed, when the mission felt real and principled. She reflects on Cambridge Analytica, internal decision-making, and how loyalty, good faith, and “doing the right thing” slowly became tools for accumulating power.
10:41 – 29:24 | Horizon Worlds, retaliation, and the open secret about kids
Kelly recounts her time as the only woman on the senior leadership team for Horizon Worlds. She describes discovering that children were already using the product through adult accounts, being exposed to adults without parental consent, while Meta collected vast amounts of data in violation of federal law. She speaks plainly about misogyny, silencing women, attorney-client privilege used to avoid accountability, and the moment the rollout was paused for “quality” reasons while the truth stayed buried.
29:25 – 1:02:00 | The harms parents, therapists, and social workers need to understand
Kelly lays out, in clear and specific terms, the dangers children face across social platforms:
• Sextortion pipelines that move from shame to suicide in hours or days
• Relentless bullying amplified by anonymity and disappearing messages
• Algorithmic feeds that escalate vulnerability into self-harm content
• Drug acquisition through social apps leading to fentanyl deaths
• Viral challenges that kill curious kids
She explains how “teen accounts” and parental controls function as sugar pills. They reassure parents without protecting children.
1:02:01 – 1:11:30 | A chapter for parents
Kelly speaks directly to parents. Delay smartphones as long as possible. She recommends no social media. Give yourself permission to reevaluate decisions with new information and self-compassion. She shares how technology can exist in a home without extractive platforms, from family computers to creative and educational tools. She names the economic manipulation at play and explains why neurodivergent children face disproportionate risk. This is not a parenting failure. It is a systemic one.
1:11:31 – 1:21:53 | Moral injury, refusing blood money, and choosing truth
Kelly shares her mental health collapse, autistic burnout, and the moment...

New England Weekend
Your Data, Your Dollars: Algorithmic Pricing vs. Your Next Shopping Trip

New England Weekend

Play Episode Listen Later Jan 31, 2026 18:57 Transcription Available


Dynamic pricing, like surging rideshare prices during a storm, has been around for some time. Companies are now embracing a tactic called "algorithmic pricing", where the personal data they collect from your shopping habits, apps, discount cards, browsing, and much more is used to set prices for you, personally, in real time for various items. Noah Giansiracusa, Bentley University professor and author of "Robin Hood Math", joins the show to break down how this all works and explain what it means for the future of shopping.
See omnystudio.com/listener for privacy information.
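A minimal sketch of the mechanism the episode describes: personal data goes in, a personalized price comes out. Every feature name and weight here is invented for illustration; real systems are far more complex and opaque.

```python
def personalized_price(base_price, profile):
    """Return a price tuned to one shopper's inferred willingness to pay."""
    multiplier = 1.0
    # Signals a platform might infer from shopping habits and browsing.
    if profile.get("viewed_item_repeatedly"):
        multiplier += 0.10  # repeated views suggest strong interest
    if profile.get("uses_premium_device"):
        multiplier += 0.05  # device type as a crude income proxy
    if profile.get("abandoned_cart_recently"):
        multiplier -= 0.08  # a discount to win back a hesitant buyer
    return round(base_price * multiplier, 2)

# Two shoppers, same item, different prices.
print(personalized_price(100.00, {"viewed_item_repeatedly": True,
                                  "uses_premium_device": True}))      # 115.0
print(personalized_price(100.00, {"abandoned_cart_recently": True}))  # 92.0
```

The design point the episode raises is exactly this asymmetry: the shopper never sees the multiplier, only the final number.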

Round Table China
Say goodbye to algorithmic price discrimination

Round Table China

Play Episode Listen Later Jan 28, 2026 30:05


Loyalty can be expensive. For years, digital platforms have used your personal data to quietly test the limits of what you will pay. Now, a global crackdown has begun. From China to the U.S., new laws are prying open the algorithmic black box. This emerging transparency promises to challenge the secret math of personalized pricing, a hidden economy built on data. On the show: Steve, Niu Honglin & Yushan

Disruption Now
Disruption Now Episode 190 | What Is Explainable AI?

Disruption Now

Play Episode Listen Later Jan 23, 2026 48:41


Dr. Kelly Cohen is a Professor of Aerospace Engineering at the University of Cincinnati and a leading authority in explainable, certifiable AI systems. With more than 31 years of experience in artificial intelligence, his research focuses on fuzzy logic, safety-critical systems, and responsible AI deployment in aerospace and autonomous environments. His lab's work has received international recognition, with students earning top global research awards and building real-world AI products used in industry. In episode 190 of the Disruption Now Podcast,

Nephilim Death Squad
The Raven: 015 - Civil War on ICE

Nephilim Death Squad

Play Episode Listen Later Jan 22, 2026 110:42 Transcription Available


In this episode of The Raven, David “The Raven” Corbo breaks down the ICE shooting incident, algorithm-driven narrative warfare, and how modern media systems manufacture division through selective framing and emotional manipulation.
What starts as a single police shooting quickly reveals something much larger: algorithmic reality tunnels, ideological echo chambers, and how tragedy is weaponized to pit people against each other while truth becomes secondary to narrative loyalty.
The episode also goes far deeper—into generational trauma, spiritual oppression, family testimony, and why personal history matters when interpreting the unseen forces shaping culture.
Topics covered include:
The ICE shooting and why different people saw different “realities”
Algorithmic censorship and selective footage suppression
Left vs right narrative warfare
Media manipulation and emotional priming
Spiritual oppression, generational iniquity, and testimony
Why division is always the real objective
Discernment in an age of manufactured outrage
This is not a hot take. This is pattern recognition. If you feel like the truth keeps getting buried under outrage cycles, this episode explains why.

Edgy Ideas
103: Lacanian Insights on AI

Edgy Ideas

Play Episode Listen Later Jan 21, 2026 36:46


Show Notes
In this episode Simon and Dr. Jack Black, Associate Professor at Sheffield Hallam University, think dangerously about AI through the unsettling lens of Lacanian psychoanalysis. This is a conversation about desire, discourse, power and the fantasies we project onto machines.
Drawing on Lacan, Jack reframes AI not as a neutral tool or intelligent object, but as a relational phenomenon - one that speaks into us, structures us, and increasingly stands in for authority itself. Together, Simon and Jack interrogate how AI comes to occupy the place of the Big Other: the supposed holder of knowledge, truth, and certainty in a fragmented world.
They explore Lacan's four discourses, particularly the discourse of the hysteric, as a way of resisting AI's creeping authority and the ideological narratives that present it as omniscient, objective, or inevitable. AI, they argue, does not know in any human sense - it recombines, repeats, and reflects back our own symbolic order, including its exclusions, biases and violences.
The conversation moves into education, where AI is rapidly being positioned as a new master signifier. What happens when learning is outsourced to algorithmic systems? What kinds of subjects are being produced? And whose knowledge is being legitimised - or erased - in the process?
Throughout the episode, AI is revealed as a site where cultural anxiety, political power, and unconscious desire collide. Rather than rejecting technology, Simon and Jack argue for a more critical, psycho-social engagement - one that keeps the human, the relational, and the ethical firmly in view.
This is a conversation about AI, but it is also about us: our longing for certainty, our fear of lack, and our temptation to hand over authority to machines. Lacan, unexpectedly, offers not despair but hope - a way to stay with complexity and resist the fantasy that technology can save us from being human.
Key Takeaways
Lacanian psychoanalysis offers a radical way to rethink AI beyond hype and fear.
AI is relational - it emerges within human discourse, not outside it.
The discourse of the hysteric provides a critical stance toward AI as authority.
AI does not “know”; it mirrors and amplifies existing symbolic systems.
Education must resist uncritical adoption of AI as a master solution.
Algorithmic systems reproduce social bias, including racism and exclusion.
Technology increasingly objectifies the Big Other.
AI exposes deep tensions around desire, knowledge, and power.
Ideology sits quietly behind the push to normalise AI everywhere.
Lacan helps us stay critical, hopeful, and human in a technological age.
Keywords
AI, Lacan, psychoanalysis, discourse, education, culture, technology, relationality, society, human experience
Brief Bio
Dr. Jack Black is Associate Professor of Culture, Media, and Sport at Sheffield Hallam University. An interdisciplinary researcher, working across the disciplines of psychoanalysis, media and communications, cultural studies, and sport, his research focuses on topics related to race/racism, digital media, and political ecology. He is the author of The Psychosis of Race: A Lacanian Approach to Racism and Racialization (Routledge, 2023) and co-editor of Sport and Psychoanalysis: What Sport Reveals about Our Unconscious Desires, Fantasies, and Fears (Lexington Books, 2024). He is also Senior Editor for the journal Sport and Psychoanalysis (Cogent Social Sciences).

The Pomp Podcast
How The Secret Crypto Algorithms REALLY Work | Andrew Parish

The Pomp Podcast

Play Episode Listen Later Jan 20, 2026 43:17


Andrew Parish is the co-founder of Arch Public. In this conversation, we discuss tokenization, the New York Stock Exchange's latest announcement, and the growing role of algorithmic trading in crypto markets. We also break down bitcoin's recent price action, regulation and the Clarity Act in Washington, and how new AI tools are changing the way companies like Arch Public are built.
======================
BitcoinIRA: Buy, sell, and swap 80+ cryptocurrencies in your retirement account. Take 3 minutes to open your account & get connected to a team of IRA specialists that will guide you through every step of the process. Go to https://bitcoinira.com/pomp/ to earn up to $1,000 in rewards.
======================
Simple Mining makes Bitcoin mining simple and accessible for everyone. We offer a premium white glove hosting service, helping you maximize the profitability of Bitcoin mining. For more information on Simple Mining or to get started mining Bitcoin, visit https://www.simplemining.io/
======================
TIMESTAMPS:
0:00 – Intro
2:01 – NYSE tokenization announcement & impact of 24/7 markets
4:56 – Coinbase & Robinhood vs Wall Street
9:30 – Algorithmic trading: stocks vs crypto
16:22 – AI agents vs trading algorithms
19:52 – Speed, infrastructure & high-frequency trading
23:01 – Crypto regulation & the Clarity Act
26:19 – U.S. politics, regulation, and bitcoin
30:59 – Sovereignty, taxes & asset seizure concerns
34:06 – Building Arch Public with new AI dev tools

The Capitol Pressroom
Algorithmic pricing raises consumer protection concerns

The Capitol Pressroom

Play Episode Listen Later Jan 19, 2026 16:41


Jan. 19, 2026 - Following an investigation into the use of algorithmic pricing by Instacart, the state attorney general's office is asking for information from the third-party shopping service. We explore the use of personal data to set prices and efforts to regulate these practices with Justin Brookman, director of technology policy for Consumer Reports, which helped expose the tactics of Instacart.

Hotel Bar Sessions
MINIBAR: Algorithmic Nostalgia (with Leigh M. Johnson)

Hotel Bar Sessions

Play Episode Listen Later Jan 16, 2026 34:06


Why do AI's fabricated memories "feel" so true?
Hotel Bar Sessions is currently between seasons, and while our co-hosts are hard at work researching and recording next season's episodes, we don't want to leave our listeners without content! So, as we have in the past, we've given each co-host the opportunity to record a "Minibar" episode -- think of it as a shorter version of our regular conversations, only this time the co-host is stuck inside their hotel room with whatever is left in the minibar... and you are their only conversant!
AI engineers and designers are currently, and rightly, focused on minimizing the deleterious effects of AI's three primary "memory problems" -- hallucinations, catastrophic forgetting, and bias -- but in this Minibar episode, HBS co-host Leigh M. Johnson argues that none of these problems can be design-engineered away. They are, according to Johnson, baked-in and unavoidable structural elements of any language-based system reliant on an archive.
Borrowing from Jacques Derrida's work on archives, language, and memory, Johnson argues that we should think more seriously about the manner in which LLMs' outputs come to us cloaked in the garb of memory. We take AI hallucinations, for example, to be true because they inspire in us a feeling of nostalgia... something that we could have remembered, perhaps even should have remembered, but didn't.
Or didn't we?
Tune in for the first episode of Season 15 on January 23, 2026!
Full episode notes available at this link: https://hotelbarpodcast.com/minibar-algorithmic-noslagia
---------------------
SUBSCRIBE to the podcast now to automatically download new episodes!
SUPPORT Hotel Bar Sessions podcast on Patreon here! (Or by contributing one-time donations here!)
BOOKMARK the Hotel Bar Sessions website here for detailed show notes and reading lists, and contact any of our co-hosts here.
Hotel Bar Sessions is also on Facebook, YouTube, BlueSky, and TikTok. Like, follow, share, duet, whatever... just make sure your friends know about us!
★ Support this podcast on Patreon ★

Crypto Altruism Podcast
Episode 234 - Geo - Community-Governed Knowledge for the Open Internet, with The Graph Co-Founder Yaniv Tal

Crypto Altruism Podcast

Play Episode Listen Later Jan 13, 2026 46:23


For episode 234, we're excited to welcome Yaniv Tal, a legendary builder who has helped shape the foundations of Web3 as we know it. Yaniv is the co-founder and former CEO of The Graph, one of the most critical pieces of decentralized infrastructure in the ecosystem, powering tens of thousands of applications across Web3. Today, he's building Geo, a project focused not on scaling transactions, but on rebuilding trust, knowledge, and coordination on the internet itself.In today's episode you'll learn:

TalkingTrading
The Silent Killer of Algo Trading: Why Your Trading System Fails in Real Markets

TalkingTrading

Play Episode Listen Later Jan 12, 2026 39:57


Algorithmic trading promises consistency, discipline, and freedom from emotion - but for many traders, reality looks very different.
In this episode of Talking Trading, Louise Bedford is joined by systematic trading expert Adrian Reid to uncover the silent killer behind so many underperforming algo trading systems: human psychology.
You might have a well-built algorithm. The logic is solid. The backtests look impressive. Yet when real money is on the line, doubt creeps in. Trades get overridden. Rules get tweaked. Confidence evaporates at exactly the wrong time.
Together, Louise and Adrian explore why even automated trading systems still require psychological discipline - and how traders unknowingly sabotage their own results by chasing control instead of consistency.
You'll discover:
Why traders override profitable algorithms even when the data is sound
How fear, doubt, and impatience undermine systematic trading
The danger of endless optimisation and 'fiddling'
What it really takes to trust your trading system through drawdowns
Practical ways to trade your rules instead of your emotions
If you've ever found yourself rewriting code mid-drawdown, hesitating on valid signals, or questioning a system you once believed in, this episode will help you break that cycle.
Perfect for traders using algorithmic, systematic, or rules-based strategies who want more consistency, confidence, and clarity in real market conditions.
-----------------------------
Get your free trading plan template
Are you tired of the trading rollercoaster? One day you're up, the next, you're down - and it's exhausting. You're putting in the effort, but your results just aren't where you want them to be.
It's time to take control. Visit www.tradinggame.com.au and grab my free trading plan template. It's designed to help you create a plan that fits your lifestyle.
Trade confidently.
Louise Bedford is a best-selling author and founder of www.tradinggame.com.au and www.talkingtrading.com.au.
Facebook | YouTube | Twitter | LinkedIn

Law School
Criminal Law Part Seven: The Changing Face of Justice

Law School

Play Episode Listen Later Jan 11, 2026 54:29


This conversation explores the profound transformation in the criminal justice system driven by technological advancements, particularly in the realm of cybercrime, data analysis, and artificial intelligence. It delves into the challenges of jurisdiction, the complexities of cross-border evidence collection, and the implications of encryption on privacy and security. The discussion also highlights systemic biases revealed through data, the fairness paradox in algorithmic risk assessments, and the need for legislative reforms to adapt to these changes. Ultimately, it emphasizes the importance of AI literacy within the justice system to ensure that core principles of due process are upheld in a digital world.
In today's rapidly evolving legal landscape, the traditional foundations of criminal justice are being reshaped by three transformative forces. As we delve into these changes, we uncover the profound impact of cybercrime, data-driven insights into systemic bias, and the philosophical shift towards restorative justice.
Cybercrime and Jurisdiction: The borderless nature of cybercrime challenges traditional notions of jurisdiction. With crimes often spanning multiple countries, the Budapest Convention on Cybercrime emerges as a critical framework for international cooperation. However, the absence of universal enforcement mechanisms highlights the need for continued legal innovation.
Data-Driven Insights into Systemic Bias: Data analysis reveals deep-rooted biases in the justice system, particularly affecting marginalized communities. Tools like COMPAS, intended to introduce objectivity, have inadvertently amplified existing biases. This underscores the importance of transparency and fairness in algorithmic decision-making.
Restorative Justice and Legislative Reform: The shift towards restorative justice emphasizes healing and accountability over punishment. By involving victims, offenders, and communities in the justice process, this approach aims to repair harm and reduce recidivism. Legislative reforms, such as the elimination of mandatory minimums and bail reform, further support this transformative vision.
Conclusion: As we navigate these changes, the legal profession must adapt to ensure justice remains fair and equitable. By embracing technological advancements and addressing systemic biases, we can uphold the rule of law and protect the rights of all individuals.
Subscribe Now: Stay informed about the latest developments in criminal justice by subscribing to our newsletter.
Takeaways:
The traditional era of criminal justice is fundamentally over.
Cybercrime challenges the concept of jurisdiction.
International cooperation is essential for addressing cybercrime.
Cross-border evidence collection is a significant bottleneck.
Encryption poses a dilemma between privacy and security.
Authentication of digital evidence is crucial but not sufficient for admissibility.
Deepfakes threaten the integrity of multimedia evidence.
Data analysis reveals systemic biases in sentencing.
Algorithmic risk assessments can perpetuate existing biases.
Legislative reforms are necessary to adapt to technological advancements.
Keywords: criminal justice, cybercrime, jurisdiction, international law, encryption, digital evidence, systemic bias, AI, legislative reform, due process
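The "fairness paradox" the episode raises can be shown with a few synthetic numbers: a risk score can be equally precise, and catch the same share of reoffenders, in two groups, yet still wrongly flag far more people in the group with the higher base rate. The counts below are invented for illustration only.

```python
def rates(tp, fp, fn, tn):
    """Precision, recall, and false-positive rate from a confusion matrix."""
    precision = tp / (tp + fp)  # how often a "high risk" flag is correct
    recall = tp / (tp + fn)     # share of actual reoffenders who are flagged
    fpr = fp / (fp + tn)        # share of non-reoffenders wrongly flagged
    return round(precision, 2), round(recall, 2), round(fpr, 2)

# Group A: 60 of 100 reoffend.  Group B: 20 of 100 reoffend.
print(rates(tp=48, fp=12, fn=12, tn=28))  # (0.8, 0.8, 0.3)
print(rates(tp=16, fp=4, fn=4, tn=76))    # (0.8, 0.8, 0.05)
```

Equal precision and recall, but a six-fold gap in false positives: when base rates differ, a score cannot equalize all of these measures at once, which is the tension any risk-assessment reform has to confront.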

The Bible for Everyday Life
Act 2: The Algorithmic Apprenticeship

The Bible for Everyday Life

Play Episode Listen Later Jan 10, 2026 31:20


In Act 2: The Fear Factor of Faith, part of The Long Game: Finding Purpose with the God Who Sees, we wrestle with the reality that fear is often the biggest reason we don't step forward in faith—fear of getting it wrong, fear of God being silent, and fear that our failure might hurt others. Looking at Abraham's story in Genesis 12–13, we sit in the uncomfortable space where God seems quiet, circumstances fall apart, and Abraham makes fear-driven decisions that cause real damage—yet God doesn't cancel him. This message isn't about pretending faith makes life smooth or rushing past pain to a happy ending; it's about facing our false selves, owning the consequences of our choices, and discovering that while failure is painful and real, it isn't the end. God's grace doesn't just forgive us—it invites us to grow into who we were actually created to be, individually and as a church, so healing can begin.

Nightside With Dan Rea
What To Know About Algorithmic Pricing

Nightside With Dan Rea

Play Episode Listen Later Jan 6, 2026 40:37 Transcription Available


Bradley Jay filled in on NightSide. You might be wondering: what exactly is algorithmic pricing? In short, it's when companies take your personal data collected online through various social media outlets, apps, etc., and use that data to personalize your pricing on goods and services. In other words, two people could be charged different prices for the same item based on what an algorithm determines is the highest price you're willing to pay. Should personalized pricing be banned, or at the very least regulated? Noah Giansiracusa, professor of mathematics at Bentley University and faculty associate at Harvard University, joined Bradley to explain!

See omnystudio.com/listener for privacy information.

Al-Mahdi Institute Podcasts
Blue Notes and Black Codes: Womanism, Digital Faith, and the Algorithmic Future by Rev. Dr Shonda Nicole Gladden

Al-Mahdi Institute Podcasts

Play Episode Listen Later Dec 31, 2025 21:38


This talk centres Black women's digital religious leadership through a Womanist lens. Rev. Dr Gladden explores how digital rituals, online worship, and algorithmic systems intersect with justice, creativity, and resilience in contemporary faith communities.

The World of Eora: an Avowed & Pillars of Eternity Lore Podcast
Ep. #155: Algorithmic Roleplaying in the World of Eora

The World of Eora: an Avowed & Pillars of Eternity Lore Podcast

Play Episode Listen Later Dec 29, 2025 31:17


The World of Eora is a news & lore podcast about the fantasy setting created by Obsidian Entertainment for their cRPG series, Pillars of Eternity, and their action RPG: Avowed.

This week is an unorthodox episode. I don't discuss a lore topic so much as I discuss a form of roleplaying in video games that I've enjoyed since I was a youth. At the time I didn't have a name for it, but as I grew I began calling it "algorithmic roleplaying", which some have asked me about. So I made this episode to outline how it works (for me), and maybe it'll inspire you to try your own as well...

worldofeora@gmail.com
@worldofeora
ko-fi.com/worldofeora

Racism White Privilege In America
Algorithmic Privilege

Racism White Privilege In America

Play Episode Listen Later Dec 23, 2025 4:11


Today, we're diving into a critical issue shaping our digital world: "algorithmic privilege." As Artificial Intelligence becomes increasingly integrated into our daily lives, from job applications to healthcare, a new form of societal challenge has emerged where AI systems inadvertently, or sometimes explicitly, favor certain groups, amplifying existing social inequalities.

At its core, algorithmic privilege stems from systematic errors, often termed algorithmic bias, within machine learning algorithms, leading to unfair or discriminatory outcomes. This isn't usually an inherent flaw in the algorithm itself, but rather a reflection of the data and choices made during its development.

Become a supporter of this podcast: https://www.spreaker.com/podcast/racism-white-privilege-in-america--4473713/support.

The Political Orphanage
Jason Pargin on Internet Addiction and Algorithmic Horror

The Political Orphanage

Play Episode Listen Later Dec 17, 2025 82:15


Former Cracked.com editor Jason Pargin explores the subject of how social media makes us insane and warps the universe we're in, in his new book "I'm Starting to Worry About This Black Box of Doom." He joins to discuss.

Conservative Daily Podcast
Joe Oltmann Untamed | Tommy & Nick Pitruzello | America: The Path Forward | 12.15.25

Conservative Daily Podcast

Play Episode Listen Later Dec 16, 2025 114:53


Ignite your week with Joe Oltmann Untamed, where Joe drops a bombshell: he's seriously considering a full-throttle run for office, deciding December 26—because neither side has honored the people. With Trump vowing "truckloads" of 2020 election evidence proving it was rigged, Tulsi Gabbard confirming DNI probes into election integrity, and Rasmussen highlighting historic Supreme Court rulings on federal oversight of stolen races, Joe demands justice: prison or death penalty for the thieves who stole our voice. From Colorado's betrayal to threats against patriots like Rep. Brandi Bradley, Untamed exposes the cartel still defying the will of the people—no more half-measures, no more silence.

Algorithmic trading powerhouse Nick Pitruzzello—the "Algo Cowboy" and co-founder of Algo Factory—storms in with his 2026 economic and crypto forecast. As debt explodes past $36 trillion and fiat crumbles, Nick's geopolitics-infused algorithms reveal what's coming: dollar devaluation, BRICS rebellion, and whether Bitcoin surges as digital gold or regulators crush the escape hatch. From hyperinflation hedges to altcoin plays in crisis, Nick arms you with high-performance strategies to survive—and thrive—in the chaos ahead. No coding needed, just raw truth for real-market warriors.

The world burns while the fight intensifies: Trump's FBI foils an Islamist New Year's Eve bombing in LA, Australia reels from a father-son massacre at Bondi Beach, and Colorado's own radicals threaten families. Joe closes with fire—this is war on evil, from rigged elections to economic slavery.

Self Directed
Nicklas Bergman | How to Stay Sceptical in an Algorithmic World

Self Directed

Play Episode Listen Later Dec 14, 2025 65:09 Transcription Available


Nicklas Bergman is a deep-tech investor and technology explorer who focuses on how new tools shape everyday life rather than predicting distant futures. The episode examines AI, social media, and regulation through concrete examples from work, education, family life, and investing, with an emphasis on curiosity, skepticism, and personal judgment.

Techmeme Ride Home
Algorithmic Pricing?

Techmeme Ride Home

Play Episode Listen Later Dec 10, 2025 20:56


Instagram is giving you some control over your algorithm. Is Instacart using algorithmic pricing? SpaceX thinks it will be worth $1.5 trillion. Has DeepSeek been smuggling chips? And what if your startup's side-hustle can plug into the AI CAPEX bonanza?

- Instagram Will Start Letting You Pick What Shows Up in Your Reels (Wired)
- Same Product, Same Store, but on Instacart, Prices Might Differ (NYTimes)
- SpaceX to Pursue 2026 IPO Raising Far Above $30 Billion (Bloomberg)
- DeepSeek is Using Banned Nvidia Chips in Race to Build Next Model (The Information)
- Boom Supersonic raises $300M to build natural gas turbines for Crusoe data centers (TechCrunch)

Learn more about your ad choices. Visit megaphone.fm/adchoices

Ad Law Access Podcast
New York's Algorithmic Pricing Disclosure Law Takes Effect

Ad Law Access Podcast

Play Episode Listen Later Dec 3, 2025 6:10


New York's new algorithmic pricing law is now in effect, requiring businesses that use consumer data and algorithms to set individualized prices to clearly disclose: “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA.” In this episode, we unpack what triggered the law, how far its reach extends, and what it means for companies using dynamic pricing, personalization, or AI-driven optimization tools — including new compliance obligations and enforcement risk for legal, marketing, and compliance teams. Hosted by Simone Roach. Based on a blog post by Paul L. Singer, Beth Bolen Chun, Abigail Stempson, and Salim Rashid.

On with Kara Swisher
Fighting for Truth in a Rage-Driven Algorithmic Age with Jessica Yellin

On with Kara Swisher

Play Episode Listen Later Nov 27, 2025 51:02


In the mid-2010s, television journalist and former chief White House correspondent Jessica Yellin left her job at CNN to go independent. A few years later, she founded News Not Noise, a multi-platform news outlet that publishes all across the internet (mainly on Substack, Instagram and YouTube). It made Yellin one of the first journalists to ditch mainstream media for social media, and it's given her a unique perspective on the challenges and opportunities facing independent journalists, newsfluencers, and content creators in a crowded media environment.

In a live interview hosted by the Center for Journalism Ethics at the University of Wisconsin–Madison earlier this fall, Kara and Jessica talk about what it takes to be a successful online news creator, the effects President Trump's attacks on fact-based journalism have had on the news business as a whole, and how news creators can adapt to changing social media algorithms and AI. They also talk about solutions that could help the entire news industry in an era of waning public trust.

Please note: This conversation was recorded before X rolled out a new transparency location feature, revealing some prominent pro-MAGA accounts are not based in the U.S. despite claims on their profiles.

Questions? Comments? Email us at on@voxmedia.com or find us on YouTube, Instagram, TikTok, Threads, and Bluesky @onwithkaraswisher. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Flavortone
Episode 66: On the Sixth Day, God Made Algorithmic Music

Flavortone

Play Episode Listen Later Nov 19, 2025 75:06


Alec and Nick discuss the algorithm as a mysterious force within the production and consumption of music. Despite being used daily in our various dealings with digital platforms and culture, the term is often misunderstood. The conversation loosely defines the term as "some kind of procedure," embarking on a survey of chance (Cage), serialism (Schoenberg), Bach & Hindustani classical music, scales and modes, The League of Automatic Music Composers, Laurie Spiegel, newer electronic music, and more—as well as philosophical debates between form and process. Is an algorithm a dialectic? Do algorithms produce form, or does form precede an algorithmic process? Ultimately, the discussion draws latent comparisons between the idea of musical truth and the algorithm itself, and outlines a reversal of the algorithm from a set of procedures that would create and bring music into being, to a process that now entraps and contains it. The episode concludes with a discussion of algorithms that bring us to a contemporary visual culture of music, tying in The Velvet Underground & Warhol, Rosalía, Björk, and more.

Bannon's War Room
WarRoom Battleground EP 893: AI The Algorithmic Parasite

Bannon's War Room

Play Episode Listen Later Nov 18, 2025


WarRoom Battleground EP 893: AI The Algorithmic Parasite

inControl
ep38 - inControl guide to ... Feedback

inControl

Play Episode Listen Later Nov 17, 2025 85:04


Outline
00:00 – Intro
07:22 – Anatomy of a feedback loop
15:12 – A brief historical recap on the history of feedback
23:40 – Inventing the negative feedback amplifier
34:28 – Feedback in biology, economics, society, and ... board games!
52:44 – Negative vs positive feedback
59:15 – Feedback, causality, and the arrow of time
1:06:22 – Classics: fundamental limitations, uncertainty, robustness
1:21:30 – Adaptive control: learning in the loop
1:29:50 – Modern AI feedback loops (RL, social media, alignment)
1:40:40 – Outro

Links
Watt's flyball governor: https://tinyurl.com/ne5nene3
Maxwell - "On Governors": https://tinyurl.com/2a7cxj7m
Black - "Inventing the negative-feedback amplifier": https://tinyurl.com/yevsemdp
Nyquist Criterion: https://tinyurl.com/33hfbw8m
Bode's integral: https://tinyurl.com/53sxkdzu
Wiener - "Cybernetics": https://tinyurl.com/yta899ay
Apoptosis: https://tinyurl.com/mcxjycka
Predator–prey dynamics (Lotka–Volterra): https://tinyurl.com/5cvx33tn
Bird migration cues (photoperiodism): https://tinyurl.com/y2e7t22v
Neuron action potentials: https://tinyurl.com/2wemwdn4
Economic equilibrium & feedback: https://tinyurl.com/nhdx7r3s
Echo chambers: https://tinyurl.com/4v8yk7e8
Game design: https://tinyurl.com/bdhdhv38
Gap metric (Vinnicombe): https://tinyurl.com/y9nw3yve
Georgiou, Smith - "Feedback Control and the Arrow of Time": https://tinyurl.com/5xvj76jr
Annaswamy, Fradkov - "A Historical Perspective of Adaptive Control and Learning": https://tinyurl.com/4nfew8vm
Algorithmic trading flash crash (2010): https://tinyurl.com/2dsrs8j2
AI alignment: https://tinyurl.com/yvs3wnj8

Support the show

Podcast info
Podcast website: https://www.incontrolpodcast.com/
Apple Podcasts: https://tinyurl.com/5n84j85j
Spotify: https://tinyurl.com/4rwztj3c
RSS: https://tinyurl.com/yc2fcv4y
Youtube: https://tinyurl.com/bdbvhsj6
Facebook: https://tinyurl.com/3z24yr43
Twitter: https://twitter.com/IncontrolP
Instagram: https://tinyurl.com/35cu4kr4

Acknowledgments and sponsors
This episode was supported by the National Centre of Competence in Research on «Dependable, ubiquitous automation» and the IFAC Activity fund. The podcast benefits from the help of an incredibly talented and passionate team. Special thanks to L. Seward, E. Cahard, F. Banis, F. Dörfler, J. Lygeros, ETH studio and mirrorlake. Music was composed by A New Element.

The Europeans
Help! My manager is an algorithm!

The Europeans

Play Episode Listen Later Nov 14, 2025 53:42


KATY IS BACK! And we are proud to report that her new baby no longer looks like far-right French politician Éric Zemmour. Relief all around!

It's been a hectic time in Europe, but we're happy to be covering it all—or, you know, a sizable sliver of it—starting with Latvia's potential withdrawal from the Istanbul Convention and the European Parliament's call for new regulation of algorithmic tech in the workplace. Algorithmic management has made its way into all sorts of industries; we dig into whether or not that's a good thing and how new legislation might help to protect us all.

Then it's off to Paris, where tens of thousands of shoppers have already flooded the aisles of the new brick-and-mortar Shein store and thousands of others have been protesting its very existence. That's not only because of Shein's environmentally toxic business model but because of the recent appearance of some despicable products on its website—which has led the French government to threaten to ban the fast-fashion giant. To break it all down, we rang up Paris-based fashion journalist Dana Thomas, author of the book Fashionopolis and host of the podcast The Green Dream.

Mentioned in this episode:
- '"Cynical and completely reckless" Latvia has the highest femicide rate in Europe — including Russia. Its parliament just voted to exit a treaty protecting women from violence.' - Meduza, November 4, 2025
- EU study: 37% of employees are monitored for working hours
- 1 in 4 workplaces make decisions with algorithms
- Case studies in algorithmic management
- Dana's book Fashionopolis
- Dana's newsletter, The Style Files

This week's Inspiration Station recommendations are the Rosalía album Lux and the podcast series Where Is Jón?, a co-production of RTÉ in Ireland and RÚV in Iceland.

We don't often have sponsors on this podcast but this week, we do: Patagonia. Three years ago, Patagonia named Earth as its only shareholder.
But moving more profits to environmental causes hasn't made them a perfect company—let alone a sustainable one. Out now is Patagonia's 2025 Work-in-Progress report: the raw truth about where they're messing up, but also, the latest ways they're rethinking business as usual. You can check out the report here.

This podcast was brought to you in cooperation with Euranet Plus, the leading radio network for EU news. But it's contributions from listeners that truly make it all possible—we could not continue to make the show without you! If you like what we do, you can chip in to help us cover our production costs at patreon.com/europeanspodcast (in many different currencies), or you can gift a donation to a superfan. We'd also love it if you could tell two friends about this podcast. We think two feels like a reasonable number.

01:21 Katy's back!
05:33 Bad Week: Latvian politicians
19:08 Good Week: All European workers! (Maybe)
30:48 Interview: Dana Thomas on France's threat to ban SHEIN
46:00 The Inspiration Station: 'Lux' by Rosalía and 'Where is Jón'?
50:46 Happy Ending: Europe's first major elephant sanctuary

Produced by Morgan Childs
Editorial support from Katz Laszlo
Mixing and mastering by Wojciech Oleksiak
Music by Jim Barne and Mariska Martina

YouTube | Bluesky | Instagram | Mastodon | hello@europeanspodcast.com

Drunk Ex-Pastors
Podcast #554: Synthetic Lovers, Plastoline, and Algorithmic Disruption

Drunk Ex-Pastors

Play Episode Listen Later Oct 27, 2025 64:16


We begin this episode by highlighting a fight between Mark Zuckerberg and Joseph Gordon-Levitt over the issue of "synthetic relationships" and their implications for young children. We discuss the recent invention of plastoline, an alternative fuel source that can power automobiles. We conclude by considering one of Bill Maher's recent New Rules involving intentionally disrupting our social media algorithms in order to make space for understanding of, and dialogue with, our opponents.

Balanced Black Girl
Creating Connection in a Digital World: Shan Boodram on Friendship, Modern Love and Algorithmic Attachment

Balanced Black Girl

Play Episode Listen Later Oct 7, 2025 60:25


#313: If you've ever felt like you're not measuring up to where you "should" be—whether that's finding partnership by a certain age, building lasting friendships as an adult, or navigating the expectations society throws at you—this episode is for you.

Today, I'm sitting down with the always insightful Shan Boodram to talk about the real journey of connecting—from romance to friendship and everywhere in between. Shan doesn't shy away from the messy, inconvenient parts of connection—whether it's dating in a world that prizes efficiency, how algorithms influence our decisions and attachment styles, or the challenges of learning how to nurture deep friendships later in life.

We dive into why your age or relationship timeline doesn't define your worth, how to find and nurture love that truly sees you, and why inconvenience might be the secret ingredient to building bonds that last. Shan shares her wisdom on how to show up authentically in both romantic and platonic relationships, break away from rigid dating rules, and live a big, joy-filled life at any stage.

If you've ever questioned whether it's too late for new love or genuine friendships—or if you're simply craving more meaningful connection—hit play on this episode.

We talk about:
- How online algorithms are impacting our attachment styles and preferences
- Navigating dating and romance without falling into the efficiency trap
- The truth about cultivating friendships as an adult
- The power of showing up, even when it's inconvenient
- Letting go of dating rules and loving by your own standards

Links & Resources:
- Watch Lovers by Shan
- Join the Lovers by Shan community
- Follow Shan on Instagram @shanboodram
- Get your She's So Lucky Merch

Sponsors:
- LMNT: LMNT is a zero sugar electrolyte drink mix with a research-backed ratio of electrolytes. To try it out go to drinkLMNT.com/balancedles to receive a free LMNT sample pack with any purchase.
- Vionic Shoes: Use code LUCKY at checkout for 15% off your entire order at vionicshoes.com.
- Bumble: Start your love story on Bumble
- Grüns: Grüns are comprehensive nutrition packed into a snack pack a day. Visit gruns.co and use the code LUCKY for 52% off your first order.
- Vimergy: Vimergy makes liquid vitamins that are clean, potent, and actually easy for your body to absorb. Visit vimergy.com and use code LUCKY for 20% off your first order.

Stay in Touch
Follow on IG: @shessoluckypod @lesalfred
Follow on TikTok: @shessoluckypod @balancedles
Subscribe to the She's So Lucky Newsletter: https://shessolucky.kit.com/bestcase
Visit our website at shessoluckypodcast.com

Please note that this episode may contain paid endorsements and advertisements for products and services. Individuals on the show may have a direct or indirect financial interest in products or services referred to in this episode.

Produced by Dear Media.

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

The Daily Zeitgeist
We AREN'T Skibidi Cooked, Chat! with Etymology Nerd 09.23.25

The Daily Zeitgeist

Play Episode Listen Later Sep 23, 2025 51:32 Transcription Available


In episode 1935, Jack and Miles are joined by linguist and author of Algospeak: How Social Media Is Transforming the Future of Language, Adam Aleksic AKA Etymology Nerd, to discuss… Who Makes Our Language? America's Kids Ain't Able To Read Good Or Math Good, Words As Windows Into History, What Is Even Sincere Expression In The Age Of Algorithmic Language And Content? And more!

How did students perform in the nation compared to 2019?

LISTEN: Spiral by Bugseed

See omnystudio.com/listener for privacy information.