Podcasts about Yi

  • 608 podcasts
  • 1,541 episodes
  • 45m average duration
  • 5 weekly new episodes
  • Latest episode: May 18, 2025


Latest podcast episodes about Yi

Fellowship Bible Church Conway
But Even If He Does Not...

May 18, 2025


But Even If He Does Not... (Daniel 3:18). For the bulletin in PDF form, click here.

Guiding Principle: Our Faith Outweighs Our Cost

Overview of Daniel - What's in a Name?
  • Shadrach: Hananiah, "God is gracious," a compassionate God
  • Meshach: Mishael, "Who is like God"
  • Abednego: Azariah, "God helps," He is the ultimate rescuer

The Trial
  • The Accusation (Daniel 3:13-14)
  • The Defense (Daniel 3:16-18)
  • But Even if He Does Not... (Daniel 3:18)
  • The Sentence (Daniel 3:19-23)

What Do We Learn? Next Steps?
  • Trust in God; trust in the unknown because God knows
  • Stand Tall against the world
  • Stand Firm regardless of the cost
  • But Even if He Does Not...

Mission Highlight - Pray for the Unreached: The Adu People of China
The Adu people of China, a subgroup of the Yi nationality, number just 8,200 and remain entirely unreached with the gospel. Though they speak a dialect of Yunnan Chinese and have access to the complete Bible, the Jesus Film, and audio recordings, there are no known Christians among them. Most younger Adu are non-religious, while older generations hold to animistic traditions. Isolated and largely unaware of the gospel, they need workers willing to bring the message already prepared in their language. Pray for a spiritual hunger and a powerful movement to Christ among the Adu.

Finances
  • Weekly Budget: 35,297
  • Giving for 05/20: 34,393
  • Giving for 05/11: 27,440
  • YTD Budget: 1,588,372
  • YTD Giving: 1,532,340
  • Over/(Under): (56,032)

VBS 2025 | June 23-27 | 9:00 am - 12:00 pm
Join us in Ancient Egypt! You'll explore Pharaoh's palace, experience thrilling "real-life" dramas, play high-energy games, sample tasty snacks, and hear unforgettable music. Plus, you'll meet lots of new friends! VBS is for children currently in kindergarten through fourth grade; invite a friend for free! Register by June 6 at fellowshipconway.org/register. The cost is $5 per child.

New to Fellowship?
We are so glad that you chose to worship with our Fellowship Family this morning. If you are joining us for the first time or have been checking us out for a few weeks, we are excited you are here and would love to meet you. Please fill out the "Connect Card" and bring it to the Connection Center in the Atrium; we would love to say "hi" and give you a gift.

Atrium Remodel
Exciting changes are happening to the atrium over the next two and a half months as we continue inviting people into God's story, equipping and releasing them to become reproducing disciples of Jesus Christ. The remodel includes adding a bathroom stall in both the men's and women's restrooms. The atrium will be under construction but usable on Sundays, except the restrooms, which will be closed until mid-July. Please use the bathrooms located in the first kids hallway (elevator and stairway area).

Imperishable: a 4-Week Study of 1 Peter
Join us for Imperishable, Wednesday nights at Fellowship beginning May 28 at 6 p.m., led by Heather Harrison. Text Shanna at 501-336-0332 to reserve childcare. Register at fellowshipconway.org/register.

Fellowship 101
We invite you to join us on Sunday, June 8, at 9:00 a.m. to learn more about Fellowship. This is a great opportunity to hear about our mission, values, and ministries. If you're new to Fellowship, join us in the conference room (first floor) to hear what God is doing and where He is taking us. During this time, you will meet some of our ministry leaders and get to ask questions. Register at fellowshipconway.org/register.

Fellowship Kids Summer Bash
Summer is almost here, and we want to celebrate! Join us on Saturday, May 31, from 10:00 a.m. to 12:00 p.m. for fun and games, finished off with some frosty treats. You won't want to miss out as we welcome summer with our Fellowship friends and family.

Prayer During Service
We love to pray for one another. Our prayer team will have people at the front of the Auditorium under the signs Hope and Love to pray for you after the message. Please feel free to walk up to them for prayer or encouragement during the first worship song after the message.

Change for Life - Life Choices Fundraiser
Life Choices is a pregnancy resource center in Conway that Fellowship Bible Church supports. If you were able to bring a baby bottle today, thank you. If you weren't able to bring it today, please drop it off at Life Choices, 1330 S. Donaghey. Your support provides women in Central Arkansas facing unplanned pregnancies a safe place for spiritual, physical, and emotional support.

Badass of the Week
Admiral Yi: One Man. One Fleet. Zero Mercy

May 13, 2025 · 55:05


When Japan invaded Korea in 1592, one man stood defiantly against impossible odds: Admiral Yi Sun-sin. Outnumbered, betrayed by jealous rivals, and stripped of command, Yi clawed his way back from ruin, inventing the legendary “Turtle Ships” to wreak havoc on enemy fleets. Using unmatched tactics and sheer audacity, Yi crushed Japan's armada again and again, earning victory even when it seemed hopeless. Join Ben and Andrew as they dive into the life of history's most badass naval commander—a warrior whose courage, cunning, and stubborn refusal to quit turned certain defeat into eternal glory.

Bi' Gidene Soralım | Türkçe Podcast
7.16 Bi Dönene Soralım | Yiğit Üstündağ

May 8, 2025 · 37:10


In this episode we ask someone who came back: my guest is once again Yiğit Üstündağ, whom I hosted three years ago in season 4 after he moved to Stockholm in 2021 for a job opportunity. Back then he had just started at a game studio in Stockholm; in this episode we talk about both his return to the country and his move to İzmir. In a conversation that reminds us migration is not only about leaving but also about staying, and sometimes about coming back, we covered moving abroad, adaptation, reverse culture shock, the decision to return, and quality of life along the Stockholm-İzmir axis. To listen to Yiğit's previous episode: 4.16 Stockholm'de Yaşamak (Living in Stockholm). For more information about Tıkla Gelsin, you can click the link.

Living Change I Ching podcast
A marriage dispersing

May 8, 2025 · 31:09


How does the Yi help in an impossibly painful situation? It's hard to describe: deep recognition, being recognised, a sense of reconnection. Dominique's reading for this episode: "What is the lesson that I must learn through the pain of losing my marriage and our future?" And Yi's answer: Dispersing, Hexagram 59, with no changing lines. It's a beautiful response, and I hope you enjoy listening. Transcript available for Change Circle members.

Evrim Kuran
Evrim Kuran ile 3+3: Yiğit Güralp

May 7, 2025 · 71:04


In episode 86 of 3+3, my guest is film artist, screenwriter, and author Yiğit Güralp.

Architects Explained
Eui-Sung Yi Explained

May 3, 2025 · 103:17


Eui-Sung Yi is a Partner at Morphosis Architects and the Director of The NOW Institute, a research center for urban strategy and sustainability, with which he co-authored the bestselling 100 Buildings and the 730-page visual almanac HAITI NOW. He obtained his B.Arch from Cornell University's AAP program and his M.Arch II from the GSD. He is currently an Adjunct Associate Professor at the University of Southern California, where he previously served as Director of the Graduate Programs. In professional practice, Mr. Yi has worked as a VP of design in Seoul, run his own practice, and now heads Morphosis's architectural and urban operations in Asia and the Middle East. In this episode we discuss his career evolution, built work, research, and advice for aspiring architects.

Slacker & Steve
Full show - Tuesday | Work mistake | News or Nope - Travis Kelce, Blake Lively, and Taylor Swift | Slacker's socks | Life's simple pleasures | So where did the "Yi!" come from, anyways? | Would you listen to your friends if they hated your partner? |

Apr 30, 2025 · 70:21


Full show - Tuesday | Work mistake | News or Nope - Travis Kelce, Blake Lively, and Taylor Swift | Slacker's socks | Life's simple pleasures | So where did the "Yi!" come from, anyways? | Would you listen to your friends if they hated your partner? | Is it gross to let your dog lick your plate? | T. Hack's mystery ball | Stupid stories @theslackershow @ericasheaaa @thackiswack @radioerin

Learning Bayesian Statistics
#131 Decision-Making Under High Uncertainty, with Luke Bornn

Apr 30, 2025 · 91:46 · Transcription available


Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Thank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire, Mike Loncaric, David McCormick, 
Ronald Legere, Sergio Dolia, Michael Cao, Yiğit Aşık and Suyog Chandramouli.

Takeaways:
  • Player tracking data revolutionized sports analytics.
  • Decision-making in sports involves managing uncertainty and budget constraints.
  • Luke emphasizes the importance of portfolio optimization in team management.
  • Clubs with high budgets can afford inefficiencies in player acquisition.
  • Statistical methods provide a probabilistic approach to player value.
  • Removing human bias is crucial in sports decision-making.
  • Understanding player performance distributions aids in contract decisions.
  • The goal is to maximize performance value per dollar spent.
  • Model validation in sports requires focusing on edge cases.

Kerem Önder
Büyüklerin ruhları görünürse aldanma! - Mektubat 148, 149 Kerem Önder

Apr 24, 2025 · 42:21


This letter was written to Mullah Sadiq of Kabul. One who believes he has arrived obtains nothing; one must not be deceived into seeking benefit from the souls of the great, for they are the subtle faculties (latifas) of one's own master:

"Your two letters arrived one after the other. The first reported that you had arrived and were satisfied; the second spoke of your thirst and emptiness. Praise be to Allah, for every matter is judged by its end. One who thinks himself satisfied has attained nothing; one who sees himself empty and far away has in fact arrived. I have told you again and again not to be deceived when the souls of the great appear or when they offer help. The forms of those great ones are the latifas of your own master, appearing in those shapes. Binding yourself to a single source is essential; one who attaches himself to many places gains nothing and suffers harm. I have told you many times that to reach the goal quickly one must hold fast to the work and the task. To set aside what is necessary and busy oneself with the unnecessary is contrary to reason. Yet you follow your own judgment and do not listen. As you wish! The messenger's duty is only to convey."

149. This letter, too, was written to Mullah Sadiq of Kabul. It states that although Allah creates everything through causes, one need not bind oneself to any particular cause:

"My brother Mullah Muhammad Sadiq! It is astonishing that you cling to causes with your whole being. Though He who creates the causes creates everything through them, it is not right to fasten upon one particular cause for every matter.

Verse: If one door closes, do not grieve, O heart; another will open!

Such short-sightedness is found in the most unsuitable people, and to see this state in someone like you is very ugly. Come to yourself a little! Grasp the degree of this fault! To be God-fearing and at the same time run after things Allah does not love is a very ugly thing, and it is astonishing that this ugliness looks beautiful to your eye. One should work to obtain what is truly needed, and only to the extent of the need; to give all one's time to it and spend one's whole life chasing it is sheer foolishness. Know the value of this opportunity! A thousand pities on those who squander it in pursuit of endless, needless things! We must correspond; the messenger's duty is only to bring the news. Pay no attention to people's gossip, and do not let it grieve you! Since the stains they would smear on you are not in you, it is not right to be upset. For a person whom everyone disparages to be good is a very great felicity; the reverse is very dangerous. Peace be upon you."

"Five things bar people from the way of truth: being untroubled by one's own ignorance, greed for the world, stinginess, hypocrisy in deeds, and admiring one's own opinion." (Ali, may Allah be pleased with him)

We are so busy stoning the devil that we cannot perform the tawaf! Success is the best revenge. The brave man lives a thousand days; the opportunity comes but once. Go at your fears! Be aggressive and face them; strike hard. When a part of your body seizes up and aches, the masseur works that area firmly, loosens the muscle that has swollen, and the pain ends.

The students of Hasan al-Basri (may Allah have mercy on him) complained of the devil's whispering: "O Shaykh! We are sorely troubled by Satan. He keeps urging us toward wicked deeds, saying 'Hold tight to whatever of the world falls into your hands; you will need it,' and he keeps us from good." Hasan al-Basri smiled and said: "He was here just now, and he complained of you. He said: 'Counsel these children of Adam not to covet what is mine, and to be content with their own portion. When Allah the Exalted drove me from His presence, He made the world and Hell my dominion, and gave Paradise and contentment to them. Now they abandon their own portion and covet my dominion, and unless I take their faith I will not give them the world.' If you wish to be safe from the devil's whispering, leave the world and cast the worry of it from your hearts." Hearing this counsel, his students bowed their heads and left his presence.

Without knowing the four devils you cannot become a friend of Allah: Iblis, the ego (nafs), worse than these a bad companion, and worse still a bad religious scholar. What happens if you drop your wristwatch? Time (zaman) falls!

Learning Bayesian Statistics
#130 The Real-World Impact of Epidemiological Models, with Adam Kucharski

Apr 16, 2025 · 69:05 · Transcription available


Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!Intro to Bayes Course (first 2 lessons free)Advanced Regression Course (first 2 lessons free)Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!Visit our Patreon page to unlock exclusive Bayesian swag ;)Thank you to my Patrons for making this episode possible!Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire, Mike Loncaric, David McCormick, 
Ronald Legere, Sergio Dolia, Michael Cao, Yiğit Aşık and Suyog Chandramouli.

Takeaways:
  • Epidemiology requires a blend of mathematical and statistical understanding.
  • Models are essential for informing public health decisions during epidemics.
  • The COVID-19 pandemic highlighted the importance of rapid modeling.
  • Misconceptions about data can lead to misunderstandings in public health.
  • Effective communication is crucial for conveying complex epidemiological concepts.
  • Epidemic thinking can be applied to various fields, including marketing and finance.
  • Public health policies should be informed by robust modeling and data analysis.
  • Automation can help streamline data analysis in epidemic response.
  • Understanding the limitations of models...

Turn the Page Podcast
Turn The Page – Episode 345C – Yiğit Turhan

Apr 10, 2025 · 23:01


Yiğit Turhan took a virtual trip from Milan, Italy to discuss THEIR MONSTROUS HEARTS, a gothic tale of horror that bathes you in beautiful language and tense imagery from start to finish.  

Encycliques
Arrêté 3 fois par la police en Uruguay

Apr 10, 2025 · 49:06


Our new YouTube channel

De Balie Spreekt
Wat als Turkije je als terrorist ziet?

Mar 26, 2025 · 68:08


What if Turkey sees you as a terrorist? This happened to two Dutch sisters who are being held in Turkey. Together with their mother and a Turkish activist who fled the country, we talk about the Turkish justice system and the diplomatic relationship with the Netherlands. How does the Netherlands treat its citizens who are suspected of terrorism in Turkey? And do these suspects stand a chance of a fair trial? Following the podcast Enkele reis Istanbul, we discuss these questions at De Balie. Podcast maker Emmie Kollau and researcher Catrien Spijkerman tell the story of Betül and Bergün. The sisters left for Istanbul to join a Turkish protest band. In 2017 they were arrested, and they have been held in Turkey on suspicion of terrorism ever since. Their mother, Günay Akkaya, takes the audience through how the family became entangled in both the Turkish and the Dutch justice systems. We also speak with activist Yiğit Aksakoğlu, who spent months in detention in Turkey on suspicion of preparing a violent coup attempt. This program is produced in collaboration with Human, Autres Directions, and Aldus' producties. See the privacy policy at https://art19.com/privacy and the California privacy notice at https://art19.com/privacy#do-not-sell-my-info.

Mevzu Fener
Samsunspor Maçı ve Fenerbahçe Gündemi | Mevzu Fener

Mar 18, 2025 · 52:28


Our Fenerbahçe drew 0-0 at home against Reeder Samsunspor in week 28 of the Trendyol Süper Lig. On Mevzu Fener, the show we launched with the ambition of making it the flagship program on Fenerbahçe, Doğukan, Emre, Kaan, and Yiğit are with you every week. You can also follow Mevzu Fener on our other social media accounts.
X - https://x.com/mevzufener1907
Instagram - https://bit.ly/4digAEz
Podcast accounts:
Spotify - https://bit.ly/4cEYA7l
Apple Podcasts - https://bit.ly/462D21O
Mail - mevzufener1907@gmail.com
#fenerbahçe #samsunspor #süperlig

No mires por la ventana
No mires bajo la cama. Relatos para no dormir.

Mar 7, 2025 · 93:56


Yihjan, the magical voice behind Los Entes Bajo la Cama, shares with us her personal history with the supernatural. Since her childhood, a series of inexplicable events awakened in her a deep curiosity about the occult. Raised in an environment that allowed her to explore and choose her own spiritual path, she gradually formed her own judgment, finding in the least expected places stories that defy our understanding. Yet one presence has accompanied her always: a mysterious entity appears every night at the foot of her bed. It does not move. It does not speak. It only stares at her, as if waiting for something... In her search for answers, Yi delves into ancestral knowledge and the cosmogony of ancient civilizations. What she discovers changes her perception of reality: the entities that stalked her in the darkness were not mere specters, but deities with a purpose. Hidden messages, signs from beyond, and shocking revelations will mark her destiny.

Taking Responsibility for Torah
SOTAH 2025 Iyyun Shiur #4 AUDIO

Mar 5, 2025 · 90:09


Given 3/3/2025. Sponsored at YI by the Fistel family in memory of Eddie, lifelong learner and teacher.

Taking Responsibility for Torah
SOTAH 2025 Iyyun Shiur #4 VIDEO

Mar 5, 2025 · 90:09


Given 3/3/2025. Sponsored at YI by the Fistel family in memory of Eddie, lifelong learner and teacher.

Kerem Önder
Allah, üzülmeni istiyor? - Mektubat 140, 146 / Kerem Önder

Mar 2, 2025 · 37:10


This letter was written to Muhammad Ma'sum of Kabul. It states that those who love must bear hardship and sorrow:

"My brother who loves the poor! Those who carry love in their hearts must endure hardship and grief; those who choose the dervish path must grow accustomed to troubles and afflictions.

Persian verse: To love You is to taste grief and sorrow; otherwise, things that give comfort are many.

The Beloved wants the lover to grieve much, so that he may grow wholly cold toward, and cut off from, everything other than Him. The lover's rest lies in restlessness; what is sweetest to the lover is to burn for the Beloved. His calm is in agitation; his ease is in being wounded. To seek comfort on this path is to throw oneself into hardship. One must give one's whole being to the Beloved and accept gladly whatever comes from Him, taking the bitter and the sour without a frown. Such is life within love; be so as far as you are able! Otherwise slackness sets in. Your striving was good; while we hoped it would increase, it lessened. But do not grieve: if you pull yourself out of this pause, you will be better than before. Know that the very things that dragged you into this scatteredness will also be the means of your gathering yourself, and so your effort will grow. Peace be upon you."

146. "The letter of my son Sharafaddin Husayn has arrived. Praise be to Allah that you are honored with remembering the poor. Make your hours precious by performing your appointed task abundantly; do not let the opportunity slip away. Do not let passing glories and honors deceive you, nor the pleasures of the world deprive you of the true pleasures.

Persian verse: All I shall say to you is this: you are a child, and the road is fearful.

If Allah grants a servant repentance in his youth and protects him from breaking it, what a great blessing that is! I can say that beside this blessing all the blessings and pleasures of the world are like a single drop beside a great sea, for this blessing brings a person to Allah's good pleasure and love, which stands above all the blessings of this world and the next. In the fifteenth verse of Surah Al Imran and the seventy-third verse of Surah at-Tawbah it is declared: "Allah's good pleasure is the greater blessing." Peace be upon those on the right path and upon those honored with following Muhammad Mustafa (peace and blessings be upon him and his family)." (Rabbani)

One of the Companions related: One day the Prophet came upon us while we were laughing among ourselves. He said, "How can you laugh while Hell is at your backs? By Allah, let me not see you laughing!" and turned away. It was as if an eagle had perched on each of our heads. But a little later he came back to us with this good news: "Just now Jibril came and said to me: Allah Most High declares, 'Why do you cut off My servants' hope of My mercy? Tell My servants that I am forgiving and merciful, and that My punishment is also severe.'"

The common counsel of all the prophets: once you have lost your shame, do as you please. Goodness and wickedness are both contagious.

"Five things bar people from the way of truth: being untroubled by one's own ignorance, greed for the world, stinginess, hypocrisy in deeds, and admiring one's own opinion." (Ali, may Allah be pleased with him)

A Wahhabi wrote to me: "When you die I will never come to your funeral prayer." I never answer, but to this I wrote: "A thousand prophets will come to my funeral prayer; feel free to stay away."

"The sun's duty is to shine! The sun will not abandon its duty just because the bats are disturbed." (Shams-i Tabrizi)

We are so busy stoning the devil that we cannot perform the tawaf! Success is the best revenge. The brave man lives a thousand days; the opportunity comes but once. There is such a thing as barakah (blessing) in Islam: during the War of Independence the Greek population was 10 million and Turkey's was 10 million; in 2025 Greece is still 10 million while Turkey is 85 million. Go at your fears! Be aggressive and face them; strike hard. When a part of your body seizes up and aches, the masseur works that area firmly, loosens the muscle that has swollen, and the pain ends. "The property belongs to Allah," it says on the apartment building; below it hangs a "for sale by owner" sign! "Good tidings to him whose own faults keep him from seeing the faults of others." (Ajluni, Kashf al-Khafa, II, 46)

Procento Miloše Čermáka
Nechodím do restaurace jen kvůli jídlu. Pro mě je to divadelní představení, říká šéfkuchař Radek David (268)

Feb 26, 2025 · 101:51


We talked about what is and is not so-called fine dining, about the differences between Prague, London, and New York, and about what a Michelin star means for a chef: the first, the second, or even the third. "Of course I long to win a Michelin star. But I won't go mad if it doesn't happen," says Radek David, one of the most successful Czech chefs. For a long time he was the face of the well-known Prague restaurant Veranda and related establishments, such as Babiččina zahrada in Průhonice. For years he traveled to Kyiv, Ukraine, to build restaurants. Today he is a "freelance chef," or "flying chef," devoted to private cooking, cooking courses, and consulting. On his website, RadekDavid.cz, he writes about his experiences in the world's best restaurants as well as in places you have most likely never heard of. His idea of the perfect vacation? Visiting as many restaurants as possible, drawing inspiration, and telling others about it. The work fulfills him and he is content, yet he also knows, or hopes, that he will one day link his name with another establishment. We talked about all of this. I wish you pleasant listening and, as usual, I am adding a text generated from the podcast by YI (this time, for the first time, the new Claude 3.7 Sonnet).

"I love the chef's jacket. It's our kind of uniform," says chef Radek David. Miloš Čermák's conversation with Radek David offers a fascinating look behind the scenes of the culinary world, one that goes beyond ordinary talk about food and cooking. David, who after two decades in the legendary Prague establishments Babiččina zahrada and Veranda now works freelance, is marked by remarkable candor and an unmistakable passion for his craft. His new professional chapter is not merely a pragmatic career move but the expression of a deeper philosophy: a desire for genuine creative freedom and for sharing knowledge.

Notable is David's ambivalence toward the coveted Michelin star. On the one hand he clearly cares about the prestige; on the other he voices a fair critique of a system that awards the same star to sophisticated restaurants and to street-food stalls. "They should have designed it a bit differently, say a different-colored star for street food," he says, revealing a thoughtful relationship to his field.

Fascinating is the passage on the practices of top world restaurants that vet guests who have booked a table in advance. "They know who is coming," David says of his experience at New York's Per Se, where on his second visit they greeted him by name and laid out a notepad for him, because they remembered he had taken notes the first time.

The interview also draws an important contrast between Czech and foreign food culture. David's nostalgic description of Ukrainian markets smelling of "tomatoes, strawberries, peaches," set against the Czech system where "everything is picked underripe so it lasts a week on a ship or in a truck," invites reflection on the globalized supply of ingredients.

Running beneath the whole conversation is the theme of gastronomic criticism in the media. David and Čermák agree that serious restaurant reviews have declined in print, not for lack of space but because the process is so demanding. "You need someone who goes to the restaurant, spends two hours there, and then the paper pays the bill. That's why the genre has practically disappeared; the media simply don't have the money for it," Čermák explains.

Despite all the challenges and pressures of today's gastronomic world, David closes with a surprisingly optimistic outlook: in ten years the world will be better. After a penetrating probe into the relationship between society, economics, and food, it is a hopeful, refreshing final note. Something like dessert at the end of a many-course menu.

J. Brown Yoga Talks
Emily Smith - "Ins and Outs of Online Yoga Industry"

Feb 17, 2025 · 94:28


Emily Smith, founder of Yogaversity, talks with J about the former and current state of online yoga. They discuss her eight-year tenure as a producer for Yoga International and the transition from print to digital, first figuring out what a yoga class looks like online, Yogaglo and copyrighting camera angles, portals and pay structures, subscription overload, YI being sold to Gaia, horizontal vs. vertical growth, changing attention spans, set curricula and self-directed learning, the process of developing courses, and the beauty of remaining forever curious. To subscribe and support the show… GET PREMIUM. Check out J's other podcast… J. BROWN YOGA THOUGHTS.

Boş Yapma Enstitüsü
Kobra Gündem #164 - Suç, Beğen, Al

Feb 14, 2025 · 43:52


On the Kobras' agenda this week: the RTÜK chairman's statement about news programs; the ongoing earthquakes in the Aegean Sea and the measures taken; the Ministries of Agriculture and Health dealing with street animals; the new allegations against Ayşe Barım and what followed; Devlet Bahçeli announcing his recovery, and the arrest of astrologer Hilal Saraç; the acquittals of Mücella Yapıcı, Ali Hakan Altınay, and Yiğit Ali Ekmekçi; CHP Youth Branch chair Cem Aydın, for whom prosecutors seek five years in prison; the investigation launched over the CHP congress and Kemal Kılıçdaroğlu being summoned to testify; Özgür Özel, Ekrem İmamoğlu, and Mansur Yavaş meeting over dinner, what was discussed, and the decisions that came out of it; the arrested CHP municipal council members; the Republic gold coin surpassing the minimum wage; average apartment rents in Istanbul and Mehmet Şimşek's claim that electricity is sold to citizens cheaply; and Donald Trump, who barred trans athletes from women's competitions, banned paper straws, and announced he would not deport Prince Harry. In Ahmet Hakan's column: pieces titled "No Need for Pessimism, Mr. Mansur" and "I Am Going Ice Cold." In the President's column: the Malaysia trip and what happened there, the reply to Özgür Özel, and places where there is no justice. A brand-new episode covering all of the week's developments is out now!
To support the Kobras: http://kreosus.com/kobrakobrapodcast
Twitter: http://twitter.com/kobrapod
Instagram: http://instagram.com/kobrakobrapodcast

Taking Responsibility for Torah
SOTAH 2025 Iyyun Shiur #3 AUDIO

Taking Responsibility for Torah

Play Episode Listen Later Feb 13, 2025 82:20


Feb 10, 2025. Learning at YI this week is sponsored by the Cheses family in memory of their grandfather, Morris (Moshe ben Yehoshua) Sulman, whose yahrzeit is upcoming.

KPFA - APEX Express
APEX Express – 01.23.25 – Hmong Teen Dating Violence Awareness

KPFA - APEX Express

Play Episode Listen Later Jan 23, 2025 59:57


A weekly magazine-style radio show featuring the voices and stories of Asians and Pacific Islanders from all corners of our community. The show is produced by a collective of media makers, deejays, and activists. For this week's episode of APEX Express, we are joined by Yi Thoj and Belle Vang from Hmong Innovating Politics (HIP) and Pana Lee and Jennifer Xiong from California Hmong Advocates Network – Building Our Futures (CHAN-BOF), who will go into depth about these very tough but very real and needed conversations about abusive relationships, especially within the Hmong community, where 70% of Hmong Americans are under 24 years old.   Important Resources: Hmong Innovating Politics website California Hmong Advocates Network – Building Our Futures website Healthy vs. Unhealthy Relationships infographic How to Spot Abusive Relationships infographic Do you know someone in an abusive relationship? infographic Are you in an abusive relationship? infographic What does consent look like? infographic Transcript Cheryl: Good evening, everyone! You are tuned in to APEX Express. I'm your host, Cheryl, and tonight is an AACRE night. What is AACRE?, you might ask. Well, comprised of 11 grassroots social justice groups, the Asian Americans for Civil Rights and Equality (AACRE) network leverages the power of its network to focus on long-term movement building and support for Asian Americans and Pacific Islanders committed to social justice. Speaking of AACRE groups, APEX Express is proud to be a part of the AACRE network. For tonight's episode, we will be spotlighting the work of AACRE group Hmong Innovating Politics, also known as HIP. Belle Vang and Yi Thoj from HIP will be in conversation with Pana Lee and Jennifer Xiong from the California Hmong Advocates Network – Building Our Futures, also known as CHAN-BOF.
They'll be in discussion on the importance of teen dating violence awareness, especially in the Hmong community, as Hmong Americans are among the youngest of all ethnic groups in the United States, with about 70% under 24 years old. You might want to learn more about HIP and CHAN-BOF, so I'll let our speakers introduce themselves. And don't forget: all of their socials and websites will be linked in the show notes. Belle: Hi, everyone, thank you so much for making time in your night to join us. We really appreciate it. Today we're going to be having a panel discussion in recognition of Teen Dating Violence Awareness Month. I really want to thank CHAN-BOF for collaborating with Hmong Innovating Politics. We're very excited to do this collab together. We're going to do a brief introduction. So, hi, everyone. My name is Belle Gaonoucci Vang. I'm with Hmong Innovating Politics as the Communications and Narrative Manager. If you're not one of our followers, make sure to follow us. Hmong Innovating Politics is a grassroots organization focused on strengthening political power within Hmong communities through civic engagement. And with that being said, I'll go ahead and pull in one of our HIP members, Yi. Yi Thoj: Hi everyone, my name is Yi, I use she/her pronouns, and I've been a HIP young adult for around three to four years. I'm also working on the Bright Spots project. Belle: And then if we can have Pana join the conversation. Pana: Hi, everyone. I am Pana with CHAN-BOF. CHAN-BOF stands for California Hmong Advocates Network – Building Our Futures. We are a grassroots organization doing community engagement and outreach, and this past year we have been able to provide mobile direct services to our Hmong survivors of domestic violence across the Central Valley, from Sacramento to Fresno. Jennifer Xiong: All right. And that leaves me. Hi, everyone. My name is Jennifer Xiong.
I use she/her pronouns, and I work as a program specialist with CHAN-BOF, alongside Pana, who actually serves as my supervisor. I'm really excited and happy to be here, and really grateful to HIP for giving us the space, time, and platform to have this conversation. Belle: Thank you again, CHAN-BOF, for collaborating with us here at HIP. We really appreciate all the work y'all do in the community. I know y'all individually are really great folks. I'm really excited to dive into today's conversation. I'm asking everyone on the panel: in your experience, what are some cultural norms or expectations within the Hmong community regarding relationships and dating? That could be anything you'd like to share from your own personal experiences. Pana: I think I can go. Growing up in the eighties, there were cultural expectations for Hmong women. We were expected to just cook, clean, and take care of our younger siblings and our parents, right? So if you were dating, your relatives would look down on you. Dating was frowned upon. I remember it was expected that if a guy was interested in you, he would have to come by your parents' house and your parents would have to approve. I remember guys coming by during our teenage years, and my mom would have to be present. My parents were really strict. Their limit was they could only stay two hours. And so my mom would ask fast questions. If they didn't qualify, if they didn't meet expectations, they'd better be out ASAP. My parents were really, really strict. So those were the expectations back in the 80s. We weren't really allowed to date; during my younger days, that's what we had to go through. Yi Thoj: I feel like a lot of the gender expectations of my generation are still very much shaped by heteronormative and patriarchal norms and constructs.
I'm the youngest of 7 girls, so all of my 6 older sisters are fierce, and they're also wonderful, powerful women who have helped me navigate through a lot of the contentions that I held before interacting with romantic encounters and engagements. And so I think having that model definitely helped me navigate through my experiences as well. I feel like our parents are like, oh, if you want to engage in romantic encounters at a young age, that's welcome. But thankfully, they also didn't pressure us to do so. Jennifer Xiong: It's got me thinking about my own experiences, very little experiences, I might add. I think about some of the things my mom has said to me, which still stick around, kind of embedded in my mind, where she says, oh, [Jennifer speaks in Hmong], meaning when your partner is visiting or at our home, you guys shouldn't be in your bedrooms. You should be out in the living room, because that's really disrespectful. It invites negative perceptions about the person and about the relationship, and it is a form of disrespect toward the parents and the home. I've also felt and seen, from my older cousins or distant relatives who've gotten married, that I think it's centered a lot around saving face. I remember hearing stories about my cousins. If they had gone out and they came home late, for example, and the parents were extremely displeased or unhappy, they'd be like, no, you dishonored me and my daughter. You have to marry my daughter now because you took her home late, even if they didn't do anything salacious, so to speak. I'd hear those a lot. And for me, those were always scary. Like, oh my gosh, they would just do that! And you're a kid and you're growing up hearing these, and actually, I think I heard it more commonly than I expected: people marrying young because of the whole consequence of arriving home late from a date or a hangout. So those are some of my experiences, or what I heard and witnessed. Yeah.
Belle: Thank y'all for sharing. I love hearing about your experiences. It's really interesting how we all have different experiences, but it's still in the same realm of a very similar community, right? A very tight-knit community. I echo both Jennifer's and Yi's experience, where my parents are a little bit more lax, but at the same time, it's like, make sure you marry someone who's a quality person, right? I think that's really telling of how we see dating in the Hmong community. We don't date to date, right? We date to commit forever. And especially, I know all of us on this panel are women-identifying, and that can be a very dangerous tool, right? To date only to marry: you're willing to put up with a lot, even if it's not really what you want for yourself, because the way the culture shapes us is, if you are dating, you're only dating seriously. It's not to explore, not to be curious about yourself. And so I really appreciate the way that y'all frame it and the way that you share your experiences too. And I know we touched a little bit on this as well, but kind of gauging what it looks like to be in a healthy relationship: how would you say a healthy relationship is defined within the Hmong community? And what are qualities that you consider important for a positive and respectful relationship within the community? Pana: So y'all have heard the phrase [Pana speaks in Hmong], right? [Pana speaks in Hmong], right? [Pana speaks in Hmong]. We hear this over and over. I think even at my age, I've heard that. I'm pretty sure some of y'all have heard that too, even from my parents or friends or family, right? To me, what's considered positive in a relationship is really compromising and allowing you to have your own space, really meeting each other in the middle, trusting each other, having boundaries, appreciating each other, having that respect, right? Effective communication, being able to communicate with each other, and having empathy. Also consent.
Really having permission for something to happen, or agreement. Being able to agree to something, and being committed to your relationship. Jennifer Xiong: Yeah, I wanted to add and also share that I think a lot of the time, traditional expectations around what a healthy relationship looks like in the Hmong community generally entail being constricted and confined to pre-established roles that have been passed down for generations. But I think how we can further redefine that nowadays is to really think about everything that Pana has already listed and shared, right? I think it's important that those things, like healthy boundaries and having balance within a relationship, should be contextualized to the relationship. That's one, but also, I think it should be formed organically, which is difficult, and there will always be ongoing conversations about what a romantic commitment looks like and what that means for the exact couple. But I think it's important to have an ongoing conversation about it, and then also it's important to understand these layers: if the couple is both Hmong, it's important to put that in context, and then also, what if it's a multiracial or multiethnic relationship? I think that's also very important: understanding the values, and how these things can be formed organically as well. There are certain learned behaviors, beliefs, and attitudes that we pick up as we grow up, from the kinds of relationships and dynamics we witnessed while growing up and then from getting involved in our own romantic relationships with people. The things we witness and see can really shape the way we go into relationships and the way we show up as partners.
I really don't know how to define it within the Hmong community, but I will say that I have seen that when relationships and dating dynamics are built on a foundation of patriarchy, relating back to what Yi and Pana said, it can build really toxic, concerning, unhealthy relationship dynamics of power and control: not knowing how to allow your partner to have autonomy, or not recognizing that it's two different people coming together into a relationship. When power and control get mixed into a relationship, it can become really unhealthy and toxic. So I think it's also about unlearning those things and realizing that certain attitudes, behaviors, and beliefs don't serve in creating a healthy partnership or romantic relationship. Within the Hmong community, I've seen a lot of us unlearning those behaviors and attitudes that we may have witnessed and maybe even internalized growing up. To answer the second part of the question, what qualities are considered important for a positive, respectful relationship: I think it's really all that you both named. Those are important, like compromise, and y'all named so much other great stuff. I was also just cranking through things in my mind, but I just want to echo back what Yi and Pana said, and I'll leave it at that. Yi Thoj: What Jennifer just shared about what we witnessed growing up sparked something in my mind as well, about the media that we consumed growing up too. I watched a ton of Thai lakorns and Southeast Asian media growing up, and so much of the representation of love in there is so romanticized: that abuse is okay, that non-consensual engagement is okay. The media and the real-life relationships that are reflected and modeled throughout our lives hold such a big factor in how we view love growing into a young adult and further. I know it definitely impacted me, because I was always like, oh, I think that's what love is, right?
That's what it's showing on TV and things like that. Yeah, it definitely holds weight. Belle: Yeah, I love that you mentioned that, Yi. I didn't really seriously start dating until I was in college, and a lot of our generation grew up watching K-dramas. Like, oh so romantic, super rich boy is in love with super poor girl and he dictates her life and buys her everything, so romantic. And I tell my partner now that I'm married, if you ever do anything like in K-dramas, we are not messing around. That is not cool. I don't want you to decide anything for me. I don't want you to pretend like you're in the hospital just as a prank, you know, Boys Over Flowers. It's really interesting how love is framed growing up and how, just like you said, it's super romanticized. And you know how in K-dramas you feel that excitement? That's not necessarily love, right? That's just the thrill of being in something new, experiencing something different, but not necessarily love itself. And I really resonate with what you said earlier, Yi, about how it's really important to form those healthy boundaries organically. And it really closely ties to Pana's comment about being able to create a consensual relationship, and, just like Jennifer said, to dismantling the patriarchal foundation that we were built on. Belle: Unlearning those things is really hard too, because initially I thought that drama was what love was supposed to be, but love is supposed to be safe, supposed to protect you, make you feel like you belong, right? Because we do grow up in a society that perpetuates love in, honestly, a violent way. I also just kind of want to know y'all's thoughts: do you think there's enough awareness about dating violence within our communities, particularly the Hmong community? And how do you feel it's generally perceived or even discussed amongst one another? Pana: I actually think there's not much awareness happening in the Hmong community.
We really need to continue to bring more awareness. And it's awareness, prevention, intervention. We need to continue to do that. Some parents don't talk much to their youth about teen dating violence, what's healthy and what's not healthy, or what to actually look for in a relationship. In my household, I have only boys. And so we talk about safe sex, healthy boundaries, healthy relationships, what they would like to see in a relationship. I do this because I've had experience working in the domestic violence and sexual assault fields for a long time. And plus, that's something that I never got from my parents. So my goal was, from now on, when I have my kids, these are things that I'm going to teach them. And I kept that goal; it was something I told myself, that I promised myself I would do: to continue to teach my kids healthy boundaries, healthy relationships, and about dating violence. Most parents were taught when they were young: you're going to get married and just have a good life, have a good family. Yi Thoj: All points that are so valid and so true. There are generational gaps between the elders and ourselves, and myself. My parents are in their mid-60s. As much as I think I try to bridge that gap sometimes, I think youth just don't have the language to fully explain it to them. There's even the conversation about mental health and how romantic relationships are embedded in mental health, and even that in itself is a difficult conversation to start. More tangible resources to learn about communication, in terms of learning the Hmong language and whatnot, would definitely help with outreach and building awareness in the community.
But I think a lot of recent events have also shown me the reflection of the larger culture in our communities, which is why I would also like to provide some sort of affirmation for any youth who's watching this: these contentions and frictions within the community are never a reflection of you. You know, it's always a reflection of the larger culture and what is happening, and something that we all need to advocate for and invest in to change. Jennifer Xiong: Yeah. I agree that, bottom line, there isn't enough awareness about dating violence within the Hmong community on many different fronts, like Pana mentioned: the prevention piece and the intervention piece. How does someone recognize, or learn to recognize, signs of, I might be in a toxic, unhealthy relationship that is or can eventually lead into something that's violent? Or maybe I am in a current relationship where there is violence, but I don't know how to pick up on the signs and actually realize that, hey, I'm not in a safe place in this relationship, or in a safe relationship. And then if your loved ones or family members or friends are recognizing it from an outside perspective: we lack a lot of resources and information out there for our community to engage with, to learn how to intervene, or to recognize it among our loved ones and the people we care about if they may be in those types of dynamics and relationships. And then when we do recognize it, how do we step in and help? What do we do? How can we help? So, bottom line, there aren't enough resources out there. I think it's still really on the, I guess the loose term, up and up.
I really have a lot of faith and hope, and I've seen the work continue to expand and grow. Obviously CHAN-BOF is a part of that, along with so many other statewide organizations that are trying to build more resources and information and push it out there into our communities, so that they know this information, have access to it, and can tap into it with our youth and young adults, and maybe even with our older folks or generations. 'Cause I know Yi brought up a really great point too, in that there are different gaps, or different ways of understanding, in how to talk about dating violence within the Hmong community. Pana: Yeah, I remember my parents would tell me, [Pana speaks in Hmong] [Pana speaks in Hmong] [Pana speaks in Hmong], and I'm like, I never understood that. And so growing up, getting older, I kind of understood it. And again, they said the same thing. Me and my kids were sitting at the table and we were talking about healthy relationships and stuff. What do you look for? What would the relationship look like? What's healthy? And then again, my dad says, yeah, [Pana speaks in Hmong]. And my son was like, I don't understand that, Mom. It was just very generalized, and I had to correct that: this is what he means. My interpretation of what my dad said was: look for a healthy relationship. Get to know the person. Date them. Belle: I love that example, Pana, because growing up, everyone always told me that, and I took it at face value. You know how when we speak in Hmong, it's like poetry, right? But growing up I took that at face value, thinking, when you grow up, make sure you marry someone who has power, who has a good reputation in the community. And then as I got older, my mom was like, that's never what I was telling you. Jennifer Xiong: I was just telling you, marry someone who makes you happy. And I was like, oh, how come you didn't just say it that way?
Then you put it in a way where I was like, oh man, I have to make sure I marry someone who brings honor to my family, right? What a Mulan way of thinking. But I feel like that's always how I really perceived dating. And tying into how Hmong is very much like poetry in our communities, I really liked Yi's comment earlier about how there's not really a lot of terminology in our community, even for awareness about mental health. It's very much how medical terms have only really come to fruition in our community within the past 50 years. We don't have terms that we can use for mental health or dating violence; the only thing we can use is sick. That's pretty much what you say when you talk about mental health. You just say, basically, you have a sickness in your head, but there aren't actual terms. When we talk about diabetes, it's [Jennifer speaks in Hmong], which literally translates to sweet blood, or blood that is sweet. I hope to see, and I don't want to wait 50 years, I hope in the next 20 years there is verbiage that can help the community decipher and break down and bring more awareness to the violence that's being perpetrated in our communities as well. Belle: I love this conversation. I really love that you gave examples with your sons, and it really shows how we think intergenerationally. We all think so differently, and even though we have good intentions, it doesn't get translated across the board. I kind of want to elaborate a little more on what we talked about: how it's really important to have consent when it comes to dating, and how you really teach your sons that. Would you mind elaborating a little bit more about what consent looks like when it comes to dating, from your perspective, and how you see it within our communities as well? Pana: Have y'all seen the little video about drinking tea? Sometimes you can drink the tea and you're like, I don't want to drink it anymore.
You know, and so you can change at any moment, right? And being able to understand, okay, this person might not want to, so I need to be able to give that respect and step away, right? And so, getting them to understand that. So if y'all have watched that video, the tea consent video: it's really cute, and it's really good for the youth, even for the kids. They understand it real quick. In a relationship, you should be able to give them that space and say, okay, I get it. If someone says no, then no means no. And if their body gestures are like they're pushing back, that means no. If my face is looking like, I'm shaking my head, or you can see in my eyes, like, I don't like you, stay away, right? And so, being able to understand that. Jennifer Xiong: I think one thing I want to add to that, which is great (the tea consent video is super amazing at just easily explaining the understanding of consent), is also when someone can't consent, like when they can't answer yes or no. For example, they're at a party and they've passed out drunk. They're just not conscious and awake, and they can't answer yes or no, decline or accept. That also is not an invitation or permission. That is not consent, basically. When a person can't answer, it's definitively no, because they're not consciously aware and awake enough to give that response. So I think that is also something I wanted to add. Yi Thoj: Yeah, I don't have much to add to this question. I've never seen the tea consent video, but putting that into perspective, that is such a great analogy, a wonderful example, and an easy way to explain that things can change right in the middle of an interaction. Also, I just want to give admiration to Pana as well for opening up these conversations with your sons, because I think that's so important, especially for younger men or Hmong youth who are male-identified.
A lot of the time, their influences are other male figures in their lives who may not be the best role models. And so I'm totally leaning in toward the Hmong women leaders in people's lives, especially Hmong youth's lives, and just really loving that. Belle: I love that affirmation. We are, right now, a room of powerful women in our community. So I really want to double up on that and echo Yi's statement as well. Cheryl: You are currently tuned in to APEX Express on 94.1 KPFA and 88.1 KFCF. You have so far been listening to Belle Vang and Yi Thoj from Hmong Innovating Politics, also known as HIP, and Pana Lee and Jennifer Xiong from California Hmong Advocates Network – Building Our Futures (CHAN-BOF). We are going to take a quick music break, but don't go anywhere. More on breaking the silence about teen dating violence awareness in the Hmong community after our break. Welcome back. You are tuned in to APEX Express on 94.1 KPFA, 88.1 KFCF in Fresno, and online at kpfa.org. You were just listening to a track off of the Anakbayan LB May Day mixtape called "Letter to Mom" by Shining Sons. Anakbayan LB is a Filipino youth and student organization based in Long Beach, California, working to arouse, organize, and mobilize the community to address issues that impact Filipinos in the U.S. and in the Philippines. Now, back to the show. We are here with Belle Vang and Yi Thoj from Hmong Innovating Politics (HIP) and Pana Lee and Jennifer Xiong from California Hmong Advocates Network – Building Our Futures (CHAN-BOF). We're talking about teen dating violence awareness and its impacts and implications in the Hmong community. Belle: Jennifer, you talked about patriarchy, and, Pana, you shared how you really try to shape your sons; because you also work in this field, y'all definitely do more eloquent work in addressing these issues. I want to dive more into what that looks like within our community and in our culture.
Do you feel there are specific cultural or community barriers that may prevent individuals, particularly Hmong individuals, from seeking help or disclosing incidents of dating violence? And what does that look like? Especially since I know CHAN-BOF does a lot of that direct work with clients. Pana: I think because we're so closely knit, that's a barrier too: being afraid of, okay, this person might know me. One example is that while growing up, I was taught men were more valuable than women. I think in our family, my parents really wanted a son, and they kept on trying and trying until, after they got 7 daughters, they finally got their son, right? And so we were told, you have to be patient because boys, [Pana speaks Hmong], and as a teenager, I was like, I guess I held no value. And also, keeping in mind, for a long time a lot of our culturally specific organizations were mainly run by Hmong men. Hmong men are the main ones who make the decisions. Jennifer Xiong: Some of those barriers are why they don't seek help or support. The other barrier that I experienced in high school is that I had a friend who was dating someone who was really abusive, verbally abusive, physically abusive. He sexually assaulted her. When she came to me, I was like, oh no, you need to go to your parents. The minute she told her parents, she was forced to marry him to save face. And so, watching what had happened to my friend made me feel like, if that happened to me and I went and told my parents (but this was back in my day, right?), I would be forced to get married. And at that time, I didn't know that that was not okay. If someone raped you and forced you, that is not okay, but I wasn't aware of that. She wasn't aware of that. And so, again, like we said, awareness needs to happen. Awareness and education. That was something I remembered for a long time, and I felt guilty and I felt bad because I didn't know where to send her to go for help.
I referred her back to her parents and said, yeah, your parents will help you, go for it. And that's what happened. That's one of the other barriers. Some of our parents are not very educated in this topic, and it's a topic that we don't talk about. I do want to add, there are still strong sentiments of victim blaming, shaming, disempowering. I've heard statements; or I will say, I was doing my research paper on DV in the Hmong community, and my sources were YouTube videos. And so, I found these videos of women speaking out about their experiences of DV. In this particular example, she was married, she was pregnant, and her husband was abusing her. So much so that he was dragging her down the stairs of their apartment building, and she mentioned her stomach was basically getting scraped. She was somehow able to escape his grasp and run to a neighbor and ask them to call law enforcement. And so law enforcement came and took away the husband, because they could visibly see what had gone on. Her mother-in-law had said to her, oh, [Jennifer speaks Hmong], meaning, oh, daughter-in-law, why did you call law enforcement and have them take away my son? It dawned on me how we perceive some of these dynamics and abuses when they happen in relationships. And again, the whole, why did you do that, instead of, are you okay? What happened to you? Why did they do that to you? Instead of really focusing on the wellness and safety of the person in a violent, abusive relationship. And to add to that, the terminology and the way we frame some of the resources out there: I remember a lot of the [Jennifer speaks in Hmong], the elders, would call DV shelters [Jennifer speaks in Hmong], right? The term, the explanation of it, is like, the place for runaway women or wives or mothers. But in fact, these shelters are meant to house and keep safe the individuals, women, and children who are experiencing abuse and violence in their relationships.
But then we use negative connotations and terminology to label them, and that brings a lot of shame and hesitation to seek out help. On top of that, among the mainstream resources available to those who are seeking help because they may be in an abusive, violent relationship, there's also a lack of culturally responsive resources and services to aid and assist our specific community members when they're out trying to get the help that they need. I've witnessed and heard that a lot from the clients that I directly support and assist. Like, oh, we've gone here, and then they mention not having a great experience, or being misunderstood, or not feeling safe or comfortable enough to talk about their experiences and get the resources and help that they need, because some of the agencies really lacked the cultural understanding and awareness, or the intersection of that, when it comes to dating violence or domestic violence in our own community. Yi Thoj: Yeah, these are all really great examples. Also unfortunate. I think, from my own experience with victims around me who have undergone a lot of these violences, what I've seen is that a lot of it is them recognizing that the harm being done to them is wrong, very much so, but they've also internalized and conditioned themselves to accept it as something that is normal and okay. Even if, a lot of the time, there's this back-and-forth resistance of wanting to remove themselves from the situation, at the same time they keep going back, and this is the cycle of abuse, right, and how it works. But one note that I would also like to make is that what I've also seen is that it's really, really important for male perpetrators, especially Hmong men, to have other Hmong men holding them accountable; that is what I've found to be true.
Because as much as Hmong women who are victims, and other Hmong women bystanders who want to advocate for these victims, try to stand up for them, these perpetrators, and also the culture, inherently do not change if people who are in power and have that privilege don't actively help dismantle it, too. So I think that's important to note. There's so much power in having women-led spaces and women's voices, because that's so important, but I also think there should be so much more work done by the cisgender male counterparts in our lives and in the community. Belle: Thank y'all for that. Your sentiment is so powerful, Yi, and it's very valid. A lot of times the folks that are leading this work are often the women in our communities. Like, that's just straight up facts, right? I attended a Boys and Men of Color conference, and one of the panelists said the one time that men have these spaces together is also when women created them. Right? As women, we build a lot of community for our community and, at the same time, don't get recognition for the work that is being done. So it's really important that those who do have power make sure that they use it correctly and support the minorities within their communities that need that extra support. The examples you provided, too, I felt were very powerful, but also very traumatizing. When I was listening to your story, when you were talking about how you advised your friend to go to their family and they were forced into marriage. I know that we are different generations, but I feel like I definitely have met folks who are my age who were still forced into that situation. Those cultural practices are still so normal and not unheard of. Like, there's not a complete cultural shift within one generation.
And I'm sure, when you witnessed that, that it was very traumatizing for you too, even though you were not the one immediately affected by it. It also shifted the way you saw community, the way you viewed culture itself. And you even expressed that you felt a lot of guilt and responsibility for that. It's really interesting that when those traumatizing, abusive relationships are happening to those folks, even as a third party you feel that trauma in other ways as well. You mentioned how the patriarchy affects our communities in that way. What is being done? What is being said to help heal our communities and work past these issues that are obviously very much rooted in our communities? I know we talked a little bit about the way cultural identity influences our communities, and we specifically talked about the Hmong community as well. I know we only have about 10 minutes left, and so I kind of just want to dive into, not necessarily solutions, but what are steps that we can take to make progressive action and change in our community? So in your opinion, what role can the Hmong community play in addressing and preventing this deep imbalance? And are there any community-led solutions that you feel could be effective within our community? Yi Thoj: Yeah, I think, as we've mentioned throughout the conversation, it's important to emphasize and highlight prevention work that can be done. And that is teaching the young boys and men, and also the older Hmong men in our lives. Because that is community, right? Folks who are directly within our circles, as well as people who we interact with. I think it's important to teach them very simple things that should already be fundamental, but unfortunately are not. Such as informed consent, and then also just normal consent.
I think, to echo what I just shared as well, having more male mentors who are very much progressive and radical in their work, and also centered in the actual, tangible dismantling of the culture and the harmful aspects of the system, is a really big part of it. The reason why I'm bringing this up is because, in my experience, younger men who still hold a lot of these traditionalist and violent behaviors and mentalities receive a lot of their mentorship from other male mentors in their lives, and also just from media consumption such as Andrew Tate and whatnot. A lot of folks in my own young adult experience very much religiously follow Andrew Tate, and I had believed that we were at a point in our progressive history where we had gone past that, but it's still very rampant in the community, and it's affecting the youth, and it's affecting how they interact with and also date Hmong women as well, assuming that this is a binary relationship. Pana: It's time to talk about it, supporting each other, talking about what a healthy relationship really is. And it doesn't have to just come from the school. For a long time, a lot of our parents depended on the school. Oh, they'll figure that out, right? It needs to come from everyone, every one of us. Even as a friend, as an individual, we all need to support in that piece, alongside supportive organizations such as CHAN-BOF and HIP, right? Continuously talking about this, bringing the awareness. If we're really uncomfortable talking about a certain topic, we need to talk about that and really address it. Getting to understand what's healthy and what's not healthy. What are the signs of an abusive relationship? I think if we really want change, change needs to happen especially with parents, and it comes from the youth too.
We want a better future for our youth, so I think we need to continue to address this and do a lot of prevention work, because we tend to deal with the crisis and forget about the prevention part. How do we prevent this stuff? One great example that I always use is: we're constantly jumping in to support people who are drowning, but we keep forgetting about what's happening on the other side of that river. Something's happening upstream, and it's the prevention education piece that we need to start doing and continue to do. Cheryl: We're going to take a quick music break, but don't go anywhere. Next up, you're going to be listening to “Cultural Worker” by Power Struggle. More on the ways we can work towards teen dating violence awareness in the Hmong community when we return. Cheryl: And we're back! You are tuned in to 94.1 KPFA, 88.1 KFCF in Fresno, and online at kpfa.org. You were just listening to “Cultural Worker” by Power Struggle, a Filipino beat rock music artist based in the Bay. We're currently here with Belle and Yi from Hmong Innovating Politics, HIP, and Jennifer and Pana from California Hmong Advocates Network, Building Our Futures, CHAN-BOF, as we discuss the ways we can address teen dating violence in the Hmong community. Jennifer Xiong: I'm gonna echo, I mean, both of you brought up the same points, but in really distinctive examples of your own, and I really appreciate that.
It is about really bolstering our community to be proactive and engaged and informed about this, and really equipping and building them up to be a part of this. It's not, oh, you know, only we can do this work as current, active advocates who've had previous, quote unquote, professional experience dealing with crises like this, or dealing with and directly supporting individuals who have gone or are going through this. Everyone is more than capable of being equipped with the knowledge and the ability to learn and understand this and be proactive about it in our community. It does lead a lot back to the whole prevention and intervention work and building up our youth and young adults. So, a side note: we did a lot of outreach and engagement work this past year, really putting it out in front of our community, in the Hmong community. And let me tell you, I was scared to do this, because I was like, oh my gosh, people are going to be bringing their pitchforks and torches, and they're going to come around and be like, who's this girl going on TV, talking about DV and providing resources and services for our community? Interestingly enough, I got so much of the opposite reaction and response. And I think, to me, that's really heartwarming. And it gives me a lot of hope, because I got so much positive affirmation and reinforcement and feedback from even the older generations in our community, and young folks too, saying, this is so needed. This is critical, important. I'm so glad you're out here. Or, how can we get involved? Even being like, I'm so happy that you guys are doing this work. And we really have a lot of faith, because so many of our younger generations are stepping up to do this sort of work.
So I think a large portion of the community, from what I've experienced, really recognizes how important and needed this work is, to implement and incorporate this into our community so they know and understand: hey, violence is not okay. Dating violence is not okay. Domestic violence is not okay. But what can we do? What do we do about it? And I think we're at that place where people are really curious and desiring to step up and do something about it. And again, I think that's what Pana and Yi mentioned. Belle: Thank you. I love those ideas on how the Hmong community can take action to change the violence that happens in our communities, right? I love dismantling the patriarchy and empowering our youth. I think that also really comes with, I know we didn't really touch on this, but, the 18-clan system. There really needs to be more, as you mentioned, women's leadership. We have a lot of women's leadership in our communities, but not within our 18-clan system. And why is that, right? And how do we convince them that we need women in those leadership roles within our communities to represent our communities? That also ties into the same thing with Jennifer, how we really want to empower youth. We should also have youth leadership, because right now the folks who are in those important seats are 60-plus and so disconnected from the reality that we're living in today. So, you know, I just really appreciate everything y'all brought to the table today. I know we only have a few minutes left. We talked a lot about youth empowerment and women's leadership. Since we're focusing on teen dating violence today, what is a tip or advice that you would have liked to receive as a teenager, now being a little bit more experienced with your relationships? And if you could say it really quick.
Any of the teenagers listening out here, perk your ears up: there's a lot of great advice in here, so make sure that you absorb it like a sponge. And I'll just go ahead and leave it at that. Pana: I think with me, it's okay to not be okay, right? It's okay to not be okay, and it's really okay to talk to someone. And really reach out for help and, you know, really understand that it's okay to say no, and we are all equal. Jennifer Xiong: For me, oh gosh, this is hard. First things first: I think my teen self would have loved to know that dating during your teen years is not a big deal. Like, it's okay. Don't feel like you're missing out or that there's something wrong with you if you aren't in a relationship while you're in your teen years. Really spend that time cherishing and valuing the time you have with yourself and getting to know yourself first, so that when you do get into a relationship, you know what you want, you know the values that you want in a relationship, the values you want to bring into a relationship, you know yourself. And also don't forget that you're worthy. You matter, you're important. And anyone who disrespects you or does not value your worth in a relationship more than likely isn't worth your time and isn't worth your tears. And so I think that's what I would have wanted to know. Yi Thoj: For mine, it's very specific: how I came to be with my current partner. It was through an intersection of events, with a lot of the things that we've already discussed today as well. And so I think what I would have wanted to know is that it's very difficult to try to empower and change the hearts and minds of people on the ground level. Even if you're going in headstrong, please treat yourself with grace in all of that. And then lean on your partner to help you navigate that. It's so important.
I think a lot of Hmong women and Hmong girls are taught to be hyper-individualistic and independent, and we need to teach that sometimes you can lean into your femininity. Sometimes you can lean on support from other people, and also from your partner. It's really important. C: Thank you. I love all the self-love in the room, and just really great advice on being gentle with yourself and recognizing that you are deserving of all the good things in life. I hope that everyone really takes that to heart, and it's just a friendly reminder to continue loving yourself in the process of loving others. Love is abundant; it's not a scarcity mindset. We are here to share our love, and that love should be shared with ourselves as well. We're going to wrap today up, and I just want to say thank you so much to Yi, Pana, and Jennifer for joining us, and thank you so much, CHAN-BOF, for collaborating with HIP for dating violence awareness month. We really appreciate all your effort and all the work you do in our communities. If you haven't already, audience, please make sure to follow and like HIP and CHAN-BOF so you can continue following the work that we do and support our endeavors as community members, because you are part of the change in our communities as well. Thank you all so much and have a good rest of your night. Thanks, everyone. Cheryl: And that's the end of our show. Learn more about the incredible work being done by Hmong Innovating Politics and CHAN-BOF by checking out our show notes. Also, HIP and CHAN-BOF worked together to create these really helpful infographics on themes of teen dating violence awareness, such as: what is consent? How do you know you're in an abusive relationship? How can you help someone who's in one? I found them to be really helpful, so I will also make sure to link those in the show notes as well. Cheryl Truong: Apex Express is produced by Miko Lee, Paige Chung, Jalena Keane-Lee, Preeti Mangala Shekar, Anuj Vaidya, Kiki Rivera, Swati Rayasam, Nate Tan, Hien Nguyen, Nikki Chan, and Cheryl Truong. Tonight's show was produced by me, Cheryl. Thanks to the team at KPFA for all of their support. And thank you for listening! The post APEX Express – 01.23.25 – Hmong Teen Dating Violence Awareness appeared first on KPFA.

New Books Network
Robin Visser, "Questioning Borders: Ecoliteratures of China and Taiwan" (Columbia UP, 2023)

New Books Network

Play Episode Listen Later Jan 18, 2025 59:05


Indigenous knowledge of local ecosystems often challenges settler-colonial cosmologies that naturalize resource extraction and the relocation of nomadic, hunting, foraging, or fishing peoples. Questioning Borders: Ecoliteratures of China and Taiwan (Columbia UP, 2023) explores recent ecoliterature by Han and non-Han Indigenous writers of China and Taiwan, analyzing relations among humans, animals, ecosystems, and the cosmos in search of alternative possibilities for creativity and consciousness. Informed by extensive field research, Robin Visser compares literary works by Bai, Bunun, Kazakh, Mongol, Tao, Tibetan, Uyghur, Wa, Yi, and Han Chinese writers set in Xinjiang, Tibet, Inner Mongolia, Southwest China, and Taiwan, sites of extensive development, migration, and climate change impacts. Visser contrasts the dominant Han Chinese cosmology of center and periphery that informs what she calls “Beijing Westerns” with Indigenous and hybridized ways of relating to the world that challenge borders, binaries, and hierarchies. By centering Indigenous cosmologies, this book aims to decolonize approaches to ecocriticism, comparative literature, and Chinese and Sinophone studies as well as to inspire new modes of sustainable flourishing in the Anthropocene. Robin Visser is professor and associate chair of the Department of Asian and Middle Eastern Studies at the University of North Carolina at Chapel Hill. She is the author of Cities Surround the Countryside: Urban Aesthetics in Postsocialist China (2010). Li-Ping Chen is a teaching fellow in the Department of East Asian Languages and Cultures at the University of Southern California. Her research interests include literary translingualism, diaspora, and nativism in Sinophone, inter-Asian, and transpacific contexts. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/new-books-network

Nuus
Swapo is selling Namibia to the Chinese: Mudge

Nuus

Play Episode Listen Later Jan 12, 2025 0:37


During his recent visit, China's minister of foreign affairs, Wang Yi, held talks with President Nangolo Mbumba about how China could assist Namibia's nuclear energy industry. Now a ship full of drought-relief food from China, promised by Yi, has arrived in Walvis Bay. It has sparked controversy, with many Namibians concerned that local farmers will be affected. Kosmos 94.1 News spoke with the Republican Party's president, Henk Mudge, who says China has an agenda, and that is to get its hands on the country's natural resources. Mudge has more.

Heretics by Woven Energy
#105 Xing Yi part 17 - Beng part 1

Heretics by Woven Energy

Play Episode Listen Later Jan 11, 2025 61:32


After a long break we are back with a patron-requested episode about Xing Yi's famous Beng method. What is Beng? Where does it come from? What does it have to do with the I Ching? These questions and others are answered in this episode.

Bible Reading Plan Podcast by VictoryPoint
Psalm 5 | Malachi, Flora, Peach and Yi

Bible Reading Plan Podcast by VictoryPoint

Play Episode Listen Later Jan 10, 2025 17:38


EPISODE 1070 It's Friday, January 10, and Malachi, Flora, Peach and Yi reflect on Psalm 5. For the full VP Bible Reading Plan, head to https://www.victorypoint.org/resources. For more on the context of today's passage check out the resources at https://bibleproject.com/explore/book-overviews. To find out more about VictoryPoint Church go to victorypoint.org. If you have comments on this episode or podcast send us an email at info@victorypoint.org. And be sure to subscribe to this podcast!

Bible Reading Plan Podcast by VictoryPoint
Psalm 4 | Malachi, Flora, Peach and Yi

Bible Reading Plan Podcast by VictoryPoint

Play Episode Listen Later Jan 9, 2025 17:38


EPISODE 1069 It's Thursday, January 9, and Malachi, Flora, Peach and Yi reflect on Psalm 4. For the full VP Bible Reading Plan, head to https://www.victorypoint.org/resources. For more on the context of today's passage check out the resources at https://bibleproject.com/explore/book-overviews. To find out more about VictoryPoint Church go to victorypoint.org. If you have comments on this episode or podcast send us an email at info@victorypoint.org. And be sure to subscribe to this podcast!

Living Change I Ching podcast
A reading for the year

Living Change I Ching podcast

Play Episode Listen Later Jan 9, 2025 36:33


This episode features a reading for the year ahead. Kalimah's question: "Please comment on the year ahead. What can I expect? And also, how best to work with the energies within and around me?" Yi answered with Hexagram 23, Stripping Away, changing at lines 1, 4, 5 and 6 to 17, Following. The meanings of the two hexagrams turned out to blend and harmonise in a very clear, satisfying way. Kalimah shared a reading on the podcast once before, a year ago, so we took a moment to review how that reading had worked out. (Here's the original episode.) I hope you enjoy this one - don't forget you can book your own free reading for the podcast here...

Bible Reading Plan Podcast by VictoryPoint
Psalm 3 | Malachi, Flora, Peach and Yi

Play Episode Listen Later Jan 8, 2025 19:14


EPISODE 1068 It's Wednesday, January 8, and Malachi, Flora, Peach and Yi reflect on Psalm 3. For the full VP Bible Reading Plan, head to https://www.victorypoint.org/resources. For more on the context of today's passage check out the resources at https://bibleproject.com/explore/book-overviews. To find out more about VictoryPoint Church go to victorypoint.org. If you have comments on this episode or podcast send us an email at info@victorypoint.org. And be sure to subscribe to this podcast!

Bible Reading Plan Podcast by VictoryPoint
Psalm 2 | Malachi, Flora, Peach and Yi

Play Episode Listen Later Jan 7, 2025 18:26


EPISODE 1067 It's Tuesday, January 7, and Malachi, Flora, Peach and Yi reflect on Psalm 2. For the full VP Bible Reading Plan, head to https://www.victorypoint.org/resources. For more on the context of today's passage check out the resources at https://bibleproject.com/explore/book-overviews. To find out more about VictoryPoint Church go to victorypoint.org. If you have comments on this episode or podcast send us an email at info@victorypoint.org. And be sure to subscribe to this podcast!

Bible Reading Plan Podcast by VictoryPoint
Psalm 1 | Malachi, Flora, Peach and Yi

Play Episode Listen Later Jan 6, 2025 16:04


EPISODE 1066 It's Monday, January 6, and Malachi, Flora, Peach and Yi reflect on Psalm 1. For the full VP Bible Reading Plan, head to https://www.victorypoint.org/resources. For more on the context of today's passage check out the resources at https://bibleproject.com/explore/book-overviews. To find out more about VictoryPoint Church go to victorypoint.org. If you have comments on this episode or podcast send us an email at info@victorypoint.org. And be sure to subscribe to this podcast!

Nuus
Chinese minister wants to strengthen China-Africa ties

Play Episode Listen Later Jan 6, 2025 0:38


The Chinese Minister of Foreign Affairs, Wang Yi, has arrived in Windhoek and met president-elect Netumbo Nandi-Ndaitwah. He will later hold talks with President Nangolo Mbumba at Swakopmund. Yi is on an official visit to Namibia until 7 January. According to the minister, he plans to strengthen relations between China and Africa. His translator delivered his speech.

Nuus
Yi's visit has strategic significance

Play Episode Listen Later Jan 4, 2025 0:58


The visit from tomorrow until Tuesday by Wang Yi, the Chinese Minister of Foreign Affairs, is a strategic move, especially given changing global and local politics. Yi arrives in the country tomorrow and is scheduled to hold talks with president-elect Netumbo Nandi-Ndaitwah in Windhoek and then to meet President Nangolo Mbumba at Swakopmund. Political analyst Professor Andre Duvenhage spoke to Kosmos 94.1 Nuus.

New Books Network
Book Chat: Home and Island Writing in "Bubble War" with Kao Yi-feng

Play Episode Listen Later Dec 31, 2024 43:14


In this episode, our host, Ti-han, invited a renowned Taiwanese sci-fi writer, Kao Yi-feng, to talk about his fictional writings. Yi-feng is known for his way of combining elements of fantasy and magical realism with specific “linguistic features” of Hakka. In our conversation, Yi-feng recounts how his background of living in a Hakka-speaking community influences his lifelong creativity, and how spatial and bodily movement also allows him to shape and reshape the sense of “home” in his own novels. The podcast interview focused in part on his work, Bubble War 泡沬戰爭 (2014), but it also links with how Yi-feng conceives Island Writing for Taiwan today! Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/new-books-network

Latent Space: The AI Engineer Podcast — CodeGen, Agents, Computer Vision, Data Science, AI UX and all things Software 3.0

Applications for the 2025 AI Engineer Summit are up, and you can save the date for AIE Singapore in April and AIE World's Fair 2025 in June. Happy new year, and thanks for 100 great episodes! Please let us know what you want to see/hear for the next 100!

Full YouTube Episode with Slides/Charts. Like and subscribe and hit that bell to get notifs!

Timestamps

* 00:00 Welcome to the 100th Episode!
* 00:19 Reflecting on the Journey
* 00:47 AI Engineering: The Rise and Impact
* 03:15 Latent Space Live and AI Conferences
* 09:44 The Competitive AI Landscape
* 21:45 Synthetic Data and Future Trends
* 35:53 Creative Writing with AI
* 36:12 Legal and Ethical Issues in AI
* 38:18 The Data War: GPU Poor vs. GPU Rich
* 39:12 The Rise of GPU Ultra Rich
* 40:47 Emerging Trends in AI Models
* 45:31 The Multi-Modality War
* 01:05:31 The Future of AI Benchmarks
* 01:13:17 Pionote and Frontier Models
* 01:13:47 Niche Models and Base Models
* 01:14:30 State Space Models and RWKV
* 01:15:48 Inference Race and Price Wars
* 01:22:16 Major AI Themes of the Year
* 01:22:48 AI Rewind: January to March
* 01:26:42 AI Rewind: April to June
* 01:33:12 AI Rewind: July to September
* 01:34:59 AI Rewind: October to December
* 01:39:53 Year-End Reflections and Predictions

Transcript

[00:00:00] Welcome to the 100th Episode![00:00:00] Alessio: Hey everyone, welcome to the Latent Space Podcast. This is Alessio, partner and CTO at Decibel Partners, and I'm joined by my co host Swyx for the 100th time today.[00:00:12] swyx: Yay, um, and we're so glad that, yeah, you know, everyone has, uh, followed us in this journey. How do you feel about it? 100 episodes.[00:00:19] Alessio: Yeah, I know.[00:00:19] Reflecting on the Journey[00:00:19] Alessio: Almost two years that we've been doing this. We've had four different studios. Uh, we've had a lot of changes. You know, we used to do this lightning round, when we first started, that we didn't like, and we tried to change the question.
The answer[00:00:32] swyx: was Cursor and Perplexity.[00:00:34] Alessio: Yeah, I love Midjourney. It's like, do you really not like anything else?[00:00:38] Alessio: Like what's, what's the unique thing? And I think, yeah, we, we've also had a lot more research driven content. You know, we had like Tri Dao, we had, you know, Jeremy Howard, we had more folks like that.[00:00:47] AI Engineering: The Rise and Impact[00:00:47] Alessio: I think we want to do more of that too in the new year, like having, uh, some of the Gemini folks, both on the research and the applied side.[00:00:54] Alessio: Yeah, but it's been a ton of fun. I think we both started, I wouldn't say as a joke, we were kind of like, Oh, we [00:01:00] should do a podcast. And I think we kind of caught the right wave, obviously. And I think your rise of the AI engineer posts just kind of got people to congregate, and then the AI engineer summit.[00:01:11] Alessio: And that's why when I look at our growth chart, it's kind of like a proxy for like the AI engineering industry as a whole, which is almost like, like, even if we don't do that much, we keep growing just because there's so many more AI engineers. So did you expect that growth or did you expect that would take longer for like the AI engineer thing to kind of like become, you know, everybody talks about it today.[00:01:32] swyx: So, the sign of that, that we have won is that Gartner puts it at the top of the hype curve right now. So Gartner has called the peak in AI engineering. I did not expect, um, to what level. I knew that I was correct when I called it because I did like two months of work going into that. But I didn't know, you know, how quickly it could happen, and obviously there's a chance that I could be wrong.[00:01:52] swyx: But I think, like, most people have come around to that concept. Hacker News hates it, which is a good sign.
But there's enough people that have defined it, you know, GitHub, when [00:02:00] they launched GitHub Models, which is the Hugging Face clone, they put AI engineers in the banner, like, above the fold, like, in big. So I think it's like kind of arrived as a meaningful and useful definition.[00:02:12] swyx: I think people are trying to figure out where the boundaries are. I think that was a lot of the quote unquote drama that happens behind the scenes at the World's Fair in June. Because I think there's a lot of doubt or questions about where ML engineering stops and AI engineering starts. That's a useful debate to be had.[00:02:29] swyx: In some sense, I actually anticipated that as well. So I intentionally did not put a firm definition there because most of the successful definitions are necessarily underspecified and it's actually useful to have different perspectives and you don't have to specify everything from the outset.[00:02:45] Alessio: Yeah, I was at um, AWS reInvent and the line to get into like the AI engineering talk, so to speak, which is, you know, applied AI and whatnot was like, there are like hundreds of people just in line to go in.[00:02:56] Alessio: I think that's kind of what enabled people, right? Which is what [00:03:00] you kind of talked about. It's like, Hey, look, you don't actually need a PhD, just, yeah, just use the model. And then maybe we'll talk about some of the blind spots that you get as an engineer with the earlier posts that we also had on the Substack.[00:03:11] Alessio: But yeah, it's been a heck of a heck of a two years.[00:03:14] swyx: Yeah.[00:03:15] Latent Space Live and AI Conferences[00:03:15] swyx: You know, I was, I was trying to view the conference as like, so NeurIPS is I think like 16, 17, 000 people. And the Latent Space Live event that we held there was 950 signups. I think. The AI world, the ML world is still very much research heavy.
And that's as it should be because ML is very much in a research phase.[00:03:34] swyx: But as we move this entire field into production, I think that ratio inverts into becoming more engineering heavy. So at least I think engineering should be on the same level, even if it's never as prestigious, like it'll always be low status because at the end of the day, you're manipulating APIs or whatever.[00:03:51] swyx: But yeah, wrapping GPTs, but there's going to be an increasing stack and an art to doing these, these things well. And I, you know, I [00:04:00] think that's what we're focusing on for the podcast, the conference and basically everything I do seems to make sense. And I think we'll, we'll talk about the trends here that apply.[00:04:09] swyx: It's, it's just very strange. So, like, there's a mix of, like, keeping on top of research while not being a researcher and then putting that research into production. So, like, people always ask me, like, why are you covering NeurIPS? Like, this is a ML research conference and I'm like, well, yeah, I mean, we're not going to, to like, understand everything or reproduce every single paper, but the stuff that is being found here is going to make it through into production at some point, you hope.[00:04:32] swyx: And then actually like when I talk to the researchers, they actually get very excited because they're like, oh, you guys are actually caring about how this goes into production and that's what they really really want. The measure of success is previously just peer review, right? Getting 7s and 8s on their, um, academic review conferences and stuff like citations is one metric, but money is a better metric.[00:04:51] Alessio: Money is a better metric. Yeah, and there were about 2200 people on the live stream or something like that. Yeah, yeah, 2200 on the live stream. So [00:05:00] I try my best to moderate, but it was a lot spicier in person with Jonathan and, and Dylan.
Yeah, that it was in the chat on YouTube.[00:05:06] swyx: I would say that I actually also created.[00:05:09] swyx: Latent Space Live in order to address flaws that are perceived in academic conferences. This is not NeurIPS specific, it's ICML, NeurIPS. Basically, it's very sort of oriented towards the PhD student, uh, market, job market, right? Like literally all, basically everyone's there to advertise their research and skills and get jobs.[00:05:28] swyx: And then obviously all the, the companies go there to hire them. And I think that's great for the individual researchers, but for people going there to get info is not great because you have to read between the lines, bring a ton of context in order to understand every single paper. So what is missing is effectively what I ended up doing, which is domain by domain, go through and recap the best of the year.[00:05:48] swyx: Survey the field. And there are, like NeurIPS had a, uh, I think ICML had a like a position paper track, NeurIPS added a benchmarks, uh, datasets track. These are ways in which to address that [00:06:00] issue. Uh, there's always workshops as well. Every, every conference has, you know, a last day of workshops and stuff that provide more of an overview.[00:06:06] swyx: But they're not specifically prompted to do so. And I think really, uh, organizing a conference is just about getting good speakers and giving them the correct prompts. And then they will just go and do that thing and they do a very good job of it. So I think Sarah did a fantastic job with the startups prompt.[00:06:21] swyx: I can't list everybody, but we did best of 2024 in startups, vision, open models. Post transformers, synthetic data, small models, and agents. And then the last one was the, uh, and then we also did a quick one on reasoning with Nathan Lambert. And then the last one, obviously, was the debate that people were very hyped about.[00:06:39] swyx: It was very awkward.
And I'm really, really thankful for Jonathan Frankle, basically, who stepped up to challenge Dylan. Because Dylan was like, yeah, I'll do it. But he was pro scaling. And I think everyone who is like in AI is pro scaling, right? So you need somebody who's ready to publicly say, no, we've hit a wall.[00:06:57] swyx: So that means you're saying Sam Altman's wrong. [00:07:00] You're saying, um, you know, everyone else is wrong. It helps that this was the day before Ilya went on, went up on stage and then said pre training has hit a wall. And data has hit a wall. So actually Jonathan ended up winning, and then Ilya supported that statement, and then Noam Brown on the last day further supported that statement as well.[00:07:17] swyx: So it's kind of interesting that I think the consensus kind of going in was that we're not done scaling, like you should believe in the bitter lesson. And then, four straight days in a row, you had Sepp Hochreiter, who is the creator of the LSTM, along with everyone's favorite OG in AI, which is Juergen Schmidhuber.[00:07:34] swyx: He said that, um, pre training has hit a wall, or like, we've run into a different kind of wall. And then we have, you know, Jonathan Frankle, Ilya, and then Noam Brown are all saying variations of the same thing, that we have hit some kind of wall in the status quo of what pre trained, scaling large pre trained models has looked like, and we need a new thing.[00:07:54] swyx: And obviously the new thing for people, some people are calling it inference time compute or test time [00:08:00] compute. I think the collective terminology has been inference time, and I think that makes sense because test time, calling it test, meaning, has a very pre trained bias, meaning that the only reason for running inference at all is to test your model.[00:08:11] swyx: That is not true. Right. Yeah. So, so, I quite agree that.
OpenAI seems to have adopted, or the community seems to have adopted this terminology of ITC instead of TTC. And that, that makes a lot of sense because like now we care about inference, even right down to compute optimality. Like I actually interviewed this author who recovered or reviewed the Chinchilla paper.[00:08:31] swyx: Chinchilla paper is compute optimal training, but what is not stated in there is it's pre trained compute optimal training. And once you start caring about inference, compute optimal training, you have a different scaling law. And in a way that we did not know last year.[00:08:45] Alessio: I wonder, because John is, he's also on the side of attention is all you need.[00:08:49] Alessio: Like he had the bet with Sasha. So I'm curious, like he doesn't believe in scaling, but he thinks the transformer, I wonder if he's still. So, so,[00:08:56] swyx: so he, obviously everything is nuanced and you know, I told him to play a character [00:09:00] for this debate, right? So he actually does. Yeah. He still, he still believes that we can scale more.[00:09:04] swyx: Uh, he just assumed the character to be very game for, for playing this debate. So even more kudos to him that he assumed a position that he didn't believe in and still won the debate.[00:09:16] Alessio: Get rekt, Dylan. Um, do you just want to quickly run through some of these things? Like, uh, Sarah's presentation, just the highlights.[00:09:24] swyx: Yeah, we can't go through everyone's slides, but I pulled out some things as a factor of, like, stuff that we were going to talk about. And we'll[00:09:30] Alessio: publish[00:09:31] swyx: the rest. Yeah, we'll publish on this feed the best of 2024 in those domains. And hopefully people can benefit from the work that our speakers have done.[00:09:39] swyx: But I think it's, uh, these are just good slides. 
And I've been, I've been looking for a sort of end of year recaps from, from people.[00:09:44] The Competitive AI Landscape[00:09:44] swyx: The field has progressed a lot. You know, I think the max ELO in 2023 on LMSys used to be 1200 for LMSys ELOs. And now everyone is at least at, uh, 1275 in their ELOs, and this is across Gemini, ChatGPT, [00:10:00] Grok, 01.[00:10:01] swyx: ai, which with their Yi-Large model, and Anthropic, of course. It's a very, very competitive race. There are multiple Frontier labs all racing, but there is a clear tier zero Frontier. And then there's like a tier one. It's like, I wish I had everything else. Tier zero is extremely competitive. It's effectively now three horse race between Gemini, uh, Anthropic and OpenAI.[00:10:21] swyx: I would say that people are still holding out a candle for XAI. XAI, I think, for some reason, because their API was very slow to roll out, is not included in these metrics. So it's actually quite hard to put on there. As someone who also does charts, XAI is continually snubbed because they don't work well with the benchmarking people.[00:10:42] swyx: Yeah, yeah, yeah. It's a little trivia for why XAI always gets ignored. The other thing is market share. So these are slides from Sarah. We have it up on the screen. It has gone from very heavily OpenAI. So we have some numbers and estimates. These are from Ramp. Estimates of OpenAI market share in [00:11:00] December 2023.[00:11:01] swyx: And this is basically, what is it, GPT being 95 percent of production traffic. And I think if you correlate that with stuff that we asked Harrison Chase on the LangChain episode, it was true. And then Claude 3 launched middle of this year. I think Claude 3 launched in March, Claude 3.5 Sonnet was in June ish.[00:11:23] swyx: And you can start seeing the market share shift towards, uh, towards Anthropic, uh, very, very aggressively. The more recent one is Gemini.
So if I scroll down a little bit, this is an even more recent dataset. So Ramp's dataset ends in September 2024. Gemini has basically launched a price war at the low end, uh, with Gemini Flash, uh, being basically free for personal use.[00:11:44] swyx: Like, I think people don't understand the free tier. It's something like a billion tokens per day. Unless you're trying to abuse it, you cannot really exhaust your free tier on Gemini. They're really trying to get you to use it. They know they're in like third place, um, fourth place, depending how you, how you count.[00:11:58] swyx: And so they're going after [00:12:00] the lower tier first, and then, you know, maybe the upper tier later, but yeah, Gemini Flash, according to OpenRouter, is now 50 percent of their OpenRouter requests. Obviously, these are the small requests. These are small, cheap requests that are mathematically going to be more.[00:12:15] swyx: The smart ones obviously are still going to OpenAI. But, you know, it's a very, very big shift in the market. Like basically 2022, 2023 going into 2024, OpenAI has gone from 95 market share to, yeah, reasonably somewhere between 50 to 75 market share.[00:12:29] Alessio: Yeah. I'm really curious how Ramp does the attribution to the model?[00:12:32] Alessio: If it's API, because I think it's all credit card spend. Well, but it's all, the credit card doesn't say maybe. Maybe the, maybe when they do expenses, they upload the PDF, but yeah, the, the Gemini one I think makes sense. I think that was one of my main 2024 takeaways that like. The best small model companies are the large labs, which is not something I would have thought that the open source kind of like long tail would be like the small model.[00:12:53] swyx: Yeah, different sizes of small models we're talking about here, right? Like so small model here for Gemini is 8B, [00:13:00] right? Uh, mini.
We don't know what the small model size is, but yeah, it's probably in the double digits or maybe single digits, but probably double digits. The open source community has kind of focused on the one to three B size.[00:13:11] swyx: Mm-hmm . Yeah. Maybe[00:13:12] swyx: zero, maybe 0.5 B uh, that's Moondream and that is small for you then, then that's great. It makes sense that we, we have a range for small now, which is like, may, maybe one to five B. Yeah. I'll even put that at, at, at the high end. And so this includes Gemma from Gemini as well. But also includes the Apple Foundation models, which I think Apple Foundation is 3B.[00:13:32] Alessio: Yeah. No, that's great. I mean, I think in the start small just meant cheap. I think today small is actually a more nuanced discussion, you know, that people weren't really having before.[00:13:43] swyx: Yeah, we can keep going. This is a slide that I slightly disagree with Sarah. She's pointing to the scale SEAL leaderboard. I think the researchers that I talked with at NeurIPS were kind of positive on this because basically you need private test [00:14:00] sets to prevent contamination.[00:14:02] swyx: And Scale is one of maybe three or four people this year that has really made an effort in doing a credible private test set leaderboard. Llama 405B does well compared to Gemini and GPT-4o. And I think that's good. I would say that. You know, it's good to have an open model that is that big, that does well on those metrics.[00:14:23] swyx: But anyone putting 405B in production will tell you, if you scroll down a little bit to the artificial analysis numbers, that it is very slow and very expensive to infer. Um, it doesn't even fit on like one node. of, uh, of H100s. Cerebras will be happy to tell you they can serve 405B on their super large chips.[00:14:42] swyx: But, um, you know, if you need to do anything custom to it, you're still kind of constrained. So, is 405B really that relevant?
Like, I think most people are basically saying that they only use 405B as a teacher model to distill down to something. Even Meta is doing it. So with Llama 3. [00:15:00] 3 launched, they only launched the 70B because they use 405B to distill the 70B.[00:15:03] swyx: So I don't know if like open source is keeping up. I think they're the, the open source industrial complex is very invested in telling you that the, if the gap is narrowing, I kind of disagree. I think that the gap is widening with O1. I think there are very, very smart people trying to narrow that gap and they should.[00:15:22] swyx: I really wish them success, but you cannot use a chart that is nearing 100 in your saturation chart. And look, the distance between open source and closed source is narrowing. Of course it's going to narrow because you're near 100. This is stupid. But in metrics that matter, is open source narrowing?[00:15:38] swyx: Probably not for O1 for a while. And it's really up to the open source guys to figure out if they can match O1 or not.[00:15:46] Alessio: I think inference time compute is bad for open source just because, you know, Zuck can donate the flops at training time, but he cannot donate the flops at inference time. So it's really hard to like actually keep up on that axis.[00:15:59] Alessio: Big, big business [00:16:00] model shift. So I don't know what that means for the GPU clouds. I don't know what that means for the hyperscalers, but obviously the big labs have a lot of advantage. Because, like, it's not a static artifact that you're putting the compute in. You're kind of doing that still, but then you're putting a lot of compute at inference too.[00:16:17] swyx: Yeah, yeah, yeah. Um, I mean, Llama4 will be reasoning oriented. We talked with Thomas Scialom. Um, kudos for getting that episode together. That was really nice. Good, well timed. Actually, I connected with the AI meta guy, uh, at NeurIPS, and, um, yeah, we're going to coordinate something for Llama4.
Yeah, yeah,[00:16:32] Alessio: and our friend, yeah.[00:16:33] Alessio: Clara Shih just joined to lead the business agent side. So I'm sure we'll have her on in the new year.[00:16:39] swyx: Yeah. So, um, my comment on, on the business model shift, this is super interesting. Apparently it is wide knowledge that OpenAI wanted more than 6.6 billion dollars for their fundraise. They wanted to raise, you know, higher, and they did not.[00:16:51] swyx: And what that means is basically like, it's very convenient that we're not getting GPT 5, which would have been a larger pre train. We should have a lot of upfront money. And [00:17:00] instead we're, we're converting fixed costs into variable costs, right. And passing it on effectively to the customer. And it's so much easier to take margin there because you can directly attribute it to like, Oh, you're using this more.[00:17:12] swyx: Therefore you, you pay more of the cost and I'll just slap a margin in there. So like that lets you control your growth margin and like tie your. Your spend, or your sort of inference spend, accordingly. And it's just really interesting to, that this change in the sort of inference paradigm has arrived exactly at the same time that the funding environment for pre training is effectively drying up, kind of.[00:17:36] swyx: I feel like maybe the VCs are very in tune with research anyway, so like, they would have noticed this, but, um, it's just interesting.[00:17:43] Alessio: Yeah, and I was looking back at our yearly recap of last year. Yeah. And the big thing was like the Mixtral price fights, you know, and I think now it's almost like there's nowhere to go, like, you know, Gemini Flash is like basically giving it away for free.[00:17:55] Alessio: So I think this is a good way for the labs to generate more revenue and pass down [00:18:00] some of the compute to the customer. I think they're going to[00:18:02] swyx: keep going. I think that 2, will come.[00:18:05] Alessio: Yeah, I know.
Totally. I mean, next year, the first thing I'm doing is signing up for Devin. Signing up for the Pro ChatGPT.[00:18:12] Alessio: Just to try. I just want to see what does it look like to spend a thousand dollars a month on AI?[00:18:17] swyx: Yes. Yes. I think if your, if your, your job is a, at least AI content creator or VC or, you know, someone who, whose job it is to stay on, stay on top of things, you should already be spending like a thousand dollars a month on, on stuff.[00:18:28] swyx: And then obviously easy to spend, hard to use. You have to actually use. The good thing is that actually Google lets you do a lot of stuff for free now. So like deep research. That they just launched. Uses a ton of inference and it's, it's free while it's in preview.[00:18:45] Alessio: Yeah. They need to put that in Lindy.[00:18:47] Alessio: I've been using Lindy lately. I've built a bunch of things once we had flow because I liked the new thing. It's pretty good. I even did a phone call assistant. Um, yeah, they just launched Lindy voice. Yeah, I think once [00:19:00] they get advanced voice mode like capability today, still like speech to text, you can kind of tell.[00:19:06] Alessio: Um, but it's good for like reservations and things like that. So I have a meeting prepper thing. And so[00:19:13] swyx: it's good. Okay. I feel like we've, we've covered a lot of stuff. Uh, I, yeah, I, you know, I think we will go over the individual, uh, talks in a separate episode. Uh, I don't want to take too much time with, uh, this stuff, but that suffice to say that there is a lot of progress in each field.[00:19:28] swyx: Uh, we covered vision. Basically this is all like the audience voting for what they wanted. And then I just invited the best people I could find in each audience, especially agents. Um, Graham, who I talked to at ICML in Vienna, he is currently still number one.
It's very hard to stay on top of SWE-Bench.[00:19:45] swyx: OpenHands is currently still number one. SWE-Bench Full, which is the hardest one. He had very good thoughts on agents, which I, which I'll highlight for people. Everyone is saying 2025 is the year of agents, just like they said last year. And, uh, but he had [00:20:00] thoughts on like eight parts of what are the frontier problems to solve in agents.[00:20:03] swyx: And so I'll highlight that talk as well.[00:20:05] Alessio: Yeah. The number six, which is having agents learn more about the environment, has been super interesting to us as well, just to think through, because, yeah, how do you put an agent in an enterprise where most things in an enterprise have never been public, you know, a lot of the tooling, like the code bases and things like that.[00:20:23] Alessio: So, yeah, there's not indexing and RAG. Well, yeah, but it's more like. You can't really RAG things that are not documented. But people know them based on how they've been doing it. You know, so I think there's almost this like, you know, Oh, institutional knowledge. Yeah, the boring word is kind of like a business process extraction.[00:20:38] Alessio: Yeah yeah, I see. It's like, how do you actually understand how these things are done? I see. Um, and I think today the, the problem is that, Yeah, the agents are, that most people are building are good at following instruction, but are not as good as like extracting them from you. Um, so I think that will be a big unlock just to touch quickly on the Jeff Dean thing.[00:20:55] Alessio: I thought it was pretty, I mean, we'll link it in the, in the things, but. I think the main [00:21:00] focus was like, how do you use ML to optimize the systems instead of just focusing on ML to do something else? Yeah, I think speculative decoding, we had, you know, Eugene from RWKV on the podcast before, like he's doing a lot of that with Featherless AI.
I would say it's the norm. I'm a little bit uncomfortable with how much it costs, because it does use more of the GPU per call. But because everyone is so keen on fast inference, then yeah, it makes sense.[00:21:24] Alessio: Exactly. Um, yeah, but we'll link that. Obviously Jeff is great.[00:21:30] swyx: Jeff is, Jeff's talk was more, it wasn't focused on Gemini.[00:21:33] swyx: I think people got the wrong impression from my tweet. It's more about how Google approaches ML and uses ML to design systems, and then systems feed back into ML. And I think this ties in with Lubna's talk[00:21:45] Synthetic Data and Future Trends[00:21:45] swyx: on synthetic data, where it's basically the story of bootstrapping of humans and AI in AI research or AI in production.[00:21:53] swyx: So her talk was on synthetic data, where like how much synthetic data has grown in 2024 on the pre-training side, the post-training side, [00:22:00] and the eval side. And I think Jeff then also extended it basically to chips, uh, to chip design. So he spent a lot of time talking about AlphaChip. And most of us in the audience are like, we're not working on hardware, man.[00:22:11] swyx: Like you guys are great. TPU is great. Okay. We'll buy TPUs.[00:22:14] Alessio: And then there was the earlier talk. Yeah. But, and then we have, uh, I don't know if we're calling them essays. What are we calling these? But[00:22:23] swyx: for me, it's just like a bonus for Latent Space supporters, because I feel like they haven't been getting anything.[00:22:29] swyx: And then I wanted a higher-frequency way to write stuff. Like that one I wrote in an afternoon. I think basically we now have an answer to what Ilya saw. It's one year since the blip. And we know what he saw in 2014. We know what he said in 2024. We think we know what he sees in 2025.
He gave some hints, and then we have vague indications of what he saw in 2023.[00:22:54] swyx: So that was the, oh, and then 2016 as well, because of this lawsuit with Elon. OpenAI [00:23:00] is publishing emails from Sam, like, his personal text messages to Shivon Zilis, or whatever. So, like, we have emails from Ilya saying, this is what we're seeing in OpenAI, and this is why we need to scale up GPUs. And I think it's very prescient in 2016 to write that.[00:23:16] swyx: And so, like, it is exactly, like, basically his insights. It's him and Greg, basically just kind of driving the scaling up of OpenAI, while they're still playing Dota. They're like, no, like, we see the path here.[00:23:30] Alessio: Yeah, and it's funny, yeah, they even mention, you know, we can only train on 1v1 Dota. We need to train on 5v5, and that takes too many GPUs.[00:23:37] Alessio: Yeah,[00:23:37] swyx: and at least for me, I can speak for myself, like, I didn't see the path from Dota to where we are today. I think even, maybe if you ask them, like, they wouldn't necessarily draw a straight line. Yeah,[00:23:47] Alessio: no, definitely. But I think like that was like the whole idea of almost like the RL, and we talked about this with Nathan on his podcast.[00:23:55] Alessio: It's like with RL, you can get very good at specific things, but then you can't really generalize as much. And I [00:24:00] think the language models are like the opposite, which is like, you're going to throw all this data at them and scale them up, but then you really need to drive them home on a specific task later on.[00:24:08] Alessio: And we'll talk about the OpenAI reinforcement fine-tuning announcement too, and all of that. But yeah, I think like scale is all you need. That's kind of what Ilya will be remembered for. And I think just maybe to clarify on like the pre-training-is-over thing that people love to tweet.
I think the point of the talk was like, everybody, we're scaling these chips, we're scaling the compute, but like the second ingredient, which is data, is not scaling at the same rate.[00:24:35] Alessio: So it's not necessarily that pre-training is over. It's kind of like, what got us here won't get us there. In his email, he predicted like 10x growth every two years or something like that. And I think maybe now it's like, you know, you can 10x the chips again, but[00:24:49] swyx: I think it's 10x per year. Was it? I don't know.[00:24:52] Alessio: Exactly. And Moore's law is like 2x. So it's like, you know, much faster than that. And yeah, I like the fossil fuel of AI [00:25:00] analogy. It's kind of like, you know, the little background tokens thing. So the OpenAI reinforcement fine-tuning is basically like, instead of fine-tuning on data, you fine-tune on a reward model.[00:25:09] Alessio: So it's basically like, instead of being data driven, it's task driven. And I think people have tasks to do, they don't really have a lot of data. So I'm curious to see how that changes, how many people fine-tune, because I think this is what people run into. It's like, oh, you can fine-tune Llama. And it's like, okay, where do I get the data[00:25:27] Alessio: to fine-tune it on, you know? So it's great that we're moving the thing. And then I really liked that he had this chart where, you know, the brain mass and body mass thing is basically like, mammals scaled linearly by brain and body size, and then humans kind of broke off the slope. So it's almost like maybe the mammal slope is like the pre-training slope,[00:25:46] Alessio: and then the post-training slope is like the human one.[00:25:49] swyx: Yeah. I wonder what the, I mean, we'll know in 10 years, but I wonder what the y axis is for Ilya's SSI. We'll try to get them on.[00:25:57] Alessio: Ilya, if you're listening, you're [00:26:00] welcome here.
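The growth-rate comparison above (10x per year versus the classic Moore's-law 2x every two years) is easy to make concrete with a bit of compound-growth arithmetic, here over an illustrative four-year window:

```python
# Compound growth over 4 years: 10x per year vs ~2x every two years.
years = 4
compute_growth = 10 ** years       # 10x/year -> 10^4
moores_law = 2 ** (years / 2)      # 2x per 2 years -> 2^2
print(compute_growth, moores_law)  # prints: 10000 4.0
```

So at the rates quoted in the conversation, compute spend would outrun transistor-density scaling by more than three orders of magnitude over four years, which is the crux of the "data is the fossil fuel" bottleneck argument.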
Yeah, and then he had, you know, what comes next: agents, synthetic data, inference compute. I thought all of that was like that.[00:26:05] Alessio: I don't[00:26:05] swyx: think he was dropping any alpha there. Yeah, yeah, yeah.[00:26:07] Alessio: Yeah. Any other NeurIPS highlights?[00:26:10] swyx: I think that there was comparatively a lot more work. Oh, by the way, I need to plug that, uh, my friend Yi made this, like, nice little paper. Yeah, that was really[00:26:20] swyx: nice.[00:26:20] swyx: Uh, of, uh, of, like, all the, she called it must-read papers of 2024.[00:26:26] swyx: So I laid out some of these at NeurIPS, and it was just gone. Like, everyone just picked it up. Because people are dying for, like, a little guidance and visualizations. And so, uh, I thought it was really super nice that we got there.[00:26:38] Alessio: Should we do a Latent Space book for each year? Uh, I thought about it. For each year we should.[00:26:42] Alessio: Coffee table book. Yeah. Yeah. Okay. Put it in the will. Hi, Will. By the way, we haven't introduced you. He's our new, you know, generalist, our Jamie. You need to[00:26:52] swyx: pull up more things. One thing I saw that, uh, okay, one fun one, and then one [00:27:00] more general one. So the fun one is this paper on agent collusion. This is a paper on steganography.[00:27:06] swyx: This is secret collusion among AI agents, multi-agent deception via steganography. I tried to go to NeurIPS in order to find these kinds of papers, because, the real reason: like, NeurIPS this year has a lottery system. A lot of people actually even go and don't buy tickets, because they just go and attend the side events.[00:27:22] swyx: And then also the people who go end up crowding around the most popular papers, which you already know and already read before you showed up to NeurIPS. So the only reason you go there is to talk to the paper authors, but there's something like 10,000 others.
All these papers out there that, you know, are just people's work that they did on the side, and they failed to get attention for one reason or another.[00:27:42] swyx: And this was one of them. Uh, it was like all the way at the back. And this is a DeepMind paper that actually focuses on collusion between AI agents, uh, by hiding messages in the text that they generate. Uh, so that's what steganography is. So a very simple example would be the first letter of every word.[00:27:57] swyx: If you pick that out, you know, then the code sends a [00:28:00] different message than the text. But something I've always emphasized is that to LLMs, we read left to right. LLMs can read up, down, sideways, you know, in random character order. And it's the same to them as it is to us. So if we were ever to get, you know, self-motivated, unaligned LLMs that were trying to collaborate to take over the planet,[00:28:19] swyx: this would be how they do it. They spread messages among us in the messages that we generate. And he developed a scaling law for that. So he marked, I'm showing it on screen right now, the emergence of this phenomenon. Basically, for example, for cipher encoding: GPT-2, Llama 2, Mixtral, GPT-3.5, zero capabilities, and suddenly GPT-4.[00:28:40] swyx: And this is the kind of Jason Wei type emergence property that people look for. I think what made this paper stand out as well: so he developed the benchmark for steganography collusion, and he also focused on Schelling point collusion, which is very low coordination. For agreeing on a decoding/encoding format, you kind of need to have some [00:29:00] agreement on that.[00:29:00] swyx: But Schelling point means very, very low or almost no coordination. So for example, if the only message I give you is "meet me in New York" and you're not aware of anything else, you would probably meet me at Grand Central Station.
That is, the Grand Central Station is a Schelling point.[00:29:16] swyx: And it's probably somewhere, somewhere during the day. That is the Schelling point of New York, is Grand Central. To that extent, Schelling points for steganography are things like the common decoding methods that we talked about. It will be interesting at some point in the future when we are worried about alignment.[00:29:30] swyx: It is not interesting today, but it's interesting that DeepMind is already thinking about this.[00:29:36] Alessio: I think that's like one of the hardest things about NeurIPS. It's like the long tail. I[00:29:41] swyx: found a pricing guy. I'm going to feature him on the podcast. Basically, this guy from NVIDIA worked out the optimal pricing for language models.[00:29:51] swyx: It's basically an econometrics paper at NeurIPS, where everyone else is talking about GPUs. And the guy with the GPUs is[00:29:57] Alessio: talking[00:29:57] swyx: about economics instead. [00:30:00] That was the sort of fun one. So the focus I saw is that model papers at NeurIPS are kind of dead. No one really presents models anymore. It's just datasets.[00:30:12] swyx: This is all the grad students are working on. So like, there was a datasets track, and then I was looking around like, you don't need a datasets track, because every paper is a datasets paper. And so datasets and benchmarks, they're kind of flip sides of the same thing. So yeah. Cool. Yeah, if you're a grad student, you're GPU poor, you kind of work on that.[00:30:30] swyx: And then the sort of big model people walk around and pick the ones that they like, and then they use them in their models. And that's kind of how it develops. I feel like, um, like last year, you had people like Haotian, who worked on LLaVA, which is: take Llama and add vision.[00:30:47] swyx: And then obviously actually I hired him and he added vision to Grok. Now he's the vision Grok guy.
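Going back to the steganography paper for a moment: the first-letter-of-every-word scheme swyx describes above is easy to sketch. The word table below is invented for illustration; in the paper's setting, an LLM would generate fluent cover text subject to the first-letter constraint rather than looking words up:

```python
# Toy acrostic steganography: hide a secret in the initials of a cover text.
# The word list is a made-up example, not the paper's actual method.

def encode(secret: str, words_by_letter: dict[str, str]) -> str:
    """Build a cover text whose word initials spell out the secret."""
    return " ".join(words_by_letter[c] for c in secret.lower())

def decode(cover: str) -> str:
    """Recover the secret by reading the first letter of every word."""
    return "".join(word[0] for word in cover.split())

words = {"h": "have", "i": "insight"}
cover = encode("hi", words)
print(cover)          # prints: have insight
print(decode(cover))  # prints: hi
```

The Schelling-point angle is that `decode` here is a "common decoding method" two agents could converge on with no prior agreement, which is exactly what makes this failure mode worth benchmarking.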
This year, I don't think there was any of those.[00:30:55] Alessio: What were the most popular, like, orals? Last year it was like the [00:31:00] Monarch Mixer, I think, was like the most attended. Yeah, uh, I need to look it up. Yeah, I mean, if nothing comes to mind, that's also kind of an answer in a way.[00:31:10] Alessio: But I think last year there was a lot of interest in, like, furthering models and, like, different architectures and all of that.[00:31:16] swyx: I will say that I felt the oral picks this year were not very good. Either that, or maybe it's just me. So that's a highlight of how I have changed in terms of how I view papers.[00:31:29] swyx: So like, in my estimation, two of the best papers of this year for datasets were DataComp, and RefinedWeb or FineWeb. These are two actually industrially used papers, not highlighted for a while. I think DCLM got the spotlight, FineWeb didn't even get the spotlight. So like, it's just that the picks were different.[00:31:48] swyx: But one thing that does get a lot of play, that a lot of people are debating, is the schedule-free optimizer paper from Meta, from Aaron Defazio. And this [00:32:00] year in the ML community, there's been a lot of chat about Shampoo, SOAP, all the bathroom amenities for optimizing your learning rates.[00:32:08] swyx: And, uh, most people at the big labs who I asked about this, um, say that it's cute, but it's not something that matters. I don't know, but it's something that was discussed and very, very popular.[00:32:19] Alessio: 4 Wars of AI recap maybe, just quickly. Um, where do you want to start? Data?[00:32:26] swyx: So to remind people, this is the 4 Wars piece that we did as one of our earlier recaps of this year.[00:32:31] swyx: And the belligerents are on the left: journalists, writers, artists, anyone who owns IP, basically. New York Times, Stack Overflow, Reddit, Getty, Sarah Silverman, George R.R. Martin.
Yeah, and I think this year we can add Scarlett Johansson to that side of the fence. So anyone suing OpenAI, basically. I actually wanted to get a snapshot of all the lawsuits.[00:32:52] swyx: I'm sure some lawyer can do it. That's the data quality war. On the right hand side, we have the synthetic data people, and I think we talked about Lubna's talk, you know, [00:33:00] really showing how much synthetic data has come along this year. I think there was a bit of a fight between Scale AI and the synthetic data community, because Scale AI[00:33:09] swyx: published a paper saying that synthetic data doesn't work. Surprise, surprise, Scale AI is the leading vendor of non-synthetic data. Only[00:33:17] Alessio: cage-free annotated data is useful.[00:33:21] swyx: So I think there's some debate going on there, but I don't think it's much debate anymore that at least synthetic data, for the reasons that are blessed in Lubna's talk, makes sense.[00:33:32] swyx: I don't know if you have any perspectives there.[00:33:34] Alessio: I think, again, going back to the reinforcement fine-tuning, I think that will change a little bit how people think about it. I think today people mostly use synthetic data, yeah, for distillation and kind of like fine-tuning a smaller model from a larger model.[00:33:46] Alessio: I'm not super aware of how the frontier labs use it outside of, like, the Rephrasing the Web thing that Apple also did. But yeah, I think it'll be useful.
I think like whether or not that gets us the big [00:34:00] next step, I think that's maybe like TBD, you know. I think people love talking about data because it's like a GPU-poor thing, you know. I think, uh, synthetic data is something that people can do, you know, so they feel more opinionated about it compared to, yeah, the optimizers stuff, which is like,[00:34:17] swyx: they don't[00:34:17] Alessio: really work[00:34:18] swyx: on.[00:34:18] swyx: I think that there is an angle to the reasoning synthetic data. So this year, we covered in the paper club the STaR series of papers. So that's STaR, Quiet-STaR, V-STaR. It basically helps you to synthesize reasoning steps, or at least distill reasoning steps from a verifier. And if you look at the OpenAI RFT API that they released, or that they announced, basically they're asking you to submit graders, or they choose from a preset list of graders.[00:34:49] swyx: Basically it feels like a way to create valid synthetic data for them to fine-tune their reasoning paths on. Um, so I think that is another angle where it starts to make sense. And [00:35:00] so like, it's very funny that basically all the data quality wars between, let's say, the music industry or the newspaper publishing industry or the textbooks industry and the big labs,[00:35:11] swyx: it's all of the pre-training era. And then like the new era, like the reasoning era, like nobody has any problem with all the reasoning, especially because it's all sort of math and science oriented with, with very reasonable graders. I think the more interesting next step is how does it generalize beyond STEM?[00:35:27] swyx: We've been using O1 for a while, and I would say like for summarization and creative writing and instruction following, I think it's underrated. I started using O1 in our intro songs before we killed the intro songs, but it's very good at writing lyrics.
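The grader idea behind reinforcement fine-tuning described above is worth making concrete: instead of supplying labeled completions, you supply a scoring function, and training reinforces high-scoring outputs. The grader below is a made-up exact-match numeric scorer for illustration, not OpenAI's actual API or preset graders:

```python
# Hypothetical RFT-style grader: score a model completion against a
# reference answer. Training would sample completions, score them, and
# reinforce the reasoning paths that score well; here we only score.

def numeric_grader(completion: str, reference: float, tol: float = 1e-6) -> float:
    """Return 1.0 if the completion parses to the right number, else 0.0."""
    try:
        value = float(completion.strip())
    except ValueError:
        return 0.0  # unparseable answers score zero
    return 1.0 if abs(value - reference) <= tol else 0.0

samples = ["42", "41.5", "not a number"]
scores = [numeric_grader(s, 42.0) for s in samples]
print(scores)  # prints: [1.0, 0.0, 0.0]
```

This also shows why math-and-science tasks came first: a one-line grader like this is unambiguous, whereas grading summarization or creative writing needs a learned or rubric-based judge.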
You know, I can actually say, like, I think one of the O1 Pro demos,[00:35:46] swyx: all of these things that Noam was showing, was that, you know, you can write an entire paragraph or three paragraphs without using the letter A, right?[00:35:53] Creative Writing with AI[00:35:53] swyx: So like, literally just anything, like not even token level, character level manipulation and [00:36:00] counting and instruction following. It's, uh, it's very, very strong.[00:36:02] swyx: And so no surprises when I ask it to rhyme, uh, and to create song lyrics, it's going to do that very much better than previous models. So I think it's underrated for creative writing.[00:36:11] Alessio: Yeah.[00:36:12] Legal and Ethical Issues in AI[00:36:12] Alessio: What do you think is the rationale that they're going to have in court when they don't show you the thinking traces of O1, but then they want us to, like, they're getting sued for using other publishers' data, you know, but then on their end, they're like, well, you shouldn't be using my data to then train your model.[00:36:29] Alessio: So I'm curious to see how that kind of comes out. Yeah, I mean, OpenAI has[00:36:32] swyx: many ways to punish people without taking them to court. They already banned ByteDance for distilling their info. And so anyone caught distilling the chain of thought will be just disallowed to continue on the API.[00:36:44] swyx: And it's fine. It's no big deal. Like, I don't even think that's an issue at all, just because the chains of thought are pretty well hidden. Like, you have to work very, very hard to get it to leak. And then even when it leaks the chain of thought, you don't know if it's real.[00:37:00] The bigger concern is actually that there's not that much IP hiding behind it, that Cosine, which we talked about, we talked to him on Dev Day, can just fine-tune 4.[00:37:13] swyx: 0 to beat O1. Claude Sonnet so far is beating O1 on coding tasks, without, at least O1-preview, without being a reasoning model. Same for Gemini Pro or Gemini 2.0. So like, how much is reasoning important? How much of a moat is there in all of this proprietary sort of training data that they've presumably accumulated?[00:37:34] swyx: Because even DeepSeek was able to do it. And they had, you know, two months notice to do this, to do R1. So, it's actually unclear how much moat there is. Obviously, you know, if you talk to the Strawberry team, they'll be like, yeah, I mean, we spent the last two years doing this. So, we don't know. And it's going to be interesting, because there'll be a lot of noise from people who say they have inference-time compute and actually don't, because they just have fancy chain of thought.[00:38:00][00:38:00] swyx: And then there's other people who actually do have very good chain of thought. And you will not see them on the same level as OpenAI, because OpenAI has invested a lot in building up the mythology of their team. Um, which makes sense. Like, the real answer is somewhere in between.[00:38:13] Alessio: Yeah, I think that's kind of like the main data war story developing.[00:38:18] The Data War: GPU Poor vs. GPU Rich[00:38:18] Alessio: GPU poor versus GPU rich. Yeah. Where do you think we are? I think there was, again, going back to the small model thing, there was a time in which the GPU poor were kind of like the rebel faction, working on these models that were open and small and cheap. And I think today people don't really care as much about GPUs anymore.[00:38:37] Alessio: You also see it in the price of the GPUs. Like, you know, that market has kind of plummeted, because people don't want to be, they want to be GPU-free. They don't even want to be poor. They just want to be, you know, completely without them. Yeah. How do you think about this war?
You[00:38:52] swyx: can tell me about this, but like, I feel like the appetite for GPU-rich startups, like the, you know, the funding plan of "we will raise 60 million and [00:39:00] we'll give 50 of that to NVIDIA,"[00:39:01] swyx: that is gone, right? Like, no one's pitching that. This was literally the plan, the exact plan, of like four or five startups I can name, you know, this time last year. So yeah, GPU-rich startups gone.[00:39:12] The Rise of GPU Ultra Rich[00:39:12] swyx: But I think, like, the GPU ultra rich, the GPU ultra high net worth, is still going. So, um, now we're, you know, we had Leopold's essay on the trillion dollar cluster.[00:39:23] swyx: We're not quite there yet. We have multiple labs, um, you know, xAI very famously, you know, Jensen Huang praising them for being best boy number one in spinning up a 100,000 GPU cluster in like 12 days or something. So likewise at Meta, likewise at OpenAI, likewise at the other labs as well. So like, the GPU ultra rich are going to keep doing that, because I think partially it's an article of faith now that you just need it.[00:39:46] swyx: Like, you don't even know what you're going to use it for. You just need it. And it makes sense, especially if we're going into more researchy territory than we are. So let's say 2020 to 2023 was let's-scale-big-models territory, because we had GPT-3 in 2020 and we were like, okay, we'll go from[00:40:05] swyx: 175B to 1.8T. And that was GPT-3 to GPT-4. Okay, that's done. As far as everyone is concerned, Opus 3.5 is not coming out, GPT-4.5 is not coming out, and Gemini 2, we don't have Pro, whatever. We've hit that wall. Maybe I'll call it the 2-trillion-parameter wall. We're not going to 10 trillion. No one thinks it's a good idea, at least from training costs, from the amount of data, or at least the inference.[00:40:36] swyx: Would you pay 10x the price of GPT-4? Probably not. Like, you want something else that is at least more useful. So it makes sense that people are pivoting in terms of their inference paradigm.[00:40:47] Emerging Trends in AI Models[00:40:47] swyx: And so when it's more researchy, then you actually need more just general purpose compute to mess around with, uh, at the exact same time that production deployments of the previous paradigm are still ramping up,[00:40:58] swyx: um,[00:40:58] swyx: uh, pretty aggressively.[00:40:59] swyx: So [00:41:00] it makes sense that the GPU rich are growing. We have now interviewed both Together and Fireworks and Replicate. Uh, we haven't done Anyscale yet. But I think Amazon, maybe kind of a sleeper one, Amazon, in a sense of like, at re:Invent, I wasn't expecting them to do so well, but they are now a foundation model lab.[00:41:18] swyx: It's kind of interesting. Um, I think, uh, you know, David went over there and started just creating models.[00:41:25] Alessio: Yeah, I mean, that's the power of prepaid contracts. I think like a lot of AWS customers, you know, they do these big reserved instance contracts and now they've got to use their money. That's why so many startups[00:41:37] Alessio: get bought through the AWS marketplace, so they can kind of bundle them together and get preferred pricing.[00:41:42] swyx: Okay, so maybe GPU super rich doing very well, GPU middle class dead, and then GPU[00:41:48] Alessio: poor. I mean, my thing is like, everybody should just be GPU rich. There shouldn't really be, even the GPU poorest, it's like, does it really make sense to be GPU poor?[00:41:57] Alessio: Like, if you're GPU poor, you should just use the [00:42:00] cloud. Yes, you know, and I think there might be a future, once we kind of figure out what the size and shape of these models is, where like the tinybox and these things come to fruition, where you can be GPU poor at home.
But I think today it's like, why are you working so hard to get these models to run on very small clusters, when it's so cheap to run them?[00:42:21] Alessio: Yeah, yeah,[00:42:22] swyx: yeah. I think mostly people think it's cool. People think it's a stepping stone to scaling up. So they aspire to be GPU rich one day, and they're working on new methods. Like Nous Research, like probably the most deep tech thing they've done this year is DisTrO, or whatever the new name is.[00:42:38] swyx: There's a lot of interest in heterogeneous computing, distributed computing. I tend generally to de-emphasize that historically, but it may be coming to a time where it is starting to be relevant. I don't know. You know, SF Compute launched their compute marketplace this year, and like, who's really using that?[00:42:53] swyx: Like, it's a bunch of small clusters, disparate types of compute, and if you can make that [00:43:00] useful, then that will be very beneficial to the broader community, but maybe still not the source of frontier models. It's just going to be a second tier of compute that is unlocked for people, and that's fine. But yeah, I mean, I think this year, I would say a lot more on-device. I now have Apple Intelligence on my phone.[00:43:19] swyx: Doesn't do anything apart from summarize my notifications. But still, not bad. Like, it's multimodal.[00:43:25] Alessio: Yeah, the notification summaries are so-and-so in my experience.[00:43:29] swyx: Yeah, but they add juice to life. And then, um, Chrome Nano, uh, Gemini Nano is coming out in Chrome. Uh, it's still feature-flagged, but you can try it now if you use the alpha.[00:43:40] swyx: And so, like, I think, you know, we're getting the sort of GPU-poor version of a lot of these things coming out, and I think it's quite useful. Like Windows as well, rolling out RWKV in sort of every Windows deployment is super cool.
And I think the last thing that I never put in this GPU-poor war, that I think I should now, [00:44:00] is the number of startups that are GPU poor but still scaling very well, as sort of wrappers on top of either a foundation model lab or a GPU cloud.[00:44:10] swyx: GPU cloud, it would be Suno. Suno, Ramp has it rated as one of the top-ranked, fastest-growing startups of the year. Um, I think the last public number is like zero to 20 million ARR this year, and Suno runs on Modal. So Suno itself is not GPU rich, but they're just doing the training on Modal, uh, who we've also talked to on the podcast.[00:44:31] swyx: The other one would be Bolt, straight Claude wrapper. And, um, again, another, now they've announced 20 million ARR, which is another step up from the 8 million that we put in the title. So yeah, I mean, it's crazy that all these GPU poors are finding a way while the GPU riches are also finding a way. And then the only failures, I kind of call this the GPU smiling curve, where the edges do well, because you're either close to the machines, and you're like [00:45:00] number one on the machines, or you're close to the customers, and you're number one on the customer side.[00:45:03] swyx: And the people who are in the middle, Inflection, um, Character, didn't do that great. I think Character did the best of all of them. Like, you have a note in here that we apparently said that Character's price tag was[00:45:15] Alessio: 1B.[00:45:15] swyx: Did I say that?[00:45:16] Alessio: Yeah. You said Google should just buy them for 1B. I thought it was a crazy number.[00:45:20] Alessio: Then they paid 2.7 billion. I mean, for like,[00:45:22] swyx: yeah.[00:45:22] Alessio: What do you pay for Noam? Like, I don't know what the going rate was. Maybe the starting price was 1B. I mean, whatever it was, it worked out for everybody involved.[00:45:31] The Multi-Modality War[00:45:31] Alessio: Multimodality war.
And this one, we never had text-to-video in the first version, which now is the hottest.[00:45:37] swyx: Yeah, I would say it's a subset of image, but yes.[00:45:40] Alessio: Yeah, well, but I think at the time it wasn't really something people were doing, and now we had Veo 2 just come out yesterday. Uh, Sora was released last month, last week. I've not tried Sora, because the day that I tried, it wasn't, yeah. I[00:45:54] swyx: think it's generally available now, you can go to sora.com and try it. Yeah, they had[00:45:58] Alessio: the outage. Which I [00:46:00] think also played a part into it. Small things. Yeah. What's the other model that you posted today that was on Replicate? video-01-live?[00:46:08] swyx: Yeah. Very, very nondescript name, but it is from MiniMax, which I think is a Chinese lab. The Chinese labs do surprisingly well at the video models.[00:46:20] swyx: I'm not sure it's actually Chinese. I don't know. Hold me to that. Yep. China. It's good. Yeah, the Chinese love video. What can I say? They have a lot of training data for video. Or a more relaxed regulatory environment.[00:46:37] Alessio: Uh, well, sure, in some way. Yeah, I don't think there's much else there. I think like, you know, on the image side, I think it's still open.[00:46:45] Alessio: Yeah, I mean,[00:46:46] swyx: ElevenLabs is now a unicorn. So basically, what is the multimodality war? The multimodality war is: do you specialize in a single modality, right? Or do you have a God model that does all the modalities? So this is [00:47:00] definitely still going, in the sense of: ElevenLabs, you know, now a unicorn; Pika Labs doing well, they launched Pika 2.[00:47:06] swyx: 0 recently; HeyGen, I think, has reached 100 million ARR; Assembly, I don't know, but they have billboards all over the place, so I assume they're doing very, very well. So these are all specialist models, specialist models and specialist startups.
And then there's the big labs, who are doing the sort of all-in-one play.[00:47:24] swyx: And then here I would highlight Gemini 2 for having native image output. Have you seen the demos? Um, yeah, it's hard to keep up. Literally they launched this last week, and a shout out to Paige Bailey, who came to the Latent Space event to demo on the day of launch. And she wasn't prepared. She was just like, I'm just going to show you.[00:47:43] swyx: So they have voice. They have, you know, obviously image input, and then they obviously can code-gen and all that. But the new one that OpenAI and Meta both have but haven't launched yet is image output. So you can literally, um, I think their demo video was that you put in an image of a [00:48:00] car, and you ask for minor modifications to that car.[00:48:02] swyx: They can generate you that modification exactly as you asked. So there's no need for the Stable Diffusion or ComfyUI workflow of like, mask here and then infill there, inpaint there, and all that stuff. This is small model nonsense. Big model people are like, huh, we got you, it's everything in the transformer.[00:48:21] swyx: This is the multimodality war, which is: do you bet on the God model, or do you string together a whole bunch of, uh, small models like a chump? Yeah,[00:48:29] Alessio: I don't know, man. Yeah, that would be interesting. I mean, obviously I use Midjourney for all of our thumbnails. Um, they've been doing a ton on the product, I would say.[00:48:38] Alessio: They launched a new Midjourney editor thing. They've been doing a ton. Because I think, yeah, the moat is kind of like, maybe, you know, people say the Black Forest models are better than Midjourney on a pixel-by-pixel basis. But I think when you put it all together, have you tried[00:48:53] swyx: the same prompts on Black Forest?[00:48:55] Alessio: Yes.
But the problem is just like, you know, on Black Forest, it generates one image, and then it's like, you got to [00:49:00] regenerate. You don't have all these like UI things. Like what I do, no, but it's a time issue, you know, it's like a Mid[00:49:06] swyx: journey. Call the API four times.[00:49:08] Alessio: No, but then there's no like variate.[00:49:10] Alessio: Like the good thing about Midjourney is like, you just go in there and you're cooking. There's a lot of stuff that just makes it really easy. And I think people underestimate that. Like, it's not really a skill issue, because I'm paying Midjourney, so it's a Black Forest skill issue, because I'm not paying them, you know?[00:49:24] Alessio: Yeah,[00:49:25] swyx: so, okay, so, uh, this is a UX thing, right? Like, you, you, you understand that, at least, we think that Black Forest should be able to do all that stuff. I will also shout out, Recraft has come out, uh, on top of the image arena that, uh, Artificial Analysis has done, and has apparently taken Flux's place. Is this still true?[00:49:41] swyx: So, Artificial Analysis is now a company. I highlighted them I think in one of the early AI News issues of the year. And they have launched a whole bunch of arenas. So, they're trying to take on LM Arena, Anastasios and crew. And they have an image arena. Oh yeah, Recraft v3 is now beating Flux 1.1. Which is very surprising [00:50:00] because Flux and Black Forest Labs are the old Stable Diffusion crew who left Stability after, um, the management issues.[00:50:06] swyx: So Recraft has come from nowhere to be the top image model. Uh, very, very strange. I would also highlight that Grok has now launched Aurora, which is, it's very interesting dynamics between Grok and Black Forest Labs, because Grok's images were originally launched, uh, in partnership with Black Forest Labs as a, as a thin wrapper.[00:50:24] swyx: And then Grok was like, no, we'll make our own. And so they've made their own.
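The "call the API four times" workaround swyx suggests is easy to sketch. This is a hypothetical illustration, not Black Forest Labs' or Midjourney's actual API: `generate` here is a stub standing in for whatever HTTP call a real image client would make, and the `seed` parameter is an assumed knob for getting distinct variations from the same prompt.

```python
from concurrent.futures import ThreadPoolExecutor

def generate(prompt: str, seed: int) -> str:
    # Stub for a real image-generation API call (in practice, an HTTP POST
    # to a hosted endpoint); returns an identifier for the generated image.
    return f"image(prompt={prompt!r}, seed={seed})"

def variations(prompt: str, n: int = 4) -> list[str]:
    # Fire off n generations in parallel, one per seed, approximating the
    # one-click "variations" button that image UIs like Midjourney provide.
    with ThreadPoolExecutor(max_workers=n) as pool:
        return list(pool.map(lambda seed: generate(prompt, seed), range(n)))
```

Which is the UX point being made: the product wraps this loop, plus seed handling and a gallery, into a single click.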
I don't know, there are no APIs or benchmarks about it. They just announced it. So yeah, that's the multimodality war. I would say that so far, the small model, the dedicated model people are winning, because they are just focused on their tasks.[00:50:42] swyx: But the big model people are always catching up. And the moment I saw the Gemini 2 demo of image editing, where I can put in an image and just request an edit and it does it, that's how AI should work. Not like a whole bunch of complicated steps. So it really is something. And I think one frontier that we haven't [00:51:00] seen this year, like obviously video has done very well, and it will continue to grow.[00:51:03] swyx: You know, we only have Sora Turbo today, but at some point we'll get full Sora. Or at least the Hollywood labs will get full Sora. We haven't seen video-to-audio, or video synced to audio. And so the researchers that I talked to are already starting to talk about that as the next frontier. But there's still maybe like five more years of video left to actually be solved.[00:51:23] swyx: I would say that Gemini's, or DeepMind's, approach to video seems a lot more fully fledged than OpenAI's. Because if you look at the ICML recap that I published, that so far nobody has listened to. Um, no, people have listened to it. It's just a different, definitely different audience.[00:51:43] swyx: It's only seven hours long. Why are people not listening? It's like everything in there. Uh, so, so DeepMind is working on Genie. They also launched Genie 2 and VideoPoet. So, like, they have maybe four years' advantage on world modeling that OpenAI does not have. Because OpenAI basically only started [00:52:00] diffusion transformers last year, you know, when they hired, uh, Bill Peebles.[00:52:03] swyx: So, DeepMind has, has a bit of an advantage here, I would say. The reason that Veo 2, well, one, they cherry-pick their videos.
So obviously it looks better than Sora, but the reason I would believe that Veo 2, uh, when it's fully launched, will do very well is because they have all this background work in video that they've done for years.[00:52:22] swyx: Like, like last year's NeurIPS, I already was interviewing some of their video people. I forget their model name, but for, for people who are dedicated fans, they can go to NeurIPS 2023 and see, see that paper.[00:52:32] Alessio: And then last but not least, the LLM OS. We renamed it to RAGOps, formerly known as[00:52:39] swyx: the RAGOps War. I put the latest chart on the Braintrust episode.[00:52:43] swyx: I think I'm going to separate these essays from the episode notes. So the reason I used to do that, by the way, is because I wanted to show up on Hacker News. I wanted the podcast to show up on Hacker News. So I always put an essay inside of there, because Hacker News people like to read and not listen.[00:52:58] Alessio: So episode essays,[00:52:59] swyx: I remember [00:53:00] publishing them separately. You say LangChain, LlamaIndex is still growing.[00:53:03] Alessio: Yeah, so I looked at the PyPI stats, you know. I don't care about stars. On PyPI you see Do you want to share your screen? Yes. I prefer to look at actual downloads, not at stars on GitHub. So if you look at, you know, LangChain still growing.[00:53:20] Alessio: These are the last six months. LlamaIndex still growing. What I've basically seen is like, things that, one, obviously these things have a commercial product. So there's like people buying this and sticking with it, versus kind of hopping in between things, versus, you know, for example, CrewAI not really growing as much.[00:53:38] Alessio: The stars are growing. If you look on GitHub, like the stars are growing, but kind of like the usage is kind of like flat. In the last six months, have they done some[00:53:4

Latent Space: The AI Engineer Podcast — CodeGen, Agents, Computer Vision, Data Science, AI UX and all things Software 3.0

Happy holidays! We'll be sharing snippets from Latent Space LIVE! through the break, bringing you the best of 2024! We want to express our deepest appreciation to event sponsors AWS, Daylight Computer, Thoth.ai, StrongCompute, Notable Capital, and most of all our LS supporters who helped fund the venue and A/V production!For NeurIPS last year we did our standard conference podcast coverage interviewing selected papers (that we have now also done for ICLR and ICML), however we felt that we could be doing more to help AI Engineers 1) get more industry-relevant content, and 2) recap 2024 year in review from experts. As a result, we organized the first Latent Space LIVE!, our first in-person miniconference, at NeurIPS 2024 in Vancouver.Since Nathan Lambert ( Interconnects ) joined us for the hit RLHF 201 episode at the start of this year, it is hard to overstate how much Open Models have exploded this past year. In 2023 only five names were playing in the top LLM ranks: Mistral, Mosaic's MPT, TII UAE's Falcon, Yi from Kai-Fu Lee's 01.ai, and of course Meta's Llama 1 and 2. This year a whole cast of new open models have burst on the scene, from Google's Gemma and Cohere's Command R, to Alibaba's Qwen and DeepSeek models, to LLM360 and DCLM, and of course to the Allen Institute's OLMo, OLMoE, Pixmo, Molmo, and OLMo 2 models. We were honored to host Luca Soldaini, one of the research leads on the OLMo series of models at AI2.Pursuing Open Model research comes with a lot of challenges beyond just funding and access to GPUs and datasets, particularly the regulatory debates this year across Europe, California and the White House.
We also were honored to hear from Sophia Yang, head of devrel at Mistral, who also presented a great session at the AI Engineer World's Fair Open Models track!Full Talk on YouTubePlease like and subscribe!Timestamps* 00:00 Welcome to Latent Space Live * 00:12 Recap of 2024: Best Moments and Keynotes * 01:22 Explosive Growth of Open Models in 2024 * 02:04 Challenges in Open Model Research * 02:38 Keynote by Luca Soldaini: State of Open Models * 07:23 Significance of Open Source AI Licenses * 11:31 Research Constraints and Compute Challenges * 13:46 Fully Open Models: A New Trend * 27:46 Mistral's Journey and Innovations * 32:57 Interactive Demo: Le Chat Capabilities * 36:50 Closing Remarks and NetworkingTranscriptSession3Audio[00:00:00] AI Charlie: Welcome to Latent Space Live, our first mini conference held at NeurIPS 2024 in Vancouver. This is Charlie, your AI co-host. As a special treat this week, we're recapping the best of 2024 going domain by domain. We sent out a survey to the over 900 of you who told us what you wanted, and then invited the best speakers in the Latent Space network to cover each field.[00:00:28] AI Charlie: 200 of you joined us in person throughout the day, with over 2,200 watching live online. Our next keynote covers the state of open models in 2024, with Luca Soldaini and Nathan Lambert of the Allen Institute for AI, with a special appearance from Dr. Sophia Yang of Mistral. Our first hit episode of 2024 was with Nathan Lambert on RLHF 201 back in January.[00:00:57] AI Charlie: Where he discussed both reinforcement learning for language [00:01:00] models and the growing post-training and mid-training stack, with hot takes on everything from constitutional AI to DPO to rejection sampling, and also previewed the sea change coming to the Allen Institute.
And to Interconnects, his incredible substack on the technical aspects of state-of-the-art AI training.[00:01:18] AI Charlie: We highly recommend subscribing to get access to his Discord as well. It is hard to overstate how much open models have exploded this past year. In 2023, only five names were playing in the top LLM ranks: Mistral, Mosaic's MPT, TII UAE's Falcon, Yi from Kai-Fu Lee's 01.ai, and of course, Meta's Llama 1 and 2.[00:01:43] AI Charlie: This year, a whole cast of new open models have burst on the scene. From Google's Gemma and Cohere's Command R, to Alibaba's Qwen and DeepSeek models, to LLM360 and DCLM, and of course, to the Allen Institute's OLMo, [00:02:00] OLMoE, Pixmo, Molmo, and OLMo 2 models. Pursuing open model research comes with a lot of challenges beyond just funding and access to GPUs and datasets, particularly the regulatory debates this year across Europe,[00:02:14] AI Charlie: California, and the White House. We also were honored to hear from Mistral, who also presented a great session at the AI Engineer World's Fair Open Models track. As always, don't forget to check the show notes for the YouTube link to their talk, as well as their slides. Watch out and take care.[00:02:35] Luca Intro[00:02:35] Luca Soldaini: Cool. Yeah, thanks for having me over. I'm Luca. I'm a research scientist at the Allen Institute for AI. I threw together a few slides on sort of like a recap of like interesting themes in open models for, for 2024. Have about maybe 20, 25 minutes of slides, and then we can chat if there are any questions.[00:02:57] Luca Soldaini: If I can advance to the next slide. [00:03:00] Okay, cool. So I did the quick check of like, to sort of get a sense of like, how much 2024 was different from 2023.
So I went on Hugging Face and sort of tried to get a picture of what kind of models were released in 2023 and like, what did we get in 2024?[00:03:16] Luca Soldaini: In 2023 we got things like both Llama 1 and 2, we got Mistral, we got MPT, Falcon models, and I think the Yi model came in at the tail end of the year. It was a pretty good year. But then I did the same for 2024. And it's actually quite a stark difference. You have models that are, you know, rivaling frontier-level.[00:03:38] Luca Soldaini: Performance of what you can get from closed models, from like Qwen, from DeepSeek. We got Llama 3. We got all sorts of different models. I added our own OLMo at the bottom. There's this growing group of like, fully open models that I'm going to touch on a little bit later. But you know, just looking at the slides, it feels like 2024 [00:04:00] was just smooth sailing, much better than the previous year.[00:04:04] Luca Soldaini: And you know, you can pick your favorite benchmark, or least favorite, I don't know, depending on what point you're trying to make, and plot, you know, your closed model, your open model, and sort of spin it in ways that show that, oh, you know, open models are much closer to where closed models are today versus last year, where the gap was fairly significant.[00:04:29] Luca Soldaini: So one thing that I think I don't know if I have to convince people in this room, but usually when I give these talks about like open models, there is always like this background question in, in, in people's minds of like, why should we use open models? The APIs argument, you know, it's, it's just an HTTP request to get output from one of the best models out there.[00:04:53] Luca Soldaini: Why do I have to set up infra and use local models? And there are really like two answers. There is the more [00:05:00] researchy answer for this, which is where my background lies, which is just research.
If you want to do research on language models, research thrives on, on open models. There is like a large swath of research on modeling, on how these models behave, on evaluation and inference, on mechanistic interpretability that could not happen at all if you didn't have open models. And for AI builders, there are also like.[00:05:30] Luca Soldaini: Good use cases for using local models. You know, you have some, this is like a very not comprehensive slide, but you have things like, there are some applications where local models just blow closed models out of the water. So like retrieval, it's a very clear example. We might have like constraints like Edge AI applications where it makes sense.[00:05:51] Luca Soldaini: But even just like in terms of like stability, being able to say this model is not changing under the hood. There's plenty of good cases for, [00:06:00] for open models. And the ecosystem is not just models. I stole this slide from one of the Qwen 2 announcement blog posts. But it's super cool to see like how much tech exists around open models, on serving them, on making them efficient, and on hosting them.[00:06:18] Luca Soldaini: It's pretty cool. And so. If you think about like where the term open comes from, comes from like open source, really open models meet the core tenets of, of open, of open source, specifically when it comes around collaboration: there is truly a spirit, like through these open models, you can build on top of other people's[00:06:41] Luca Soldaini: innovation. We see a lot of this even in our own work of like, you know, as we iterate on the various versions of OLMo, it's not just like every time we collect from scratch all the data.
No, the first step is like, okay, what are the cool data sources and datasets people have put [00:07:00] together for language model training?[00:07:01] Luca Soldaini: Or when it comes to like our post-training pipeline, one of the steps is you want to do some DPO, and you use a lot of outputs of other models to improve your, your preference model. So it's really, having like an open sort of ecosystem benefits and accelerates the development of open models.[00:07:23] The Definition of Open Models[00:07:23] Luca Soldaini: One thing that we got in 2024, which is not a specific model, but I thought it was really significant, is we got our first open source AI definition. So this is from the Open Source Initiative. They've been generally the steward of a lot of the open source licenses when it comes to software, and so they embarked on this journey in trying to figure out, okay, what does an open source license for a model look like?[00:07:52] Luca Soldaini: The majority of the work is very dry, because licenses are dry. So I'm not going to walk through the license step by [00:08:00] step, but I'm just going to pick out one aspect that is very good and then one aspect that personally feels like it needs improvement. On the good side, this open source AI license actually.[00:08:13] Luca Soldaini: Is very intuitive. If you ever built open source software and you have some expectations around like what open source looks like for software, for, for AI it sort of matches your intuition. So, the weights need to be freely available, the code must be released with an open source license, and there shouldn't be like license clauses that block specific use cases.[00:08:39] Luca Soldaini: So. Under this definition, for example, Llama or some of the Qwen models are not open source, because the license says you can't use this model for this, or it says if you use this model you have to name the output this way, or derivatives need to be named that way.
Those clauses don't meet the open source [00:09:00] definition, and so they will not be covered.[00:09:02] Luca Soldaini: The Llama license will not be covered under the open source definition. It's not perfect. One of the things that, um, internally, you know, in discussions with with OSI, we were sort of disappointed about is around the language for data. So you might imagine that an open source AI model means a model where the data is freely available.[00:09:26] Luca Soldaini: There were discussions around that, but at the end of the day, they decided to go with a softened stance where they say a model is open source if you provide sufficiently detailed information on how to sort of replicate the data pipeline, so you have an equivalent system. Sufficiently detailed.[00:09:46] Luca Soldaini: It's very, it's very fuzzy. Don't like that. An equivalent system is also very fuzzy. And this doesn't take into account the accessibility of the process, right? It might be that you provide enough [00:10:00] information, but this process costs, I don't know, 10 million to do. Now, the open source definition, like any open source license, has never been about accessibility; that's never a factor in open source software, how accessible software is.[00:10:14] Luca Soldaini: I can make a piece of open source software, put it on my hard drive, and never access it. That software is still open source; the fact that it's not widely distributed doesn't change the license. But practically, there are expectations of like what we want good open source to be. So, it's, it's kind of sad to see that the data component in this license is not as open as some of us would like it to be.[00:10:40] Challenges for Open Models[00:10:40] Luca Soldaini: And I linked a blog post that Nathan wrote on the topic that is less rambly and easier to follow through.
One thing that in general, I think it's fair to say about the state of open models in 2024, is that we know a lot more than what we knew in, [00:11:00] in 2023. Like both on the training data, like the pre-training data you curate, and on like how to do like all the post-training, especially like on the RL side.[00:11:10] Luca Soldaini: You know, 2023 was a lot of like throwing random darts at the board. I think in 2024, we have clear recipes that, okay, don't get the same results as a closed lab, because there is a cost in, in actually matching what they do, but at least we have a good sense of like, okay, this is, this is the path to get a state-of-the-art language model.[00:11:31] Luca Soldaini: I think that one thing that is a downside of 2024 is that I think we are more research constrained than in 2023. It feels that, you know, the barrier for compute that you need to, to move innovation along is just rising and rising. So like, if you go back to this slide, there is now this, this cluster of models that are sort of released by the.[00:11:57] Luca Soldaini: Compute-rich club. Membership is [00:12:00] hotly debated. You know, some people don't want to be. Called the rich because it comes with expectations. Some people want to be called rich, but I don't know, there's debate. But like, these are players that have, you know, 10,000, 50,000 GPUs at minimum. And so they can do a lot of work and a lot of exploration in improving models that is not very accessible.[00:12:21] Luca Soldaini: To give you a sense of like how I personally think about. Research budget for each part of the, of the language model pipeline: on the pre-training side, you can maybe do something with a thousand GPUs, but really you want 10,000. And like, if you want real state of the art, you know, your DeepSeek minimum is like 50,000, and you can scale to infinity.[00:12:44] Luca Soldaini: The more you have, the better it gets.
Everyone on that side still complains that they don't have enough GPUs. Post-training is a super wide sort of spectrum. You can do with as little as like eight GPUs; as long as you're able to [00:13:00] run, you know, a good version of, say, a Llama model, you can do a lot of work there.[00:13:05] Luca Soldaini: A lot of the methodology just, like, scales with compute, right? If you're interested in, you know, your open replication of what OpenAI's o1 is, you're going to be on the 10K end of the GPU spectrum. Inference, you can do a lot with very few resources. Evaluation, you can do a lot with, well, I should say at least one GPU, if you want to evaluate.[00:13:30] Luca Soldaini: Open models. But in general, like if you are, if you care a lot about interventions to do on these models, which is my preferred area of, of research, then, you know, the resources that you need are quite, quite significant. Yeah. One other trend that has emerged in 2024 is this cluster of fully open models.[00:13:54] Luca Soldaini: So OLMo, the model that we built at AI2, being one of them. And you know, it's nice [00:14:00] that it's not just us. There's like a cluster of other, mostly research efforts, who are working on this. And so it's good to, to give you a primer of what like fully open means. So fully open, the easy way to think about it is, instead of just releasing a model checkpoint that you run, you release a full recipe, so that other people[00:14:24] Luca Soldaini: working in that space can pick and choose whatever they want from your recipe and create their own model or improve on top of your model. You're giving out the full pipeline and all the details there, instead of just like the end output. So I pulled up the screenshot from our recent MoE model.[00:14:43] Luca Soldaini: And like for this model, for example, we released the model itself.
The data that it was trained on; the code, both for training and inference; all the logs that we got through the training run; as well as every intermediate checkpoint. And like, the fact that you release different parts of the pipeline [00:15:00] allows others to do really cool things.[00:15:02] Luca Soldaini: So for example, this tweet from early this year from folks at Nous Research: they used our pre-training data to do a replication of the BitNet paper in the open. So they took just, really, like the initial part of the pipeline, and then did their thing on top of it. It goes both ways.[00:15:21] Luca Soldaini: So for example, for the OLMo 2 model, a lot of our pre-training data for the first stage of pre-training was from this DCLM initiative, that was led by folks at, ooh, a variety of institutions. It was a really nice group effort. But, you know, it was nice to be able to say, okay, you know, the state of the art in terms of like what is done in the open has improved.[00:15:46] AI2 Models - Olmo, Molmo, Pixmo etc[00:15:46] Luca Soldaini: We don't have to like do all this work from scratch to catch up to the state of the art. We can just take it directly and integrate it and do our own improvements on top of that. I'm going to spend a few minutes doing like a [00:16:00] shameless plug for some of our fully open recipes. So indulge me in this.[00:16:05] Luca Soldaini: So a few things that we released this year: as I was mentioning, there's the OLMoE model, which is, I think, still the state-of-the-art MoE model in its size class. And it's also fully open, so every component of this model is available. We released a multimodal model called Molmo. Molmo is not just a model, but it's a full recipe of how you go from a text-only model to a multimodal model, and we applied this recipe on top of Qwen checkpoints, on top of OLMo checkpoints, as well as on top of OLMoE.[00:16:37] Luca Soldaini: And I think there's been a replication doing that on top of Mistral as well.
On the post-training side, we recently released Tülu 3. Same story: this is a recipe on how you go from a base model to a state-of-the-art post-trained model. We used the Tülu recipe on top of OLMo, on top of Llama, and then there's been an open replication effort [00:17:00] to do that on top of Qwen as well.[00:17:02] Luca Soldaini: It's really nice to see, like, you know, when your recipe sort of, it's kind of turnkey: you can apply it to different models and it kind of just works. And finally, the last thing we released this year was OLMo 2, which so far is the best state-of-the-art fully open language model. It combines aspects from all three of these previous models:[00:17:22] Luca Soldaini: what we learned on the data side from OLMoE, and what we learned on like making models that are easy to adapt from the Molmo project and the Tülu project. I will close with a little bit of reflection on like ways this, this ecosystem of open models, like, it's not all roses. It's not all happy. It feels like day to day, it's always in peril.[00:17:44] Luca Soldaini: And, you know, I talked a little bit about like the compute issues that come with it. But it's really not just compute. One thing that is on top of my mind is, due to like the environment and, you know, growing feelings about how AI is treated, [00:18:00] it's actually harder to get access to a lot of the data that was used to train a lot of the models up to last year.[00:18:06] Luca Soldaini: So this is a screenshot from really fabulous work from Shane Longpre, who I think is in Europe, about just diminishing access to data for language model pre-training. So what they did is they went through every snapshot of Common Crawl. Common Crawl is this publicly available scrape of the, of a subset of the internet.[00:18:29] Luca Soldaini: And they looked at, for any given website, whether a website that was accessible in, say, 2017 was accessible or not in 2024.
And what they found is, as a reaction to like the existence of closed models like OpenAI's GPT or Claude, a lot of content owners have blanket blocked any type of crawling of their website.[00:18:57] Luca Soldaini: And this is something that we see also internally at [00:19:00] AI2. Like, one project that we started this year is, we wanted to, we wanted to understand, like, if you're a good citizen of the internet and you crawl following sort of norms and policies that have been established in the last 25 years, what can you crawl?[00:19:17] Luca Soldaini: And we found that there's a lot of websites where. The norms of how you express preferences of whether to crawl your data or not are broken. A lot of people would block a lot of crawling, but do not advertise that in robots.txt. You can only tell that they're blocking you from crawling when you try doing it.[00:19:37] Luca Soldaini: Sometimes you can't even crawl the robots.txt to, to check whether you're allowed or not. And then a lot of websites, there's, there's like all these technologies that historically have, have existed to make website serving easier, such as Cloudflare or DNS. They're now being repurposed for blocking AI or any type of crawling [00:20:00] in a way that is very opaque to the content owners themselves.
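The robots.txt norms Luca describes are machine-checkable, and Python ships a parser for them in the standard library. A minimal sketch of the "am I allowed to crawl this?" check, with an invented example policy standing in for a real site's robots.txt:

```python
from urllib import robotparser

# An example policy of the kind a site would serve at /robots.txt.
policy = """
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(policy.splitlines())

# A well-behaved crawler consults this before fetching any URL.
print(rp.can_fetch("MyCrawler", "https://example.com/private/page"))  # False
print(rp.can_fetch("MyCrawler", "https://example.com/blog/post"))     # True
```

The catch, as Luca notes, is that many sites now block crawlers at the CDN or DNS layer without expressing it here, so a permissive robots.txt no longer guarantees access.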
It disproportionately helps companies that have a headstart, which are usually the closed labs and it hurts incoming newcomer players where either have now to do things in a sketchy way or you're never going to get that content that the closed lab might have.[00:20:54] Luca Soldaini: So there's a lot, it was a lot of coverage. I'm going to plug Nathan's blog post again. That is, [00:21:00] that I think the title of this one is very succinct which is like, we're actually not, You know, before thinking about running out of training data, we're actually running out of open training data. And so if we want better open models they should be on top of our mind.[00:21:13] Regulation and Lobbying[00:21:13] Luca Soldaini: The other thing that has emerged is that there is strong lobbying efforts on trying to define any kind of, AI as like a new extremely risky and I want to be precise here. Like the problem is now, um, like the problem is not not considering the risk of this technology. Every technology has risks that, that should always be considered.[00:21:37] Luca Soldaini: The thing that it's like to me is sorry, is ingenious is like just putting this AI on a pedestal and calling it like, An unknown alien technology that has like new and undiscovered potentials to destroy humanity. When in reality, all the dangers I think are rooted in [00:22:00] dangers that we know from existing software industry or existing issues that come with when using software on on a lot of sensitive domains, like medical areas.[00:22:13] Luca Soldaini: And I also noticed a lot of efforts that have actually been going on and trying to make this open model safe. I pasted one here from AI2, but there's actually like a lot of work that has been going on on like, okay, how do you make, if you're distributing this model, Openly, how do you make it safe?[00:22:31] Luca Soldaini: How, what's the right balance between accessibility on open models and safety? 
And then there's also the annoying brushing under the rug of concerns that are then proved to be unfounded. You know, if you remember the beginning of this year, it was all about bio risk of these open models.[00:22:48] Luca Soldaini: The whole thing fizzled because, finally, there's been rigorous research, not just this paper from the Cohere folks, but there's been rigorous research showing [00:23:00] that this is really not a concern that we should be worried about. Again, there is a lot of dangerous use of AI applications, but this one was just a lobbying ploy to make things sound scarier than they actually are.[00:23:15] Luca Soldaini: So I've got to preface this part by saying this is my personal opinion, not my employer's, but I look at things like SB 1047 from, from California, and I think we kind of dodged a bullet on, on this legislation. We, you know, the open source community, a lot of the community came together at the last, sort of the last minute and did a very good effort trying to explain all the negative impact of this bill.[00:23:43] Luca Soldaini: But there's like, I feel like there's a lot of excitement on building these open models or like researching on these open models. And lobbying is not sexy, it's kind of boring, but it's sort of necessary to make sure that this ecosystem can, can really [00:24:00] thrive. This is the end of the presentation. I have some links, emails, sort of standard thing, in case anyone wants to reach out, and if folks have questions or anything they wanted to discuss.[00:24:13] Luca Soldaini: Is there an open floor? I think we have Sophia[00:24:16] swyx: who wants to... one very important open model that we haven't covered is Mistral. I'll ask her on this slide. Yeah, yeah. Well, it's nice to have the Mistral person recap the year in Mistral.
But while Sophia gets set up, does anyone have, like, just thoughts or questions about the progress in this space?[00:24:32] Questions - Incentive Alignment[00:24:32] swyx: Do you always have questions?[00:24:34] Question: I'm very curious how we should build incentives to build open models, things like Francois Chollet's ArcPrize, and other initiatives like that. What is your opinion on how we should better align incentives in the community so that open models stay open?[00:24:49] Luca Soldaini: The incentive bit is, like, really hard.[00:24:51] Luca Soldaini: It's something that we actually think a lot about internally, because building open models is risky. [00:25:00] It's very expensive. And so people don't want to take risky bets. I think definitely challenges like the Arc challenge, I think those are very valid approaches for it.[00:25:13] Luca Soldaini: And then I think in general, promoting, building, so, any kind of effort to participate in those challenges, if we can promote doing that on top of open models and really lean into this multiplier effect, I think that is a good way to go. If there were more money for that,[00:25:35] Luca Soldaini: for efforts like research efforts around open models... There's a lot of, I think there's a lot of investment in companies that at the moment are releasing their models in the open, which is really cool. But it's usually more because of commercial interest and not about wanting to support these open models in the long term. It's a really hard problem because I think everyone is operating sort of in [00:26:00] what,[00:26:01] Luca Soldaini: everyone is at their local maximum, right? In ways that really optimize their position on the market. The global maximum is harder to achieve.[00:26:11] Question2: Can I ask one question?
No.[00:26:12] Luca Soldaini: Yeah.[00:26:13] Question2: So I think one of the gaps between the closed and open source models is multilinguality. The closed source models like ChatGPT work pretty well on low-resource languages, which is not the same for the open source models, right?[00:26:27] Question2: So is it in your plan to improve on that?[00:26:32] Luca Soldaini: I think in general,[00:26:32] Luca Soldaini: yes. I think we'll see a lot of improvements there in, like, 2025. Like, there are groups, like Procurement English on the smaller side, that are already working on, like, better crawl support, multilingual support. I think what I'm trying to say here is you really want experts who are actually in those countries, that speak those languages, to [00:27:00] participate in the international community. To give you, like, a very easy example, I'm originally from Italy. I think I'm terribly equipped to build a model that works well in Italian, because one of the things you need to be able to do is have that knowledge of, like, okay, how do I access, you know, libraries or content that is from this region, that covers this language.[00:27:23] Luca Soldaini: I've been in the US long enough that I no longer know. So, I think that's the effort that folks in Central Europe, for example, are doing, around, like, okay, let's tap into regional communities to get access, you know, to bring in collaborators from those areas. I think it's going to be, like, very crucial for making progress there.[00:27:46] Mistral intro[00:27:46] Sophia Yang: Hi everyone. Yeah, I'm super excited to be here to talk to you guys about Mistral. A really short and quick recap of what we have done, what kind of models and products we have released in the [00:28:00] past year and a half.
Most of you may have already known that we are a small startup founded about a year and a half ago in Paris, in May 2023, by our three co-founders, and in September 2023 we released our first open source model, Mistral 7B. Yeah, how, how many of you have used or heard about Mistral 7B?[00:28:24] Sophia Yang: Hey, pretty much everyone. Thank you. Yeah, it's pretty popular and our community really loved this model, and in December 2023, we, we released another popular model with the MoE architecture, Mixtral 8x7B. And going into this year, you can see we have released a lot of things this year.[00:28:46] Sophia Yang: First of all, in February 2024, we released Mistral Small, Mistral Large, and Le Chat, which is our chat interface, I will show you in a little bit. We released an embedding model for, you [00:29:00] know, converting your text into embedding vectors, and all of our models are available through the big cloud providers. So you can use our models on Google Cloud, AWS, Azure, Snowflake, IBM.[00:29:16] Sophia Yang: So very useful for enterprises who want to use our models through the cloud. And in April and May this year, we released another powerful open source MoE model, Mixtral 8x22B. And we also released our first code model, Codestral, which is amazing at 80-plus languages. And then we provided a fine-tuning service for customization.[00:29:41] Sophia Yang: Because we know the community loves to fine-tune our models, we provide a very nice and easy option for you to fine-tune our models on our platform. And we also released our fine-tuning code base called mistral-finetune. It's open source, so feel free to take a look.[00:29:58] Sophia Yang: More models. [00:30:00] From July to November this year, we released many, many other models. First of all are the two new best small models.
We have Ministral 3B, great for deploying on edge devices, and we have Ministral 8B; if you used to use Mistral 7B, Ministral 8B is a great replacement with much stronger performance than Mistral 7B.[00:30:25] Sophia Yang: We also collaborated with NVIDIA and open sourced another model, Mistral Nemo 12B, another great model. And just a few weeks ago, we updated Mistral Large to version 2, with updated state-of-the-art features and really great function calling capabilities. It supports function calling natively.[00:30:45] Sophia Yang: And we released two multimodal models: Pixtral 12B, which is open source, and Pixtral Large, just amazing models for not only understanding images but also great at text understanding. So, yeah, a [00:31:00] lot of image models are not so good at textual understanding, but Pixtral Large and Pixtral 12B are good at both image understanding and textual understanding.[00:31:09] Sophia Yang: And of course, we have models for research. Codestral Mamba is built on the Mamba architecture, and Mathstral is great for working with math problems. So yeah, that's another model.[00:31:29] Sophia Yang: Here's another view of our model lineup. We have several premier models, which means these models are mostly available through our API. I mean, all of the models are available through our API, except for Ministral 3B. But the premier models have a special license, the Mistral Research License: you can use them for free for exploration, but if you want to use them for enterprise or production use, you will need to purchase a license [00:32:00] from us.[00:32:00] Sophia Yang: So on the top row here, we have Ministral 3B and 8B as our premier models, Mistral Small for the best low-latency use cases, Mistral Large for your most sophisticated use cases, and Pixtral Large, the frontier-class multimodal model.
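Function calling, as mentioned here for Mistral Large 2, generally means passing the model a JSON schema of the tools it may call alongside the chat messages. Below is a hedged sketch of what such a request body looks like in the common OpenAI-compatible style; the tool name, model name, and exact fields are illustrative assumptions, not Mistral's documented API:

```python
import json

# Hypothetical tool schema: a "get_weather" function the model may call.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Chat-completion request body in the OpenAI-compatible shape many providers
# accept; the model name is a placeholder, not a real endpoint or model.
payload = {
    "model": "example-large-v2",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
    "tool_choice": "auto",  # let the model decide whether to call a tool
}

# Serialize as it would be sent over HTTP; a tool-capable model would reply
# with a "tool_calls" entry naming the function and its JSON arguments.
body = json.dumps(payload)
print(json.loads(body)["tools"][0]["function"]["name"])  # get_weather
```

The client is then responsible for executing the named function and sending its result back as a follow-up message, which is what makes native function calling useful for agent-style workflows.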
And, and we have Codestral, great for coding, and then again, the Mistral Embedding model.[00:32:22] Sophia Yang: At the bottom of the slide here, we have several Apache 2.0 licensed open-weight models, free for the community to use, and also if you want to fine-tune them, use them for customization or production, feel free to do so. The latest, we have Pixtral 12B. We also have Mistral Nemo, Codestral Mamba, and Mathstral, as I mentioned, and we have three legacy models that we don't update anymore.[00:32:49] Sophia Yang: So we recommend you move to our newer models if you are still using them. And then, just a few weeks ago, [00:33:00] we made a lot of improvements to our chat interface, Le Chat. How many of you have used Le Chat? Oh, no. Only a few. Okay. I highly recommend Le Chat. It's chat.mistral.ai. It's free to use.[00:33:16] Sophia Yang: It has all the amazing capabilities I'm going to show you right now. But before that, Le Chat in French means the cat. So this is actually a cat logo. You can tell these are the cat's eyes. Yeah. So first of all, I want to show you something. Maybe let's, let's take a look at image understanding.[00:33:36] Sophia Yang: So here I have a receipt and I want to ask, just going to get the prompts. Cool. So basically I have a receipt and I said I ordered, I don't know, coffee and the sausage. How much do I owe? Add an 18 percent tip. So hopefully it was able to get the cost of the coffee and the [00:34:00] sausage and ignore the other things.[00:34:03] Sophia Yang: And yeah, I don't really understand this, but I think this is the coffee. It's, yeah, nine-eight. And then the cost of the sausage, we have 22 here. And then it was able to add the cost, calculate the tip, and all that. Great. So, it's great at image understanding, it's great at OCR tasks. So, if you have OCR tasks, please use it.[00:34:28] Sophia Yang: It's free on the chat. It's also available through our API. And also I want to show you a Canvas example.
A lot of you may have used Canvas with other tools before. But with Le Chat, it's completely free. Again, here I'm asking it to create a canvas that uses PyScript to execute Python in my browser.[00:34:51] Sophia Yang: Let's see if it works. Import this. Okay, so, yeah, so basically it's executing [00:35:00] Python here. Exactly what we wanted. And the other day, I was trying to ask Le Chat to create a game for me. Let's see if we can make it work. Yeah, the Tetris game. Yep. Let's just get one row. Maybe. Oh no. Okay. All right. You get the idea. I failed my mission. Okay. Here we go. Yay! Cool. Yeah. So as you can see, Le Chat can write, like, code for a simple game pretty easily. And you can ask Le Chat to explain the code, make updates however you like. Another example: there is a bar here I want to move.[00:35:48] Sophia Yang: Okay, great, okay. And let's go back to another one. Yeah, we also have web search capabilities. Like, you can [00:36:00] ask what's the latest AI news. Image generation is pretty cool. Generate an image about researchers. Okay. In Vancouver? Yeah, it's Black Forest Labs' Flux Pro. Again, this is free, so... Oh, cool.[00:36:19] Sophia Yang: I guess researchers here are mostly from the University of British Columbia. That's smart. Yeah. So this is Le Chat. Please feel free to use it. And let me know if you have any feedback. We're always looking for improvements and we're going to release a lot more powerful features in the coming years.[00:36:37] Sophia Yang: Thank you. Get full access to Latent Space at www.latent.space/subscribe

Kings and Generals: History for our Future
3.130 Fall and Rise of China: Long March

Dec 16, 2024 · 31:46


Last time we spoke about the Fujian Rebellion of 1933. In the midst of political turmoil, the 19th Route Army, once vital in campaigns for Chiang Kai-shek, found itself at odds with his leadership during Japan's invasion of Shanghai in 1932. Facing internal rebellion and external threats, Chiang Kai-shek prioritized fighting the Communists over the Japanese. The 19th Route Army, disillusioned, resisted both Japan and the CCP but ultimately faced betrayal when Chiang Kai-shek forced them into civil conflict in Fujian, deepening divisions within China. In 1933, Chiang Kai-shek faced opposition for his appeasement of Japan, leading the 19th Route Army, frustrated by his inaction, to plot a coup. Under Chen Mingshu's leadership, they sought alliances against Chiang Kai-shek but struggled amid civil war pressures and Red Army conflicts. On November 20, they declared the People's Revolutionary Government in Fuzhou, aiming to unify against Japanese aggression. However, lack of support led to rapid failure; by January 1934, Chiang's forces crushed the rebellion, and its leaders fled, marking the end of the Fujian Revolution.   #130 The Long March Welcome to the Fall and Rise of China Podcast, I am your dutiful host Craig Watson. But, before we start I want to also remind you this podcast is only made possible through the efforts of Kings and Generals over at Youtube. Perhaps you want to learn more about the history of Asia? Kings and Generals have an assortment of episodes on the history of Asia and much more, so go give them a look over on Youtube. So please subscribe to Kings and Generals over at Youtube, and to continue helping us produce this content please check out www.patreon.com/kingsandgenerals. If you are still hungry for more history-related content, check out my channel, the Pacific War Channel, where I cover the history of China and Japan from the 19th century until the end of the Pacific War.
As we saw 2 episodes ago, the CCP had been taken over by the 28 Bolsheviks and Otto Braun who initiated a dramatic offensive strategy for the Red Army. Unfortunately this also came during the 5th encirclement campaign. This resulted in repeated defeats for the Red Army and the gradual shrinking of the Soviet area. In April 1934, the Central Red Army engaged in a decisive battle against the Nationalist Army in Guangchang, Jiangxi Province, suffering severe losses and now faced a critical situation. As the NRA's grip tightened, the Red Army and the Central Committee of the CCP sought new strategies. With offensive tactics no longer feasible, the Red Army considered alternative approaches to navigate its current challenges. One overarching strategy involved co-opting the NRA by harnessing nationalistic sentiment to form a united front against the Japanese. The leadership of the Red Army hoped that by identifying a common enemy, they could temporarily alleviate the conflict with the KMT. In July 1934, they attempted to implement this strategy by deploying the Seventh Red Army Corps to western Fujian to join the 10th Red Army, commanded by Su Yu. This combined force was labeled the Anti-Japanese Vanguard Column to attract Nationalist support; however, the propaganda effort failed. The NRA subsequently obliterated the Red Army Column, resulting in the death or execution of most of its members. Approximately 800 survivors escaped and regrouped as a guerrilla unit under Su Yu, continuing to fight independently until the establishment of the Second United Front in 1937. Another breakout occurred on July 23, 1934, when the 6th Red Army Corps, operating from the Hunan-Guangdong border, traversed Hunan and joined forces with the Third Red Army, forming the Second Front Red Army, led by He Long, on October 22, 1934. It is uncertain whether either operation impacted the KMT. 
The escalating costs and ongoing casualties placed a heavy burden on the Red Army, complicating its ability to maintain its position. A secure new location was essential for the Red Army to reorganize, resupply, and recruit personnel. In August 1934, Bo Gu and Otto Braun secretly decided to abandon the Jiangxi Soviet. Their initial plan was to head southwest towards Hunan, seeking friendlier territory and aiming to connect with the 2nd Front Red Army. While the precise whereabouts of the 2nd Front Red Army were unclear, the leadership considered Hunan the most probable destination and devised a route to reach it. Meanwhile, the rest of the Red Army intensified its recruitment efforts, raised funds, and gathered supplies. On the night of October 10, 1934, the leadership of the Red Army issued marching orders to the 1st Front Red Army, which advanced southwest in two columns, consisting of the 1st, 3rd, 5th, 8th, and 9th Red Army Corps. The total strength of this force was about 87,000 soldiers. Many of these soldiers were unaware that it would be their final sight of the Jiangxi Soviet, as most believed they were simply executing another maneuver to outflank the KMT and strike at its rear. A contingent of 16,000 troops, including several wounded soldiers like their leader Chen Yi, remained in Ruijin to defend against and delay the KMT forces, providing the First Front Red Army with the necessary time to depart unnoticed. Thus, began what has famously been called the Long March. The first few days of the Long March were relatively calm. The Red Army steered clear of significant confrontations with the NRA forces and easily maneuvered through a gap in the encirclement. Previously, Zhou Enlai had brokered a truce with the Guangdong and Guangxi warlords involved in the Extermination campaign, allowing the Red Army safe passage through the region. 
Meanwhile, the Red Army troops remaining in Ruijin fiercely resisted the NRA, effectively masking the fact that the main force had already departed. Until November 8, Nationalist newspapers claimed that the Red Army was nearly annihilated. The 1st Front Red Army traveled at night, using small trails to evade detection and attacks from the air. The troop formation included the 1st and 9th Red Army Corps on the left flank, the 3rd and 8th Red Army Corps on the right, with leadership and logistical units positioned in the center, while the 5th Red Army Corps provided rear guard support. The Red Army employed porters to transport heavy equipment, such as printing presses, X-ray machines, and currency. Additional porters carried litters for the wounded and key leaders. During this period, several Red Army leaders, including Zhou Enlai, were unwell or injured, while others, like Mao Zedong, rested in litters during the day after long nights of planning. By mid-November 1934, the NRA learned that the Red Army had broken free from their encirclement and was heading westward, prompting them to pursue. Observing the Red Army's movements, Chiang Kai-shek and the NRA leadership inferred that southern Hunan was likely their destination, so they deployed troops accordingly. The Red Army advanced rapidly to the west, aiming to cross the Xiang River before the NRA could catch up. On November 27, 1934, the Red Army reached Daoxian and launched an assault on the NRA blockhouses guarding the Xiang River crossings. They quickly overran these defenses and began moving troops across the river. However, the central column of the Red Army, hindered by heavy equipment and injured soldiers, fell behind the main force. On November 28, the NRA struck the rear elements of the Red Army before they could reach the river. For 5 days, the Red Army engaged in a fierce rear guard action, trying to disengage from the NRA and successfully cross the river. 
By December 2, 1934, all Red Army units had successfully crossed the Xiang River, albeit at a significant cost. The Red Army lost over two divisions from the 3rd and 5th Red Army Corps, leaving just over 30,000 soldiers remaining in their ranks. Furthermore, much of the Army's heavy equipment and supplies were abandoned along the way to lighten their load. After the Red Army crossed the Xiang River, it continued to evade direct confrontations with the NRA. The challenging battle at the Xiang River had a profound impact on the Red Army, leading to a rise in desertions as soldiers recognized that the movement had turned into an exodus from Jiangxi. Many porters responsible for transporting heavy equipment also began to leave during the night, especially while navigating the difficult, muddy trails in the mountains. The Red Army made several attempts to head north to join He Long and the 2nd Front Red Army, but each time, they found their routes blocked by the NRA. As a result, they altered their plans and headed west toward Guizhou, aiming to reach Sichuan and connect with the 4th Front Red Army to establish a new Soviet. Upon arriving in Liping, Guizhou province, the Red Army leadership decided on December 18th to advance north toward Zunyi in pursuit of their goal in Sichuan. Initially, Guiyang, the provincial capital, was the intended destination, but it had been fortified with seven NRA divisions. In contrast, Zunyi appeared to be a more feasible target as the second-largest city in the province, defended only by local Guizhou forces. On January 1st, 1935, the Red Army began its march toward Zunyi, crossing the Wu River under heavy fire from Guizhou provincial troops. Within three days, they successfully crossed the river and continued toward Zunyi. On January 7, the Red Army launched an attack on Zunyi, which fell two days later. Following the capture of the city, the Red Army initiated a recruitment drive, adding 30,000 new recruits to its ranks. 
To enhance its mobility, they buried or abandoned much of their heavy equipment. The Red Army had originally planned to remain in the area for an extended period to refit, reorganize, and bolster their forces. The staff of the Central Cadre Unit's Red Army Medical School seized the opportunity to conduct a week-long course on basic first aid for soldiers. However, local conditions hindered any long-term presence. The area's primary crop was opium, useful for barter but inadequate for sustaining the Red Army. Additionally, the city's position along a river bend restricted the Red Army's escape routes in the event of an NRA attack. Given these challenges, Communist leadership convened a conference to deliberate on their military strategy. The conference held on January 15th, 1935, marked a pivotal moment in Communist history. In attendance were Politburo members, including Mao Zedong, Zhu De, Chen Yun, Zhou Enlai, Luo Fu, and Bo Gu, along with Liu Bocheng, Liu Shaoqi, Lin Biao, Nie Rongzhen, Peng Dehuai, and Otto Braun. The primary focus of the meeting was the unsuccessful military strategy employed during the 5th Extermination Campaign. Bo Gu and Zhou Enlai opened the discussion, both acknowledging their mistakes and accepting responsibility for the failures. Mao Zedong followed with a sharp critique of the strategy's use of "short, swift thrusts" and the lack of cooperation with the Fujian 19th route NRA Army. The conference continued for three more days, during which much of the Red Army leadership criticized Bo Gu and Otto Braun's approach, aligning themselves with Mao. By the end of the meeting, key leaders of the CCP and Red Army had distanced themselves from the 28 Bolsheviks, effectively making Mao Zedong the de facto leader of the CCP, despite not being formally elected to any new position at Zunyi. A significant change was the disbanding of the triumvirate leadership of Bo Gu, Otto Braun, and Zhou Enlai. 
Zhu De and Zhou Enlai were assigned to lead the Red Army, which then moved towards Sichuan to connect with the 4th Front Red Army. Departing Zunyi, the Red Army comprised four army corps: the 1st, 3rd, 5th and 9th Red Army Corps, although all were considerably smaller than before. The total strength of the 1st Front Red Army was approximately 35,000 soldiers. The army advanced north through Tongzi, gathering gold and opium to procure food and supplies for the journey. The 1st Army Corps, led by Lin Biao, took the lead in searching for a route to cross the Yangtze River. While attempting to secure a crossing near Chishui, the remainder of the Red Army engaged in a fierce battle with a Sichuan NRA force near Tucheng. The fighting escalated to such a degree that Mao Zedong ordered Lin Biao and his corps to return and assist. Ultimately, on January 29th, 1935, the Red Army lost contact with the enemy and abandoned its plan to cross the Yangtze River, instead retreating west to Zhaxi in Yunnan province to evade NRA forces. However, this provided only a temporary reprieve, as more NRA troops moved west into Sichuan, covering all potential crossing points along the Yangtze. Faced with limited options, Mao proposed an audacious plan on February 7th: the Red Army would split into separate columns and head back east into Guizhou to mislead the NRA, then reunite and proceed southwest into Yunnan to find a safer crossing point over the Yangtze. Executing this plan, the Red Army conducted a series of feints, diversionary attacks, and deception operations to confuse NRA leadership, as well as some of its own ranks. Mao Zedong aimed to create an opening for the Red Army to escape into Yunnan and cross the Yangtze in the Jinsha River area. The Red Army began moving east, achieving victories over the NRA, such as at Loushan Pass, where they captured about a division's worth of personnel and equipment. 
They continued eastward, seizing the city of Maotai and acquiring additional gold and opium for trade. In March 1935, Mao was appointed as the political commissar of the Red Army, with Zhu De serving as the commander-in-chief. His leadership role was further solidified when he was included in the triumvirate Military Council alongside Zhou Enlai and Wang Jiaxiang. Mao Zedong then initiated a deception operation, sending the 9th Red Army Corps north as a feint toward the Yangtze River, intending to reinforce NRA intelligence assessments. Chiang believed that these erratic movements indicated the Red Army was preparing for a decisive battle. Consequently, he relocated his NRA headquarters to Guiyang and deployed nearly all of Guizhou's NRA forces to the Yangtze area to encircle and eliminate the Red Army. This deployment inadvertently opened a north-south corridor in Guizhou, allowing the Red Army to move south towards Guiyang, which was now vulnerable due to the concentration of NRA forces along the Yangtze. Capitalizing on these fears, Mao sent additional Red Army units toward the provincial capital. In response, Chiang hurriedly redirected NRA forces from Yunnan to bolster defenses in Guiyang, thus creating yet another escape route for the Red Army. The Red Army swiftly exploited this corridor and advanced into Yunnan. They employed a similar feint tactic as used in Guiyang, deploying units from the 1st Red Army Corps to threaten Kunming. With the main Yunnan forces still occupied in Guiyang, the Yunnan government was forced to reallocate its frontier and militia troops to defend the capital, thus opening one final corridor for the Red Army to escape through a crossing at the Jinsha River. By April 1935, the Red Army had executed one of its most daring maneuvers, evading the NRA forces by making a sweeping maneuver into Yunnan. Despite this strategy, the Red Army still needed to cross the Yangtze River. 
One section of the river, known as the Jinsha River, flows from Tibet through Yunnan to Sichuan and offered excellent crossing points for the Red Army. On April 29th, Mao Zedong identified three crossing locations. The 1st Red Army Corps was assigned to cross in the north at Longjie, while the 3rd Red Army Corps would cross in the center at Hongmen. The Central Cadre Unit was designated to use the southern crossing point at Jiaopingdu. Meanwhile, the Fifth and Ninth Army Corps were tasked with rear guard operations and would cross at the nearest crossing point. Although the 1st and 3rd Red Army Corps struggled to secure their crossing locations, the Central Cadre Unit successfully acquired seven boats, established security on both riverbanks, and commenced a ferrying operation that would last nine days. Consequently, the 1st and 3rd Red Army Corps abandoned their original crossing points and moved to Jiaopingdu. The 3rd Red Army Corps crossed on May 7th, followed by the 1st Red Army Corps the next day. The 5th Red Army Corps maintained its rear guard before quickly crossing at Jiaopingdu on May 9th. Upon reaching Sichuan, the weary Red Army troops began to contemplate their next steps. After nearly nine months of travel, with minimal rest and significant losses, the Red Army's numbers had dwindled to around 25,000 soldiers, with much of their heavy equipment abandoned along their retreat route. They attempted to seize Huili but were met with fierce resistance from the 24th NRA Division. Outside the city, Red Army leaders held a conference on May 12th and resolved to continue north through Sichuan, aiming to cross the Dadu River to join forces with the 4th Front Red Army. As the Red Army advanced through the territory of the Yi minority, they faced hostility from the Yi people, who harbored animosity toward the Han and attacked straggling Red Army soldiers, stealing their weapons and clothing and leaving many to perish. 
Fortunately, Liu Bocheng and his vanguard unit from the 1st Red Army Corps negotiated a truce with the Yi, securing safe passage in exchange for promises of equal land rights and treatment after the war. On May 23rd, the Red Army reached Anshunchang along the Dadu River. Their initial attempts to cross by ferry were thwarted by strong NRA defenses on the opposite bank, and they only managed to secure three boats, which were insufficient for a crossing. On May 27th, Red Army leaders decided to take a calculated risk and dispatched troops northward to seize Luding Bridge. This iron-chain suspension bridge, located along a challenging trail through the mountain passes, crossed the Dadu River. In a remarkable act of bravery, the 4th Regiment of the 2nd Division, 1st Red Army Corps, led by Yang Chengwu, marched nearly 100 miles in under 3 days to secure the bridge. Despite facing a defending NRA brigade on sheer cliffs, the 4th Regiment acted swiftly and captured the bridge amid constant gunfire, with only 18 of the 22 men who launched the final assault surviving. Their sacrifice allowed the Red Army to evade the main KMT force and successfully cross the Dadu River, ultimately establishing themselves in Hualingping for refitting operations. However, the challenges for the Red Army persisted even after crossing the Dadu. They were still unaware of the 4th Front Red Army's location, with one possible area being directly north behind the Jiajin Mountains. To avoid detection from NRA forces or ambushes by Tibetans, Mao opted for a central walking trail through the Jiajin Mountains rather than the more accessible eastern and western routes. For many survivors of the Long March, the leg through the Jiajin Mountains proved to be the most arduous and challenging segment. The Red Army soldiers faced hunger, cold, thirst, avalanches, and the high altitude as they attempted to traverse the snow-capped peaks with little more than the clothes on their backs.
On June 12th, the first units of the Red Army arrived at Danwei, located at the northern foot of the Jiajin Mountains. By June 14th, the remaining soldiers descended from the mountains and linked up with Li Xiannian, a liaison officer from the 4th Front Red Army. Approximately 10,000 soldiers endured the harsh conditions and made it down the mountain. Thankfully, they rejoined their fellow Red Army comrades, allowing them to take a much-needed rest. On June 18, 1935, the 1st and 4th Front Red Armies finally connected at Lianghekou. The Fourth Front Red Army fared significantly better than its counterpart, having originated from the Hubei-Henan-Anhui Soviet before relocating to the Shaanxi-Sichuan border and settling in northwest Sichuan in March 1935. Their forces numbered nearly 80,000, surpassing the 1st Front Red Army. Some soldiers from the 1st Front looked on with admiration and envy at the robust condition of the 4th Front soldiers and their horses. On June 26th, the leadership of both armies convened to discuss their future movements. Mao Zedong proposed advancing north to Gansu, then heading east toward Ningxia, with the ultimate goal of reaching Mongolia to establish communication with the Soviet Union. Conversely, Zhang Guotao suggested moving west to Xinjiang, aiming to connect with the Soviet Union via the Central Asian Republics. Beneath these military discussions lay political maneuvering as both Mao Zedong and Zhang Guotao sought to assert dominance over the Red Army. Ultimately, both sides maintained cordial relations and established a unified strategy and command. The Red Army was set to advance north to southern Gansu to establish a Soviet presence in the border areas. Zhang Guotao was appointed vice-chairman of the Military Council. By June 30, the 1st Front Red Army had moved into the Grasslands, with Zhang Guotao and the 4th Front Red Army following a day later. 
The meeting at Lianghekou did not resolve the political tensions between the factions led by Zhang Guotao and Mao Zedong, and these conflicts intensified over time. While Zhang Guotao continued to advocate for a westward movement toward Xinjiang, he also sought to recruit key leaders from the 1st Front Red Army to support his cause, but to no avail. Mao Zedong remained steadfast in his commitment to the agreed plan to proceed to Gansu and took measures to prevent any subversion from Zhang Guotao's camp. Tensions escalated during a conference at Maoergai on August 6th. The Red Army had arrived at Maoergai the previous day to rest and reorganize. According to one account, Mao Zedong held the meeting in the neighboring town of Shawo, securing the location ahead of Zhang Guotao's arrival. As the sole representative from the 4th Front Red Army on the Politburo and Central Committee, Zhang Guotao intended to introduce additional representatives to enhance his influence, but they were unable to bypass security. This infuriated Zhang Guotao, highlighting the political maneuvering at play. Another account claims the meeting took place at Zhang Guotao's 11th Red Army Division headquarters, with his loyal soldiers ensuring that Mao Zedong could not undermine him. Regardless, no agreements were reached during this meeting. A second meeting was held on August 20th at Maoergai, resulting in a negotiated settlement. The Red Army remained under the command of Zhu De but was divided into two columns. The Right Column included the 1st and 3rd Red Army Corps, led by Lin Biao and Peng Dehuai, respectively, and also incorporated the 13th and 30th Red Armies from the 4th Front. Mao, Zhou Enlai, Bo Gu, and Otto Braun traveled with the Right Column. The Left Column comprised the remainder of the 4th Front Army, along with the 5th and 9th Red Army Corps, and was led by Zhang Guotao and Liu Bocheng, with Zhu De accompanying them. 
Both columns would advance north while skirting the Grasslands, with the Left Column heading toward Aba and the Right Column toward Baxi. Once the plan was finalized, they began their movement into the Grasslands on August 23rd. In the Grasslands, the Red Army encountered conditions as challenging as those in the mountains. This region was home to a minority population, and the Tibetan locals were just as hostile as the Yi had been, attacking and killing many stragglers. Food sources were scarce, and many Red Army soldiers were unfamiliar with edible plant species. Water supplies were also limited, as most sources were stagnant and contaminated. The soldiers ended up consuming wheat kernels, which severely upset their digestive systems. The trailing units faced even greater difficulties, as the vanguard troops turned the dirt paths into muddy pits, leaving little food for foraging. The Right Column reached Baxi on August 27th, suffering heavy losses during the week-long trek; the 3rd Red Army Corps alone lost 400 soldiers. The Left Column progressed more slowly and arrived in Aba about a week later. Once they exited the Grasslands, the Red Army faced another internal struggle that threatened their retreat. On September 3rd, Zhang Guotao sent a wireless message to Mao Zedong and the Right Column, stating that his forces were stationed at Aba and that the White River, north of Aba, was impassable. Mao Zedong urged Zhang Guotao to adhere to the Maoergai decision and even offered additional troops to assist in crossing the river, which Zhang Guotao politely declined. On September 9th, Mao Zedong learned of a secret message Zhang Guotao had sent to his aide in the Right Column. Zhang Guotao wanted the Right Column to move back south through the Grasslands to reunite the two columns and convene a meeting to discuss a new strategy, indicating an intention to initiate an intraparty power struggle. 
Fearing that Zhang Guotao would use his superior numbers to impose his strategy on the Red Army, the 1st and 3rd Red Army Corps quietly departed Baxi and continued north to Gansu. This approximately 8,000-strong force arrived at Ejie and held an emergency conference. The Red Army reorganized its forces as the Anti-Japanese Vanguard Force to garner support from the local population. They also issued a “Resolution Concerning the Mistakes of Comrade Zhang Guotao,” reprimanding his actions without expelling him from the Communist Party. On September 14th, the Red Army continued north and captured the Lazikou Pass, defeating the forces defending it. Zhang Guotao was furious upon discovering that Mao Zedong and his loyal Red Army troops had left without notice, but he chose not to pursue them and instead redirected his 4th Front Army south toward Chengdu. The 4th Front Red Army achieved initial victories in October 1935 against the NRA at Baoxing and Tianquan, coming within sixty miles of the Sichuan provincial capital. In response to this threat, Chiang Kai-shek dispatched over 80 NRA regiments to defend Chengdu. The NRA launched a counteroffensive at Baizhang, inflicting heavy losses on the 4th Front Red Army, which retreated in disarray back to Ganzi in western Sichuan province, where it would remain until linking up with the 2nd Front Army in June 1936. As the 4th Front Army moved south, the Red Army completed the final stage of its arduous journey. On September 21st, 1935, Mao Zedong and the Anti-Japanese Vanguard arrived in Hadapu, a Han city in Gansu province. The soldiers rejoiced at being among their own ethnic group and took a few days to rest. During their stay, Mao Zedong and other leaders of the Red Army learned that a Soviet force, led by Liu Zhidan, a friend of Mao Zedong, was present in northern Shaanxi, supporting the 25th and 26th Red Armies. 
Ten days later, the Anti-Japanese Vanguard left Hadapu and swiftly moved west to avoid the NRA's Muslim cavalry units, aiming to connect with their allied units in Shaanxi. On October 19, 1935, Mao Zedong joined forces with the 25th and 26th Red Armies and settled near Wuqi. The remnants of the 1st Front Red Army had completed their year-long, 6,000-mile journey with approximately 4,000 soldiers. Once they reached the relative safety of Shaanxi, the Red Army reverted to its traditional strategy of political mobilization to gather resources, recruit new members, and propagate the communist revolution. On February 5th, 1936, the 1st Front Red Army moved east to carry out political mobilization efforts. Over the following two months, the Red Army defeated seven provincial divisions, capturing more than 4,000 soldiers. They also recruited 8,000 new members, raised $300,000 in revenue, and added 20 counties in Shanxi to their new Soviet. In May, the 1st Front Army advanced westward for a two-month operation, acquiring over 2,000 rifles and 400 horses, thereby expanding the Soviet's reach into Gansu and Ningxia. However, these efforts were ultimately thwarted by NRA forces, compelling the Red Army to relocate from Wuqi to Bao'an in June 1936. In October 1936, the 2nd and 4th Front Armies finally reached Bao'an, marking the completion of the Long March for the Red Army. With all three units reunited, the Red Army War College reopened in Dengjiaqiao, with Liu Bocheng eventually returning to lead it. Additionally, the Red Army military school began training in Tai'erwan. From 1934 to 1936, the Red Army evaded annihilation through a combination of courage, determination, and fortunate circumstances. Enduring harsh conditions and traversing some of China's most challenging terrain to escape the NRA and provincial forces, the Red Army demonstrated remarkable resilience. Mao Zedong skillfully navigated the political landscape within the Red Army, emerging as its supreme leader. 
The Communists also capitalized on the challenges facing the NRA and KMT leadership. The Red Army effectively utilized Chiang Kai-shek's inability to exert full control over his subordinate warlords and their military units to avoid unnecessary confrontations. Upon reaching Shaanxi in late 1935, the survivors of the Long March were not only battle-hardened by their experiences but also carried valuable lessons learned from previous campaigns. In the relative security of the new Soviet, the Red Army expanded its ranks and resumed training and mobilization efforts. The Red Army had survived its greatest challenge to date and was poised to develop into the professional military force that would ultimately defeat the NRA and overthrow the KMT government. I would like to take this time to remind you all that this podcast is only made possible through the efforts of Kings and Generals on YouTube. Please go subscribe to Kings and Generals on YouTube, and to continue helping us produce this content, please check out www.patreon.com/kingsandgenerals. If you are still hungry after that, give my personal channel, The Pacific War Channel on YouTube, a look; it would mean a lot to me. Thus, not only did the Red Army escape death at the hands of the NRA, but the experience of the Long March would actually contribute to the downfall of the NRA. Mao Zedong had emerged as a top figure in the CCP and would now oversee it and the Red Army's future development until the ultimate clash with Chiang Kai-shek for the future of China.

Ben Okurum
Gecenin Sonuna Yolculuk

Ben Okurum

Play Episode Listen Later Dec 2, 2024 98:26


In “ben okurum,” Deniz Yüce Başarır this time turns her focus to one of the cult classics of French and world literature: Journey to the End of the Night (Gecenin Sonuna Yolculuk). And since the subject is this novel, one of world literature's most controversial authors is naturally discussed at length as well: Louis-Ferdinand Céline. Başarır's conversation with Yiğit Bener, one of the distinguished names of Turkish letters and the book's Turkish translator, ranges widely from Céline's antisemitism to his conception of literature, from the revolution he carried out in language to his private life, and looks certain to delight lovers of literature. All accompanied, of course, by carefully chosen excerpts from the novel in which you can deeply feel Céline's harsh but powerful style… 

Clean Power Hour
Ensuring Safety in Grid-Scale Battery Systems | EP243

Clean Power Hour

Play Episode Listen Later Nov 21, 2024 51:35 Transcription Available


In this episode of the Clean Power Hour, Tim Montague engages with industry leaders Dr. Zhehan Yi and Ryan Mayfield to discuss the critical topic of safety in grid-scale battery systems. As the demand for renewable energy surges, understanding the safety protocols and standards surrounding battery storage has never been more essential. The conversation delves into the rapid growth of battery installations across the United States, which have skyrocketed from approximately one gigawatt to nearly 17 gigawatts in just four years. With projections that BESS capacity could exceed 100 gigawatts in the US by the end of the decade, addressing safety concerns becomes paramount. Dr. Yi emphasizes the importance of integrating safety measures early in the design process, particularly regarding thermal runaway risks associated with lithium-ion batteries, which dominate the market. He discusses how engaging with authorities having jurisdiction (AHJs) and adhering to fire codes can mitigate potential hazards and foster the successful adoption of these transformative technologies. The episode also covers fundamental safety standards and testing protocols, such as UL 9540 and UL 9540A, shedding light on how companies like CPS America are innovating to enhance reliability and safety. Ryan Mayfield brings over 25 years of experience in solar PV and energy storage to the table, sharing insights from his work with developers and EPCs (engineering, procurement, and construction firms). He highlights the need for ongoing education within the industry to ensure all stakeholders, from facility owners to grid operators, are informed about the risks and best practices associated with battery systems. For more information, visit Mayfield Renewables at https://www.mayfield.energy/ and CPS America at https://www.chintpowersystems.com/, or follow the Clean Power Hour at cleanpowerhour.com. Social media handles: Dr. Zhehan Yi, Ryan Mayfield, CPS America, Mayfield Renewables.
Support the show. Connect with Tim: Clean Power Hour, Clean Power Hour on YouTube, Tim on Twitter, Tim on LinkedIn, or email tim@cleanpowerhour.com. Review Clean Power Hour on Apple Podcasts. The Clean Power Hour is produced by the Clean Power Consulting Group and created by Tim Montague. Contact us by email: CleanPowerHour@gmail.com. Corporate sponsors who share our mission to speed the energy transition are invited to check out https://www.cleanpowerhour.com/support/. The Clean Power Hour is brought to you by CPS America, maker of North America's number one 3-phase string inverter, with over 6 GW shipped in the US. With a focus on commercial and utility-scale solar and energy storage, the company partners with customers to provide unparalleled performance and service. The CPS America product lineup includes 3-phase string inverters from 25 kW to 275 kW, exceptional data communication and controls, and energy storage solutions designed for seamless integration with CPS America systems. Learn more at www.chintpowersystems.com

Les Nuits de France Culture
Un livre des voix - Entretien Yi Munyol (1ère diffusion : 28/11/1995)

Les Nuits de France Culture

Play Episode Listen Later Nov 15, 2024 30:00


Duration: 00:30:00 - Les Nuits de France Culture - presented by Philippe Garbit - by Claude Mourthé and André Velter - with Yi Munyol - produced by Claude Guerre - production: Virginie Mourthé

The Sure Shot Entrepreneur
Do Great Work. VCs Will Find You.

The Sure Shot Entrepreneur

Play Episode Listen Later Nov 12, 2024 30:10


Yiğit Ihlamur, co-founder and general partner at Vela Partners, shares his unique venture capital approach, which combines data analysis with personal connections to identify exceptional founders. At Vela Partners, an algorithmic scan of companies helps narrow down investment choices, but final decisions hinge on assessing founder resilience and motivation during meetings. Yiğit explains that, typically, he has already decided to invest by the time he meets a founder; the meeting confirms his conviction. He also discusses AI's role in shaping the future of productivity tools and offers valuable advice to founders on preparation and authenticity when seeking investment.
In this episode, you'll learn:
[02:06] Yiğit's journey from Google to venture capital, and his unique approach to combining engineering and venture investing
[06:27] How Vela Partners integrates artificial intelligence and quant-driven algorithms to identify high-potential startups
[10:21] The significance of a balanced quantitative and qualitative assessment in VC investing
[12:41] How AI is enabling the next generation of productivity tools
[19:08] Why founders should articulate their personal story, intrinsic motivation, and commitment to solving a particular problem
[24:07] What startups should do to attract algorithm-driven funds like Vela Partners
[27:02] Why the slow decision-making pace in VC is a problem
The nonprofit organization Yiğit is passionate about: American College Institute (ACI) in Turkey
About Yiğit Ihlamur
Yiğit Ihlamur is a co-founder, general partner, and Chief Technology Officer at Vela Partners, a venture capital firm specializing in AI and product-led solutions. Prior to Vela, Yiğit had an impactful career at Google, where he led initiatives for Gmail, Chrome, and Google Workspace, focusing on product-led growth strategies such as self-service signups, activation, and pricing flows. Starting in Google's European headquarters, he initially worked on technical support for IT admins and created productivity tools for internal teams. Yiğit is passionate about advancing human productivity through technology. Outside his professional pursuits, he enjoys kitesurfing, skiing, running, and spending time with his family.
About Vela Partners
Vela Partners is a Silicon Valley-based venture capital firm specializing in product-led, AI-native startups from inception to Series A. Acting as an "AI startup investing in other AI startups," Vela leverages its proprietary AI to guide investment decisions and enhance its own returns. The firm's investor-focused tools drive its fund's performance, while self-service tools for entrepreneurs expand Vela's brand and distribution channels. Its portfolio includes innovative companies like Vieu, Cartken, Goooods, Nominal, LeakSignal, Base Operations, Cerby, Lightup, Vartana, and Axiom Cloud, among others.
Subscribe to our podcast and stay tuned for our next episode.

New Books Network
Dennis Wuerthner, "Poems and Stories for Overcoming Idleness: P'ahan chip by Yi Illo" (U Hawaii Press, 2024)

New Books Network

Play Episode Listen Later Oct 28, 2024 97:55


Dr. Dennis Wuerthner's Poems and Stories for Overcoming Idleness: P'ahan chip by Yi Illo (U Hawaii Press, 2024) is the first complete English translation of one of the oldest extant Korean source materials. The scholar, Yi Illo (1152–1220), filled this collection with poetry by himself and diverse writers, ranging from Chinese master poets and Koryŏ-era kings, to long-forgotten lower-level officials and rural scholars. The verse compositions are embedded in short narratives by Yi that provide context for the poems, a combination called sihwa. The book contains a comprehensive introduction that explores the lives of Yi Illo and his contemporaries, and the political landscape at the time this collection came into being. The translation itself is richly annotated to provide context to the allusions and to explore possible meanings. The publication is an excellent resource for readers interested in the political and social environment of the Koryŏ Dynasty (918–1392) and for anyone with a love for poetry and prose. Dr. Dennis Wuerthner is assistant professor of East Asian literature in the Department of World Languages and Literatures, at Boston University. He holds a PhD from Ruhr University in Bochum and his main field of research is Korean literature, history and culture in a broader East Asian context. Leslie Hickman is a translator and writer. She has an MA in Korean Studies from Yonsei University and lives in Seoul, South Korea. You can follow her activities at https://twitter.com/AJuseyo. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/new-books-network

New Books in East Asian Studies
Dennis Wuerthner, "Poems and Stories for Overcoming Idleness: P'ahan chip by Yi Illo" (U Hawaii Press, 2024)

New Books in East Asian Studies

Play Episode Listen Later Oct 28, 2024 97:55


Dr. Dennis Wuerthner's Poems and Stories for Overcoming Idleness: P'ahan chip by Yi Illo (U Hawaii Press, 2024) is the first complete English translation of one of the oldest extant Korean source materials. The scholar, Yi Illo (1152–1220), filled this collection with poetry by himself and diverse writers, ranging from Chinese master poets and Koryŏ-era kings, to long-forgotten lower-level officials and rural scholars. The verse compositions are embedded in short narratives by Yi that provide context for the poems, a combination called sihwa. The book contains a comprehensive introduction that explores the lives of Yi Illo and his contemporaries, and the political landscape at the time this collection came into being. The translation itself is richly annotated to provide context to the allusions and to explore possible meanings. The publication is an excellent resource for readers interested in the political and social environment of the Koryŏ Dynasty (918–1392) and for anyone with a love for poetry and prose. Dr. Dennis Wuerthner is assistant professor of East Asian literature in the Department of World Languages and Literatures, at Boston University. He holds a PhD from Ruhr University in Bochum and his main field of research is Korean literature, history and culture in a broader East Asian context. Leslie Hickman is a translator and writer. She has an MA in Korean Studies from Yonsei University and lives in Seoul, South Korea. You can follow her activities at https://twitter.com/AJuseyo. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/east-asian-studies

New Books in Literary Studies
Dennis Wuerthner, "Poems and Stories for Overcoming Idleness: P'ahan chip by Yi Illo" (U Hawaii Press, 2024)

New Books in Literary Studies

Play Episode Listen Later Oct 28, 2024 97:55


Dr. Dennis Wuerthner's Poems and Stories for Overcoming Idleness: P'ahan chip by Yi Illo (U Hawaii Press, 2024) is the first complete English translation of one of the oldest extant Korean source materials. The scholar, Yi Illo (1152–1220), filled this collection with poetry by himself and diverse writers, ranging from Chinese master poets and Koryŏ-era kings, to long-forgotten lower-level officials and rural scholars. The verse compositions are embedded in short narratives by Yi that provide context for the poems, a combination called sihwa. The book contains a comprehensive introduction that explores the lives of Yi Illo and his contemporaries, and the political landscape at the time this collection came into being. The translation itself is richly annotated to provide context to the allusions and to explore possible meanings. The publication is an excellent resource for readers interested in the political and social environment of the Koryŏ Dynasty (918–1392) and for anyone with a love for poetry and prose. Dr. Dennis Wuerthner is assistant professor of East Asian literature in the Department of World Languages and Literatures, at Boston University. He holds a PhD from Ruhr University in Bochum and his main field of research is Korean literature, history and culture in a broader East Asian context. Leslie Hickman is a translator and writer. She has an MA in Korean Studies from Yonsei University and lives in Seoul, South Korea. You can follow her activities at https://twitter.com/AJuseyo. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/literary-studies