On this episode, Heather is back with Rosebud Baker for a light pregame just in time for The Absolutely Knot Cruise. Rosebud chats about her relationship, becoming a mom, and taking her comedy in a new direction this time around on tour. They also drop some piping hot takes on the Real Housewives and give the scoop on their upcoming cruise shenanigans. It's been a long road and the ladies are at Brat Status - as they should be! Episode Sponsors: Pick up GOODLES on your next shopping trip… it's available nationwide at major grocery stores and retailers. Absolutely Not is sponsored by BetterHelp. Visit BetterHelp.com/ABSOLUTELY today to get 10% off your first month. Cancel your unwanted subscriptions and reach your financial goals faster with Rocket Money. Go to RocketMoney.com/absolutely today. Head to http://primalkitchen.com/absolutelynotpodcast to save 20% off your next online order with code ABSOLUTELY at checkout. Nuuly is a great value at $98 a month for any 6 styles, but right now you can get $28 off your first month of Nuuly when you sign up with the code ABSOLUTELY at Nuuly.com. Head over to Addyi's website — Addyi.com — and see if Addyi is right for you. Produced by Dear Media. See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
In a complete one-off, Rosebud is releasing an episode on a Saturday, because our guest is the 2022 Grand National winner, and two-time winner of the Cheltenham Gold Cup, Sam Waley-Cohen. Sam is an amateur jockey, but is one of the most successful Grand National riders of all time, and finally won the race in his last ever ride, on the 50-1 shot Noble Yeats, in 2022. Sam talks to Gyles about what it takes to make a great partnership with a horse, how to find the best starting position, the challenges of the Grand National, and also about his life, which has been full of daring and adventure. There has also been great sadness, as Sam tells Gyles about the death of his younger brother Thomas. Sam also gives his tip for the race today. Our thanks to him for this great conversation. Learn more about your ad choices. Visit podcastchoices.com/adchoices
We're really delighted that Sue Perkins is our guest on Rosebud this week, and this is a special conversation. Gyles and Sue have known each other and worked together for years on shows like Radio 4's Just a Minute, which Sue now hosts, so this conversation about Sue's first memories, family history, childhood and student days is intimate, honest, and at times very moving. Sue tells Gyles about her childhood, growing up in a close but slightly sheltered family in south London. She tells Gyles about being left-handed, about playing the flute and about her first boyfriend. She talks about how she defied her teachers' predictions by getting a place at Cambridge, and about the life-changing friends she made there. She talks about coming out and about finding herself. Thank you very much to Sue for your warmth, your wit and for your wonderful stories. Cue the music. Learn more about your ad choices. Visit podcastchoices.com/adchoices
(Airdate 4/1/25) On this podcast we share the story of the Rosebud Academy and their heartwarming quest to rebuild and bounce back following the tragedy of the Eaton Canyon/Altadena wildfire. Shawn Brown is the founder and executive director of the Rosebud Academy. https://www.rosebudacademy.com/ https://www.dominiquediprima.com/ https://www.showclix.com/event/ice-house-standup-for-students-4-6-25-4pm
Today Gyles speaks to a remarkable person: Sir Grayson Perry. Artist, presenter, thinker and fashion icon, Perry is a unique figure who has led an unusual life. Grayson was born and brought up in Chelmsford, but his father left the family when Grayson was only four years old, and his mother married the local milkman - a Tom Jones lookalike with an unpredictable temper. Grayson talks to Gyles about the effect this lonely and unsettled childhood had on him: this is a moving episode. Gyles also asks Grayson about his transvestism and about his art. We hope you enjoy this very interesting edition of Rosebud. Grayson Perry is on tour nationwide this autumn, and tickets are available at https://www.fane.co.uk/grayson-perry. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Comedian Rosebud Baker returns to catch up with the guys and promote her new Netflix special "The Mother Lode." Bobby and Rosebud were with comic Pete Lee soon after his home burned down in the L.A. fires. Pete goes on Jimmy Fallon's show to tell his story, and Bob discovers that he was renting his L.A. house rather than owning it. Jay remembers when his home got flooded in Hurricane Sandy and his maid had to dry out all his pornographic materials. Bobby shares a video of him falling down yet again. Footage of Bob falling the first time is played, along with DJ Mike Calta embarrassingly hitting the dirt. To hear the full show, go to www.siriusxm.com/bonfire to learn more. FOLLOW THE CREW ON SOCIAL MEDIA: @thebonfiresxm @louisjohnson @christinemevans @bigjayoakerson @robertkellylive @louwitzkee @jjbwolf Subscribe to SiriusXM Podcasts+ to listen to new episodes of The Bonfire ad-free and a whole week early. Start a free trial now on Apple Podcasts or by visiting siriusxm.com/podcastsplus.
In this episode you join Gyles and Harriet at the Oldie of the Year lunch, where Harriet asks some of the guests at the event for their first memories. These include memories from Radio 4 legend John Humphrys, children's writer Dame Jacqueline Wilson, choreographer Arlene Phillips, actress Nanette Newman and 102-year-old skydiver Manette Baillie. Plus some chat from Gyles and Harriet and one of your emails, about synaesthesia. Learn more about your ad choices. Visit podcastchoices.com/adchoices
We're back, diary fans, with another instalment from Gyles's diary. This one starts in the summer holidays of 1964, when Gyles is 16 and off on a road trip with his Canadian cousin Johnnie. Gyles is not impressed. In the autumn it's time for the General Election and, at Bedales, the school mock election, in which Gyles is running as the Conservative candidate. He's also appearing as Isaac Newton in the school play, taking over the school magazine, and getting his O-level results. As usual, there's never a dull moment. This episode also features an extended chat from Gyles and Harriet at the start, and a couple of your emails. Learn more about your ad choices. Visit podcastchoices.com/adchoices
In this episode of the HiTech podcast, we explore game creation with the AI platform @rosebud-ai. We experiment with building a tank game using various prompts. Throughout the episode, we discuss our experiences and challenges — from generating game assets to incorporating game mechanics and troubleshooting issues. Tune in to learn about the platform's potential for creating educational and recreational games, and see how it compares to ChatGPT for game development. For more on our conversation, check out the episode page here. Want to build your business like we have? Join us over at Notion by signing up with our affiliate link to start organizing EVERYTHING you do. Head over to our website at hitechpod.us for all of our episode pages, send some support at Buy Me a Coffee, our Twitter, our YouTube, our connection to Education Podcast Network, and to see our faces (maybe skip the last one). Need a journal that's secure and reflective? Sign up for the Reflection App today! We promise that the free version is enough, but if you want the extra features, paying up is even better with our affiliate discount.
Dame Kristin Scott Thomas is one of our best, most distinctive, and most watchable actresses, and we're delighted that she's our guest on Rosebud today. Dame Kristin tells Gyles about her childhood, which was at times idyllic but was coloured by the tragic deaths of both her father and step-father. She talks about her impressive and highly talented mother, who brought up five children in the midst of loss. She talks about her move to Paris as a teenager, how she met and married her first husband, and how important her new French family became to her. She talks about her career, working with Prince, how she was cast in The English Patient and her work on stage in The Audience and Elektra. Kristin's life is fascinating, and this is a fascinating, and moving, episode. Learn more about your ad choices. Visit podcastchoices.com/adchoices
A Zoomer Orson Welles protege joins me for a Citizen Kane-like adventure through the entire career of the man who believed in magic perhaps too much -- or, perhaps, too little. We look at almost every film he made, we lean in for his radio, we applaud his theater, we gasp at his tricks and read his biography ("Rosebud" by David Thomson). Maybe we can find out what this giant boy was missing... With Darius Csiky, host of the 8pl8s podcast. Part 1 of 2. For the upcoming second half, double the overall episodes, and smoke-break mini-eps, subscribe to the show at patreon.com/filthyarmenian. Hope to see you there. X/insta @filthyarmenian
In another special episode dedicated to the how, what and why of memory, Gyles talks to his long-time friend and colleague Professor Brett Kahr. Professor Kahr is a practising psychotherapist and an expert on Sigmund Freud, the father of psychotherapy and the inventor of the "talking cure". In this fascinating conversation, Gyles and Prof. Kahr take a detailed look at the power of childhood memories, particularly traumatic ones, to affect our adult lives, and the benefits to be had from examining them and learning from them. Brett also tells Gyles about Freud himself, how he developed his ideas and how he escaped the Nazis and came to London. Gyles talks to Brett about some of the memories we've heard from our guests on Rosebud, and Brett talks about some of the common themes which come up in psychoanalysis - such as dreams and sex. This is a wide-ranging and thought-provoking conversation; we hope you enjoy it. Professor Kahr's book, Coffee with Freud, is available from major bookshops online. Learn more about your ad choices. Visit podcastchoices.com/adchoices
We've hit the Mother Lode with two of the best in the biz! Rosebud Baker and Rachel Feinstein join us for an epic hang with a bunch of laughs. We're drinking Patron on this episode and make it through almost a whole bottle. Check out Rosebud's new special on Netflix, The Mother Lode. Go see Rachel on the road! Support the show and sign up for your $1 per month trial period. Head to https://www.shopify.com/drunk Subscribe to We Might Be Drunk: https://bit.ly/SubscribeToWMBD WMBD Merch: https://wemightbedrunkpod.com/ WMBD Clips Page: https://bit.ly/WMBDClips Rosebud Baker: https://www.rosebudbaker.com https://www.instagram.com/rosebudbaker Rachel Feinstein: https://rachel-feinstein.com/ https://www.instagram.com/rachelfeinstein_ Sam Morril: YouTube Channel: @sammorril Instagram: https://instagram.com/sammorril Tickets/Tour: https://punchup.live/sammorril/tickets Mark Normand: YouTube Channel: @marknormand Instagram: https://www.instagram.com/marknormand Tickets/Tour: https://punchup.live/marknormand/tickets We Might Be Drunk is produced by Gotham Production Studios https://www.gothamproductionstudios.com/ @GothamProductionStudios Producer Matt Peters: https://www.instagram.com/mrmatthewpeters #wemightbedrunk #marknormand #sammorril #podcast #drunkpodcast #comedy #comedian #funny
Rosebud Baker's new comedy special "The Mother Lode" is available on Netflix now! She bonds with Nana over life as a new mother, gives a behind the scenes look into the Saturday Night Live writers' room, and names all of the things she's obsessed with...including TV shows White Lotus, Severance and Real Housewives. Rosebud, Nana and Trish also all show each other what their Instagram explore pages look like and everyone's is VERY different. Rosebud Baker's Comedy Special: https://www.netflix.com/title/81711233 Chapters 0:00 Intro 1:34 SNL 50 & Rosebud Baker's Start at Saturday Night Live 5:01 Rosebud is the Mother of the house! 9:00 Life as a Dog Mom 14:25 X vs Threads & Sharing What Our Explore Pages Look Like 19:20 Rosebud's Marriage & Sobriety 25:18 Rosebud Baker's New Netflix Comedy Special "The Mother Lode" 31:18 Rosebud also has a chihuahua named Mouse 36:33 Shane Gillis Hosting SNL & Rosebud Writing for Update 40:25 Obsessions 57:46 Chevy Chase & Burt Reynolds *** Pat McAuliffe and Joey Camasta host a series of hilarious, no holds barred conversations that will leave you laughing on the floor. Shows air every Wednesday and Friday, with new guests weekly. Headphones required. iTunes: https://podcasts.apple.com/us/podcast/out-about/id1534217005 Spotify: https://open.spotify.com/show/7rjGpD7sOD4zKOJ2eGXK2Q Follow us on... Instagram: @outandaboutpod, @barstoolpat, @JoeyCamasta Twitter: @OutAndAboutPod, @BarstoolPat, @JoeyCamasta TikTok: @outandaboutpod, @pat.mcauliffe, @JoeyCamasta You can find every episode of this show on Apple Podcasts, Spotify or YouTube. Prime Members can listen ad-free on Amazon Music. For more, visit barstool.link/outandabout
In this episode of the Journey of My Mother's Son podcast, I talk with fellow author, Mark Connor. Mark Connor is a boxing trainer and a writer from Saint Paul, Minnesota. His first book, It's About Time (Millions of Copies Sold for Dad), is a saga wrapped around a package of poems, guarded by angels. Through an autobiography reading like a novel, he weaves together a story of love, family, and life with twenty poems running through it, sharing his growth in the Catholic faith, the influence of Irish heritage in his hometown's American identity, his exploration of Lakota tradition within the urban American Indian community, and his understanding of how truth found in different spiritual approaches can lead others—as it led himself back—to its fullness in the revelation of Christ. Mark Connor grew up in Saint Paul, calling himself the product of a "mixed marriage," because his father—a combat wounded Vietnam veteran—grew up across the street from St. Columba parish in the Midway district, while his mother—a school teacher who later became a lawyer—came from the Holy Rosary parish "across the border, in South Minneapolis." Born in Minneapolis and raised in Saint Paul, he began boxing at age 10, at the Mexican American Boxing Club on the city's East Side, the area of the city from which he formed his understanding of the world, anchoring his perception of direction to the family house and the rising of the sun outside his bedroom window. He had 102 amateur fights, made it to three national tournaments, and competed against some of the nation's top world class boxers. He became the Upper Midwest Golden Gloves lightweight champion at 17 and traveled to the Olympic Training Center in Colorado Springs, CO, two days after graduating high school, competing in the 1987 trials for the Pan American Games.
Mark was raised in the East Side parish of St. Pascal Baylon, where he attended first through sixth grade, but his father, a graduate of [Bishop] Cretin High School in Saint Paul, insisted that Mark and his brother, David (13 days less than one year older than Mark), each attend its rival, St. Thomas Academy, an all-boys Catholic military high school in suburban Mendota Heights, from 7th through 12th grade. Having begun writing seriously at 16 and starting college at 18, Mark began an internal struggle between the academic path and boxing, spending one and a half years, respectively, at three schools—Regis University in Denver, CO, the University of St. Thomas in Saint Paul, and the University of Minnesota in Minneapolis—earning his BA in English from the University of Minnesota. He was inactive as a boxer for only one and a half of those years, but never felt he was able to reach his potential while immersed in study, so upon graduation he continued boxing. Mark boxed competitively for two and a half more years; then, deciding not to follow his gym mates—two of whom became world champions—into a professional boxing career, and believing it was already late in life to join the military, he went on an adventure, driving to Seattle, WA, and securing a job on a salmon fishing boat headed to Southeast Alaska. A year later, instead of returning to the commercial fisherman's life, he traveled with a friend to a Lakota Sundance ceremony on the Rosebud reservation, leading eventually to a job at Aín Dah Yung (Our Home) Center, a Native American Indian temporary emergency homeless shelter for youth aged 5 to 17, in Saint Paul. Within this setting, continuing to write freelance articles and periodically working on fiction and poetry, he eventually began a personal training service and worked with both competitive and recreational boxers, professionals as well as amateurs, wrote about boxing, and contemplated his faith.
While recognizing that truth, goodness, and beauty are indeed present in the faith traditions of the indigenous community of friends welcoming him, as both a guest and a relative, he eventually reembraced the beauty, goodness, and truth of his Catholic faith and has since attempted to responsibly discern God's will for him, according to his legitimate talents and desires. In the midst of that sincere effort, at the end of September 2019, his father, who'd been patiently guiding him, died from a heart attack, just before America—and the world—appeared to enter a new era of chaos within which we are attempting to stabilize ourselves. Mark wrote the first lines of his book, It's About Time (Millions of Copies Sold for Dad), the day his father died, Monday, September 30, 2019. However, over the next year, as his country went through the impeachment and acquittal of a president, endured the trauma of an economic shutdown over a mysterious virus coming from a lab leak in China, and his beloved Twin Cities blew up in fiery riots, Mark worked when he could (the boxing gyms and churches were closed due to the Governor's orders), helped his mother, who was diagnosed with a fatal heart disease, and daily mourned his father. He helped protect American Indian buildings with American Indian Movement (AIM) Patrol, and he eventually got part-time work as a bouncer, working bar security when restaurants were allowed to reopen. But he didn't do much until, as Christmas 2020 approached, he resolved that in the coming year he would do something that would make his father happy. Organizing himself and setting his goal, he began writing the book his father—who'd nagged Mark about always insisting he was a writer yet never publishing a book—was never to see published in his earthly lifetime. Beginning the daily process of writing on February 9, 2021, Mark completed the first draft of It's About Time (Millions of Copies Sold for Dad) just before Easter, on the Monday of Holy Week, March 29, 2021.
In this book he tells the tale of his search for a meaningful life, appreciating the gift of God's love that life actually is, and how he sees now that the guardian angels were always guiding him and his family through it all. A contract with a humble little local publisher was severed over editorial differences on Christmas Eve, 2022, so Mark relied on his father's gift, his high school education, accepting help from his St. Thomas Academy contacts: specifically his literary advisor, Dan Flynn (author of Famous Minnesotans: Past and Present), legal advisor Kelly Rowe, and his classmate, Tony Zirnhelt. The book won the 2024 Irish Network Minnesota Bloomsday Literary Award and was published, through Connemara Patch Press, on Father's Day, June 16. Unfortunately, Mark's mother, who'd read the manuscript, never saw it in print, having collapsed in his arms and died October 22, 2023. Yet Mark continues on in hopeful and confident prayer that she—Mrs. Nanette Jane Connor—is watching over him, as she promised she would, next to his father—Robert J. Connor—while gazing perpetually into the Beatific Vision of the face of God. To find out more about Mark, you can check out his website at https://boxersandwritersmagazine.com/.
Civic Media's Political Editor and Founder of The Recombobulation Area, Dan Shafer, is here to talk about the latest Marquette Law School Poll, including numbers not being discussed by the press! And those numbers give us some comfort about how we feel versus how we're represented in Madison. Then, Pete Schwaba, host of Nite Lite, stops by to talk about the new Green And Gold film, and we see if he has pointed out the one aspect of the movie that Jane can't seem to move past. You know what's the best pairing with a Wisconsin Fish Fry? This Shouldn't Be A Thing - Hot Stuff Edition. As always, thank you for listening, texting and calling - we couldn't do this without you! Don't forget to download the free Civic Media app and take us wherever you are in the world! Matenaer On Air is a part of the Civic Media radio network and airs Monday through Friday from 10 am - noon across the state. Subscribe to the podcast to be sure not to miss out on a single episode! You can also rate us on your podcast distribution center of choice; ratings go a long way! To learn more about the show and all of the programming across the Civic Media network, head over to https://civicmedia.us/shows to see the entire broadcast line up. Follow the show on Facebook, X and YouTube to keep up with Jane and the show! Guests: Dan Shafer, Pete Schwaba
We've another action-packed episode of Gyles's teenage diary for you, diary fans. The young Brandreth continues to busy himself on multiple fronts - with drama, journalism and politics. As his form teacher says in his report: "Gyles appears to be a person of boundless energy and enthusiasm..." But all is not entirely plain-sailing in this episode, as Gyles gets involved in a scandal at school and learns a painful lesson, with the help of his extremely understanding teachers. If you're enjoying the diaries but haven't heard them all, you can go back to the start - they're all marked chronologically and begin with episode 1. We'll be back with more diaries in two weeks' time. Learn more about your ad choices. Visit podcastchoices.com/adchoices
“Gold Watches” from East Rosebud customers and a Hot Chick from Wyoming….. #flyfishing #eastrosebud2025 #eastrosebudflyandtackle #eastrosebudthermop2025 #thefoulhookedwhitey
Rosebud is thrilled to present to you one of the truly great Hollywood actresses of her generation: Sigourney Weaver. Known for her groundbreaking performances as Ripley in Ridley Scott's Alien movies, Sigourney redefined what women could do in the movies. Sigourney talks to Gyles about her fascinating parents - her father, Pat Weaver, was a leading US television executive who pioneered morning television, and her mother, the English actress Elizabeth Inglis. She tells Gyles how she came by her name, and about her first kiss, and how she struggled to get used to her height. She talks about beginning acting at Stanford and later Yale, and how her confidence was knocked by her Yale tutors. She talks about working with Woody Allen and on the set of Alien. Halfway through the show, Sigourney and Gyles are joined by a special guest - an Oscar-winning British actress who Sigourney has loved since she was a child. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Tariq Ali was one of the most famous and recognisable figures of the revolutionary student movements of the 1960s. In his distinctive red Aquascutum mac he led marches, gave speeches and galvanised opposition to the Vietnam War and other big issues of the time. Born in Lahore just before Partition, he grew up in a prominent family of politicians and became interested in radical politics from a young age. At only 16 he organised his first strike, among the latrine-wallahs in the hill station where his family holidayed. At 18 he was sent to Oxford University, where he became president of the Union and impressed people with his brilliance at public speaking and debating. He tells Gyles about all this and about some of the amazing people he met along the way - including Mick Jagger, John Lennon, Marlon Brando and Malcolm X. If you aren't familiar with Tariq Ali's story, this is really worth listening to for a fascinating insight into the radical and optimistic days of the late 60s and early 70s, when anything seemed possible. Tariq's memoir, You Can't Please All, is out now, published by Verso. https://www.versobooks.com/en-gb/blogs/authors/ali-tariq?srsltid=AfmBOorNoyy2ZaKtYHa5Zay30ii1E1ieQJ76ERKTHvDgvfIEB9t-zreO Learn more about your ad choices. Visit podcastchoices.com/adchoices
My HoneyDew this week is comedian Rosebud Baker! Check out Rosebud's newest special, The Mother Lode, out on Netflix today! Rosebud joins me to highlight the lowlights of her younger sister's sudden and tragic passing during their childhood. She opens up about the personal impact of her sister's death and how her perspective has evolved since becoming a mother herself. We dive into the complexities of navigating emotions and family dynamics after loss, the ways we keep lost loved ones' memories alive, and how Rosebud's mother went on to pass the Virginia Graeme Baker Pool and Spa Safety Act in honor of her late daughter. CATCH ME ON TOUR https://www.ryansickler.com/tour San Jose, CA - Feb. 28 - March 1st Madison, WI - April 12th (Special Taping) SUBSCRIBE TO MY YOUTUBE and watch full episodes of The Dew every toozdee! https://youtube.com/@rsickler SUBSCRIBE TO MY PATREON - The HoneyDew with Y'all, where I Highlight the Lowlights with Y'all! Get audio and video of The HoneyDew a day early, ad-free at no additional cost! It's only $5/month! AND we just added a second tier. For a total of $8/month, you get everything from the first tier, PLUS The Wayback a day early, ad-free AND censor free AND extra bonus content you won't see anywhere else! https://www.patreon.com/TheHoneyDew What's your story?? Submit at honeydewpodcast@gmail.com Get Your HoneyDew Gear Today! https://shop.ryansickler.com/ Ringtones Are Available Now! https://www.apple.com/itunes/ http://ryansickler.com/ https://thehoneydewpodcast.com/ SUBSCRIBE TO THE CRABFEAST PODCAST https://podcasts.apple.com/us/podcast/the-crabfeast-with-ryan-sickler-and-jay-larson/id1452403187 BetterHelp -The HoneyDew is sponsored by BetterHelp, get 10% off your first month at https://www.Betterhelp.com/HONEYDEW
Rosebud Baker doesn't pull punches—she lands them. With her razor-sharp wit and unflinching honesty, she's become one of the most fearless voices in comedy. Her latest stand-up special, The Mother Lode, proves why she's a powerhouse, delivering darkly hilarious takes on life, love, and everything in between. Beyond the stage, she's bringing the heat in the writers' room at Saturday Night Live, shaping some of the sharpest comedy on TV. Whether she's on stage or behind the scenes, Rosebud's comedy is as bold as it is unforgettable. #rosebudbaker #andrewsantino #whiskeyginger #podcast ============================================ Sponsor Whiskey Ginger: https://public.liveread.io/media-kit/whiskeyginger SUPPORT OUR SPONSORS SQUARESPACE Get that site up and running now! 10% off your order https://squarespace.com/whiskey KICKOFF GET YOUR 1ST MONTH FOR $1 http://getkickoff.com/whiskey INDOCHINO PROMO CODE: WHISKEY GET 20% OFF $499 OR MORE https://indochino.com/ NORD VPN 4 MONTHS FREE WITH A 2 YEAR PLAN https://nordvpn.com/whiskey ======================================= Follow Andrew Santino: https://www.instagram.com/cheetosantino/ https://twitter.com/CheetoSantino Follow Whiskey Ginger: https://www.instagram.com/whiskeygingerpodcast https://twitter.com/whiskeygingerpodcast Produced and edited by Joe Faria https://www.instagram.com/itsjoefaria Learn more about your ad choices. Visit megaphone.fm/adchoices
It's Chit Chat Wednesday on The JTrain Podcast, and Jared is joined by the razor-sharp, brutally funny Rosebud Baker! Fresh off the release of her new Netflix special The Motherlode, Rosebud gives us the inside scoop on what it was like filming half her special while eight months pregnant and the other half while sleep-deprived and covered in baby spit-up. Jared and Rosebud take a hilarious deep dive into the rollercoaster of motherhood, marriage, and stand-up comedy, playing a game of "Peaks and Pits" to unpack everything from parenting in NYC to the subtle art of keeping a relationship spicy when your biggest turn-on is uninterrupted sleep. Rosebud also shares the surreal experience of being worshipped like a rockstar by her toddler—right before being judged by her nanny for prioritizing podcasting over nap time. The two swap takes on sobriety, why being the only sober person at a work party is a social experiment in patience, and their irrational (but very real) fears—like plane crashes, parenting fails, and the existential horror of running into an ex while holding a screaming child. Get ready for sharp observations, unfiltered laughs, and a much-needed reminder to stream The Motherlode early and often (because algorithms, people!). Don't miss this episode packed with Chit Chat chaos and top-tier parenting wisdom—aka “don't drop the baby, and don't drop the bit.” Want more JTRAIN? Join the Patreon for more, listen to Coffee with JTrain, & send an email to
After the relationship turbulence of the first six months of the year (you can hear all about that in the last episode - episode 8), our hero, the 15-year-old Gyles, sensibly turns his attention away from romance and onto other things. The most pressing one being his upcoming performance in the school production of Twelfth Night. Can he follow the advice of his drama teacher and learn to act "from the inside out?" And stop putting on silly voices and being a ham? Meanwhile, there's major news on the international stage as President John F Kennedy is shot while riding in a motorcade through the centre of Dallas, Texas. Gyles, in his inimitable style, keeps us up to date with all this and more in another episode of his not-to-be-missed teenage diaries. Cue the music. Learn more about your ad choices. Visit podcastchoices.com/adchoices
It's Valentine's Day, and we at Rosebud knew we needed to find someone special, to spread the love to you, our gorgeous listeners: we needed to find our very own Rosebud Romeo. Someone debonair, someone handsome, someone with a talent for being in love... and who better than possibly the most charming actor of them all: Nigel Havers? Nigel talks to Gyles about his enchanted life. The son of an eminent lawyer, Nigel was sent to boarding school at 6, where he discovered his talent for acting. He went to the Arts Educational School as a teenager, and this is where the fun begins... this is a tale of girlfriends, the Rolling Stones, and meeting Margaret Thatcher. Enjoy this fun, frothy and fascinating episode, and happy Valentine's Day! Learn more about your ad choices. Visit podcastchoices.com/adchoices
Lucy Fleming and Simon Williams give Rosebud a truly romantic Valentine's treat today, as they read the wartime letters of Lucy's parents: Celia Johnson, the great actress and star of Brief Encounter, and Peter Fleming, writer and older brother of Ian Fleming. The letters you hear in this episode were written from 1940 until VE Day, while Peter was away in Burma and India, working for Lord Wavell, and Celia was at home looking after their young son and becoming known as a film actress. In 1945 she played the lead in Brief Encounter, one of the greatest British films ever made. These letters are so tender and evocative, and transport you to another time - one of bravery, decency and self-deprecation. Lucy and Simon are themselves married, have been together for 40 years, and are both successful actors - Archers fans will, in particular, be very familiar with their voices! They talk to Gyles about their first memories, how they met, and tell Gyles the secret to a long and happy marriage. Lucy and Simon read these letters live on stage in their brilliant show, Posting Letters to the Moon, which is touring the UK now - tickets are available at https://www.postingletterstothemoon.com/tickets. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Roz slides down the banister just in time to welcome the hilarious comedian, actress, and Emmy Award-winning writer, Rosebud Baker! Spirits gather around as the two discuss a haunted Toys “R” Us, spirit babies, and Rosebud’s unexpected encounter with a medium at a comedy club! Want to share YOUR paranormal experience on the podcast? Email your *short* stories to GhostedByRoz@gmail.com and maybe Roz will read it out loud on the show... or even call you! Be sure to follow the show @GhostedByRoz on Instagram. Support this podcast by shopping our latest sponsor deals and promotions at this link: https://bit.ly/3WwYCsr See omnystudio.com/listener for privacy information.
Mark Gatiss: actor, writer, producer, director and creator - this is someone who makes Gyles Brandreth look like an under-achiever! From The League of Gentlemen, to Doctor Who, Sherlock and the Christmas Ghost Stories, Mark has been behind some of the most distinctive and original television of recent times. He's also an actor - recently winning awards for his portrayal of Sir John Gielgud in Jack Thorne's The Motive and The Cue. And it goes without saying that he's a fascinating person, with a fascinating story to tell. In this episode, which was recorded live at The Orange Tree Theatre in Richmond, Mark talks to Gyles about his working class childhood, growing up opposite the local psychiatric hospital. He talks about his hatred of PE lessons, his early obsession with TV and how he became interested in horror. He talks about coming out to his parents and how he met the rest of the League of Gentlemen at drama school. With thanks to the team at The Orange Tree Theatre and to Mark Gatiss for this brilliant interview. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Rosebud Baker is here on this week’s Steph Infection! Steph chats with Rosebud about her new special, The Mother Lode, releasing on Netflix on February 18th, and how she combined two separate hours into one special. They also chat all about Rosebud’s recent pregnancy, the birth of her first child, and much more! Steph Infection is sponsored by BetterHelp. Give online therapy a try at betterhelp.com/STEPHPOD and get on your way to being your best self. Follow @Steph_Tolev and @Steph_Infection_Podcast on Instagram. Send in your body stories to be featured on the pod! See Steph on tour this year! US Dates: March 13-15 Denver; March 21-22 Albuquerque; April 25-27 Pittsburgh; May 2-4 Milwaukee; May 22 Desert Ridge; May 23-25 Tempe. Get tickets at https://punchup.live/stephtolev Be sure to follow @rosebudbaker on Instagram! Steph Tolev caught fire on the BILL BURR PRESENTS: FRIENDS WHO KILL Netflix special. She was named a COMEDIAN YOU SHOULD AND WILL KNOW by Vulture, which recognized her as one of Canada’s funniest exports. She was featured on Comedy Central’s THE RINGERS stand-up series, and season two of UNPROTECTED SETS. Steph has appeared in Comedy Central’s CORPORATE and starred in an episode of the Sarah Silverman-produced PLEASE UNDERSTAND ME. Steph has been well received at festivals all over the world and headlines clubs across the country. She also has a hit podcast on ALL THINGS COMEDY called “STEPH INFECTION” and appears in the feature OLD DADS, starring and written by Bill Burr, on Netflix. Check out her tour dates to see her live!
The Foul Hooked Whitey is back in Episode 5, Part 1 of "The Rise and Fall of East Rosebud High School....."
You're in luck, diary fans, as this is one of the most enthralling episodes yet of Gyles's schoolboy diaries. It's the first half of 1963 and at school, Gyles is learning that relationships are complicated things, and that it's not usually OK to love two people at the same time. Outside school, the sexual revolution is beginning and Britain is gripped by the Profumo affair. Learn more about your ad choices. Visit podcastchoices.com/adchoices
In this episode, Vanessa Feltz wows Gyles with some of the funniest stories he's ever heard on Rosebud: from a first kiss with Pete Tong by the pool in Lanzarote, to an illicit encounter with a stranger who was bowled over by the magnificence of her bosom... this show is full of laughs and brilliant anecdotes. Vanessa takes Gyles two generations back in the Feltz family, to her grandparents and great grandparents in the tenements of the East End, and brings these extraordinary characters to life. She tells Gyles about her own childhood in the 'Beverley Hills of North London', and about her school days, when she was known as 'Vanessa the Undresser', and could do alluring things with a red woollen scarf. She talks to Gyles about her career and why it's important to teach your grandchildren how to make a human chain. Vanessa's autobiography, 'Vanessa Bares All' is out now. It's hugely entertaining, and well worth reading. Thanks so much to Vanessa for this brilliant interview, and for making Gyles laugh so much. Learn more about your ad choices. Visit podcastchoices.com/adchoices
The Foul Hooked Whitey is back discussing the impact of East Rosebud's rock-and-roll-oriented custom logos in Part 4 of "The Rise and Fall of East Rosebud High School."
In this episode of More Rosebud, the hypnotist and self-help writer Paul McKenna talks to Gyles about a subject you may have heard about before on Rosebud: manifesting. Plus he also tells Gyles about his early life, how he became a hypnotist, and he shows Gyles some relaxation techniques. So beware, this episode may make you exceedingly relaxed! Make sure you aren't driving or operating machinery while listening. Paul's book, Power Manifesting, is out now. Cue the music! Learn more about your ad choices. Visit podcastchoices.com/adchoices
The Foul Hooked Whitey returns and gives up his key staff in Part 3 of "The Rise and Fall of East Rosebud High"
The Foul Hooked Whitey brings you Part 2 of "The Rise and Fall of East Rosebud High School". In this episode, the focus is on the hoops a fly shop owner has to jump through to obtain the key vendors in the sport.....
Krishnan Guru-Murthy has been presenting the news on Channel 4 since 1998, but his career began long before that, when he was only a teenager. In fact, Gyles is a bit jealous when he finds out that Krishnan was even younger than him when he first appeared on TV! In this interview, Gyles also learns about the fascinating story of Krishnan's parents, how they met, and his father's inspirational journey from extreme poverty to NHS consultant. Krishnan talks about growing up in Lancashire in the 80s, his success at school and the racism in the playground, and then tells Gyles how he started to work as a journalist when he was only 18. He talks about his life now and his Strictly experience. This is a wide-ranging and stimulating conversation with someone who has been at the heart of British current affairs for a generation - thank you Krishnan for coming on Rosebud! Learn more about your ad choices. Visit podcastchoices.com/adchoices
The Foul Hooked Whitey is back to rant in Part 1 of "The Rise and Fall of East Rosebud High School".....
Welcome to part 7 of Gyles reading from the diaries he has kept since he was 10. Gyles is now 14, and a pupil at Bedales, a progressive Hampshire boarding school populated by many CND-supporting teachers and the children of liberal-minded, artistic parents. In this episode we hear about Gyles's summer holidays, spent on his own in Paris, learning French and staying in a boarding house. We hear about his success in the school production. We hear about his teachers - many of whom are "very CND" and some of whom are, shock horror, vegetarian. We hear Gyles's review of the best bits of 1962 and about the records he's enjoyed listening to that year. If you've only just discovered these episodes, you can either start here, or go back to episode 1 of the diaries and start at the beginning (when Gyles was at prep school). Enjoy this. Learn more about your ad choices. Visit podcastchoices.com/adchoices
We're honoured to have the great Dame Penelope Wilton as our very special guest on Rosebud today, in a rare podcast interview for this distinguished and well-loved actress. Penelope talks to Gyles about her happy childhood, in and around Knightsbridge and Kensington in London - we find out about her interesting neighbours, Mr Onion and the one-armed colonel. She tells Gyles about her mother's illness and her unhappy time at boarding school. She talks about her drama school days, and her breakthrough role in Harold Pinter's classic play Betrayal at the National Theatre. She talks about the roles on TV which made her a household name: Ever Decreasing Circles and, more recently, Afterlife. We're delighted to give Rosebud listeners the chance to spend time with this delightful, talented performer. Enjoy this. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Holmberg's Morning Sickness - Brady Report - Wednesday January 15, 2025 Learn more about your ad choices. Visit podcastchoices.com/adchoices
Danny Baker tells Gyles his extraordinary rags to riches story - from a council house in Bermondsey to partying with Elton John and Rod Stewart. He tells Gyles about his extraordinary father, Spud Baker. He also talks about getting cancer, and about getting cancelled, and how he survived both by being a "natural born Pollyanna". This story is full of wit, charm, funny stories, and Danny's characteristic ebullience. Enjoy. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Due to overwhelming demand (>15x applications:slots), we are closing CFPs for AI Engineer Summit NYC today. Last call! Thanks, we'll be reaching out to all shortly! The world's top AI blogger and friend of every pod, Simon Willison, dropped a monster 2024 recap: Things we learned about LLMs in 2024. Brian of the excellent Techmeme Ride Home pinged us for a connection and a special crossover episode, our first in 2025. The target audience for this podcast is a tech-literate, but non-technical one. You can see Simon's notes for AI Engineers in his World's Fair Keynote. Timestamps* 00:00 Introduction and Guest Welcome* 01:06 State of AI in 2025* 01:43 Advancements in AI Models* 03:59 Cost Efficiency in AI* 06:16 Challenges and Competition in AI* 17:15 AI Agents and Their Limitations* 26:12 Multimodal AI and Future Prospects* 35:29 Exploring Video Avatar Companies* 36:24 AI Influencers and Their Future* 37:12 Simplifying Content Creation with AI* 38:30 The Importance of Credibility in AI* 41:36 The Future of LLM User Interfaces* 48:58 Local LLMs: A Growing Interest* 01:07:22 AI Wearables: The Next Big Thing* 01:10:16 Wrapping Up and Final ThoughtsTranscript[00:00:00] Introduction and Guest Welcome[00:00:00] Brian: Welcome to the first bonus episode of the Techmeme Ride Home for the year 2025. I'm your host as always, Brian McCullough. Listeners to the pod over the last year know that I have made a habit of quoting from Simon Willison when new stuff happens in AI, from his blog. Simon has become a go-to for many folks in terms of analyzing things and criticizing things in the AI space.[00:00:33] Brian: I've wanted to talk to you for a long time, Simon. So thank you for coming on the show. No, it's a privilege to be here. 
And the person that made this connection happen is our friend Swyx, who has been on the show before, even going back to the Twitter Spaces days, but is also an AI guru in their own right. Swyx, thanks for coming on the show also.[00:00:54] swyx (2): Thanks. I'm happy to be on and have been a regular listener, so just happy to [00:01:00] contribute as well.[00:01:00] Brian: And a good friend of the pod, as they say. Alright, let's go right into it.[00:01:06] State of AI in 2025[00:01:06] Brian: Simon, I'm going to do the most unfair, broad question first, so let's get it out of the way. The year 2025. Broadly, what is the state of AI as we begin this year?[00:01:20] Brian: Whatever you want to say, I don't want to lead the witness.[00:01:22] Simon: Wow. So many things, right? I mean, the big thing is everything's got really good and fast and cheap. Like, that was the trend throughout all of 2024. The good models got so much cheaper, they got so much faster, they got multimodal, right? The image stuff isn't even a surprise anymore.[00:01:39] Simon: They're doing video, all of that kind of stuff. So that's all really exciting.[00:01:43] Advancements in AI Models[00:01:43] Simon: At the same time, they didn't get massively better than GPT-4, which was a bit of a surprise. So one of the open questions is, are we going to see a huge leap? But I kind of feel like that's a bit of a distraction, because GPT-4, but way cheaper, with much larger context lengths, and which [00:02:00] can do multimodal,[00:02:01] Simon: is better, right? That's a better model, even if it's not.[00:02:05] Brian: What people were expecting, or hoping - maybe not expecting is not the right word, but hoping - was that we would see another step change, right? From like GPT-2 to 3 to 4, we were expecting or hoping that maybe we were going to see the next evolution.[00:02:21] Simon: We did see that, but not in the way we expected. 
We thought the model was just going to get smarter, and instead we got massive drops in price. We got all of these new capabilities. You can talk to the things now, right? They can do simulated audio input, all of that kind of stuff. And so it's interesting to me that the models improved in all of these ways we weren't necessarily expecting.[00:02:43] Simon: I didn't know it would be able to do an impersonation of Santa Claus - like, you know, I talked to it through my phone and showed it what I was seeing - by the end of 2024. But yeah, we didn't get that GPT-5 step. And that's one of the big open questions: is that actually just around the corner, and we'll have a bunch of GPT-5-class models drop in the [00:03:00] next few months?[00:03:00] Simon: Or is there a limit?[00:03:03] Brian: If you were a betting man and wanted to put money on it, do you expect to see a phase change, step change in 2025?[00:03:11] Simon: I don't, particularly. I think all of the trends we're seeing right now are going to keep on going, especially the inference-time compute, right?[00:03:21] Simon: The trick that o1 and o3 are doing, which means that you can solve harder problems, but they cost more and churn away for longer. I think that's going to happen because that's already proven to work. I don't know. Maybe there will be a step change to a GPT-5 level, but honestly, I'd be completely happy if we got what we've got right now, but cheaper and faster, with more capabilities and longer contexts and so forth. That would be thrilling to me.[00:03:46] Brian: Digging into what you've just said - one of the things that, by the way, I hope to link in the show notes is Simon's year-end post about the things we learned about LLMs in 2024. 
Look for that in the show notes.[00:03:59] Cost Efficiency in AI[00:03:59] Brian: One of the things that you [00:04:00] did say, that you alluded to even right there, was that in the last year, you felt like the GPT-4 barrier was broken, i.e. other models, even open source ones, are now regularly matching sort of the state of the art.[00:04:13] Simon: Well, it's interesting, right? So the GPT-4 barrier: a year ago, the best available model was OpenAI's GPT-4 and nobody else had even come close to it.[00:04:22] Simon: And they'd been in the lead for like nine months, right? That thing came out in what, February, March of 2023. And for the rest of 2023, nobody else came close. And so at the start of last year, like a year ago, the big question was: why has nobody beaten them yet? Like, what do they know that the rest of the industry doesn't know?[00:04:40] Simon: And today, I've counted 18 organizations other than OpenAI who've put out a model which clearly beats that GPT-4 from a year ago. Like, maybe they're not better than GPT-4o, but that barrier got completely smashed. And yeah, a few of those I've run on my laptop, which is wild to me.[00:04:59] Simon: Like, [00:05:00] it was very, very wild. It felt very clear to me a year ago that if you want GPT-4, you need a rack of $40,000 GPUs just to run the thing. And that turned out not to be true. This is that big trend from last year: the models getting more efficient, cheaper to run, just as capable with smaller weights and so forth.[00:05:20] Simon: And I ran another GPT-4-class model on my laptop this morning, right? Microsoft's Phi-4 just came out. And if you look at the benchmarks, it's definitely up there with GPT-4o. It's probably not as good when you actually get into the vibes of the thing, but it's a 14-gigabyte download and I can run it on a MacBook Pro.[00:05:38] Simon: Like, who saw that coming? 
The most exciting moment, right at the close of the year, on Christmas Day just a few weeks ago, was when DeepSeek dropped their DeepSeek v3 model on Hugging Face without even a readme file. It was just like a giant binary blob that I can't run on my laptop. It's too big. But in all of the benchmarks, it's now by far the best available [00:06:00] open weights model.[00:06:01] Simon: Like, it's beating the Meta Llamas and so forth. And that was trained for five and a half million dollars, which is a tenth of the price that people thought it cost to train these things. So everything's trending smaller and faster and more efficient.[00:06:15] Brian: Well, okay.[00:06:16] Challenges and Competition in AI[00:06:16] Brian: I kind of was going to get to that later, but let's combine this with what I was going to ask you next, which is: you're talking, also in the piece, about the LLM prices crashing, which I've even seen in projects that I'm working on. But explain that to a general audience, because we hear all the time that LLMs are eye-wateringly expensive to run. We'll come back to the cheap Chinese LLM, but first of all, for the end user, what you're suggesting is that we're starting to see the cost come down sort of in the traditional technology way of costs coming down over time.[00:06:49] Simon: Yes, but very aggressively.[00:06:51] Simon: I mean, my favorite example here is if you look at GPT-3, so OpenAI's GPT-3, which was the best available model in [00:07:00] 2022 and through most of 2023. The models that we have today, the OpenAI models, are a hundred times cheaper. So there was a 100x drop in price for OpenAI from their best available model, like two and a half years ago, to today.[00:07:13] Simon: And[00:07:14] Brian: just to be clear, not to train the model, but for the use of tokens and things. 
Exactly,[00:07:20] Simon: for running prompts through them. And then when you look at the really top-tier model providers right now, I think they are OpenAI, Anthropic, Google, and Meta. And there are a bunch of others that I could list there as well.[00:07:32] Simon: Mistral are very good. The DeepSeek and Qwen models have got great. There's a whole bunch of providers serving really good models. But even if you just look at the sort of big brand name providers, they all offer models now that are a fraction of the price of the models we were using last year.[00:07:49] Simon: I think I've got some numbers that I threw into my blog entry here. Yeah. Like Gemini 1.5 Flash, that's Google's fast high-quality model - [00:08:00] how much is that? It's $0.075 per million tokens. Like, these numbers are getting so small, we just do cents per million now.[00:08:09] swyx (2): Cents per million.[00:08:10] Simon: Cents per million makes a lot more sense.[00:08:12] Simon: Yeah, they have one model, 1.5 Flash 8B, the absolute cheapest of the Google models, which is 27 times cheaper than GPT-3.5 Turbo was a year ago. That's it. And GPT-3.5 Turbo, that was the cheap model, right? Now we've got something 27 times cheaper, and this Google one can do image recognition, it can do million-token context, all of those tricks.[00:08:36] Simon: It really is startling how inexpensive some of this stuff has got.[00:08:41] Brian: Now, are we assuming that this happening is directly the result of competition? 
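To make the per-million-token arithmetic above concrete, here is a minimal Python sketch. The Gemini 1.5 Flash price is the figure quoted in the conversation; the GPT-3.5 Turbo figure is an assumed illustrative snapshot (not an authoritative rate), and the Flash 8B price is simply derived from the quoted "27 times cheaper" claim.

```python
# Illustrative per-million-token prices (snapshots, not current rates).
# The GPT-3.5 Turbo figure is an assumption for illustration only.
PRICE_PER_MILLION_TOKENS = {
    "gpt-3.5-turbo (a year ago)": 0.50,   # assumed snapshot price
    "gemini-1.5-flash": 0.075,            # figure quoted in the conversation
    "gemini-1.5-flash-8b": 0.50 / 27,     # "27 times cheaper than GPT-3.5 Turbo"
}

def prompt_cost(model: str, tokens: int) -> float:
    """Dollar cost of running `tokens` tokens through `model`."""
    return PRICE_PER_MILLION_TOKENS[model] * tokens / 1_000_000

# Even a fairly large 100,000-token prompt costs a fraction of a cent
# on the cheap models:
for model in PRICE_PER_MILLION_TOKENS:
    print(f"{model}: ${prompt_cost(model, 100_000):.6f}")
```

The point of the sketch is just the orders of magnitude: at cents-per-million pricing, individual prompts are effectively too cheap to meter.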
Because again, you know, OpenAI - and probably they're doing this for their own almost political reasons, strategic reasons - keeps saying, we're losing money on everything, even the $200 subscription.[00:08:56] Brian: So the prices probably wouldn't be [00:09:00] coming down if there wasn't intense competition in this space.[00:09:04] Simon: The competition is absolutely part of it, but I have it on good authority from sources I trust that Google Gemini is not operating at a loss. Like, the amount of electricity to run a prompt is less than they charge you.[00:09:16] Simon: And the same thing for Amazon Nova. Like, somebody found an Amazon executive and got them to say, yeah, we're not losing money on this. I don't know about Anthropic and OpenAI, but clearly that demonstrates it is possible to run these things at these ludicrously low prices and still not be running at a loss, if you discount the army of PhDs and the training costs and all of that kind of stuff.[00:09:36] Brian: One more for me before I let Swyx jump in here. To come back to DeepSeek and this idea that you could train, you know, a cutting-edge model for $6 million: I was saying on the show, like six months ago, that if we are getting to the point where each new model would cost a billion, ten billion, a hundred billion dollars to train, at some point only nation states would be able to train the new models. Do you [00:10:00] expect what DeepSeek and maybe others are proving to sort of blow that up? Or is there like some sort of a parallel track here that maybe I don't have the nous to understand the difference?[00:10:11] Brian: Are the models going to go, you know, up to a hundred billion dollars, or can we get them down, sort of like DeepSeek has proven?[00:10:18] Simon: So I'm the wrong person to answer that, because I don't work in the lab training these models. 
So I can give you my completely uninformed opinion, which is: I felt like the DeepSeek thing was a bombshell.[00:10:27] Simon: That was an absolute bombshell when they came out and said, hey, look, we've trained one of the best available models and it cost us five and a half million dollars to do it. And one of the reasons it's so efficient is that we put all of these export controls in to stop Chinese companies from buying giant quantities of GPUs.[00:10:44] Simon: So they were forced to go as efficient as possible. And the fact that they've demonstrated that that's possible to do - I think it does completely tear apart this mental model we had before, that the training runs just keep on getting more and more expensive and the number of [00:11:00] organizations that can afford to run these training runs keeps on shrinking.[00:11:03] Simon: That's been blown out of the water. So yeah, again, this was our Christmas gift. This was the thing they dropped on Christmas Day. Yeah, it makes me really optimistic. It feels like there was so much low-hanging fruit in terms of the efficiency of both inference and training, and we spent a whole bunch of last year exploring that and getting results from it.[00:11:22] Simon: I think there's probably a lot left. I would not be surprised to see even better models trained spending even less money over the next six months.[00:11:31] swyx (2): Yeah. So I think there's an unspoken angle here on what exactly the Chinese labs are trying to do, because DeepSeek made a lot of noise[00:11:41] swyx (2): around the fact that they trained their model for six million dollars, and nobody quite believes them. Like, it's very, very rare for a lab to trumpet the fact that they're doing it for so cheap. They're not trying to get anyone to buy them. So why [00:12:00] are they doing this? 
They make it very, very obvious.[00:12:05] swyx (2): DeepSeek is about 150 employees. It's an order of magnitude smaller than at least Anthropic, and maybe more so compared to OpenAI. And so what's the end game here? Are they just trying to show that the Chinese are better than us?[00:12:21] Simon: So DeepSeek, it's the arm of a hedge fund, right?[00:12:25] Simon: It's an algorithmic quant trading thing. So I would love to get more insight into how that organization works. My assumption from what I've seen is it looks like they're basically just flexing. They're like, hey, look at how utterly brilliant we are with this amazing thing that we've done. And it's working, right?[00:12:43] Simon: But is that it? Is this just their kind of, this is why our company is so amazing, look at this thing that we've done? Or? I don't know. I'd love to get some insight from within that industry as to how that's all playing out.[00:12:57] swyx (2): The prevailing theory among the Local Llama [00:13:00] crew and the Twitter crew that I index for my newsletter is that there is some amount of copying going on.[00:13:06] swyx (2): There's Sam Altman, you know, tweeting about how they're being copied. And then there are other OpenAI employees that have said stuff that is similar: that DeepSeek's rate of progress is how U.S. intelligence estimates the number of foreign spies embedded in top labs.[00:13:22] swyx (2): Because a lot of these ideas do spread around, but they surprisingly have a very high density of them in the DeepSeek v3 technical report. So it's interesting. We don't know how many tokens they used. I think, you know, people have run analysis on how often DeepSeek thinks it is Claude or thinks it is OpenAI's GPT-4.[00:13:40] swyx (2): And we don't know. 
I think for me, like, yeah, we basically will never know as external commentators. I think what's interesting is: where does this go? Is there a logical floor or bottom? By my estimations, for the same amount of ELO, from the start of last year to the end of last year, cost went down by a thousand X for [00:14:00] GPT-4-level intelligence.[00:14:02] swyx (2): Do they go down a thousand X this year?[00:14:04] Simon: That's a fascinating question. Yeah.[00:14:06] swyx (2): Is there a Moore's Law going on, or did we just get a one-off benefit last year for some weird reason?[00:14:14] Simon: My uninformed hunch is low-hanging fruit. I feel like up until a year ago, people hadn't been focusing on efficiency at all. You know, it was all about, what can we get these weird-shaped things to do?[00:14:24] Simon: And now, once we've sort of hit that - okay, we know that we can get them to do what GPT-4 can do - thousands of researchers around the world all focus on: okay, how do we make this more efficient? How do we strip out all of the weights that have stuff in that doesn't really matter?[00:14:39] Simon: All of that kind of thing. So yeah, maybe that was it. Maybe 2024 was a freak year of all of the low-hanging fruit coming out at once, and we'll actually see a reduction in that rate of improvement in terms of efficiency. I think we'll know for sure in about three months' time if that trend's going to continue or not.[00:14:58] swyx (2): I agree. You know, I [00:15:00] think the other thing that you mentioned was that DeepSeek v3 was the gift that was given from DeepSeek over Christmas, but I feel like the other thing that might be underrated was DeepSeek R1,[00:15:11] swyx (2): which is a reasoning model you can run on your laptop. 
And I think that's something that a lot of people are looking ahead to this year.[00:15:18] Simon: Oh, did they[00:15:18] Simon: release the weights for that one?[00:15:20] swyx (2): Yeah.[00:15:21] Simon: Oh my goodness, I missed that. I've been playing with Qwen. So the other big Chinese AI player is Alibaba's Qwen. Actually, yeah, sorry, R1 is available as an API.[00:15:38] Simon: So Alibaba's Qwen have released two reasoning models that I've run on my laptop. The first one was QwQ, and the second one was QVQ, because the second one's a vision model. So you can give it vision puzzles and a prompt. These things are so much fun to run, because they think out loud. OpenAI's o1 sort of hides its thinking process. The Qwen ones don't.[00:15:59] Simon: They just [00:16:00] churn away. And so you'll give it a problem and it will output literally dozens of paragraphs of text about how it's thinking. My favorite thing that happened with QwQ is I asked it to draw me a pelican on a bicycle in SVG. That's like my standard stupid prompt. And for some reason it thought in Chinese.[00:16:18] Simon: It spat out a whole bunch of Chinese text onto my terminal on my laptop, and then at the end it gave me quite a good sort of artistic pelican on a bicycle. And I ran it all through Google Translate, and yeah, it was contemplating the nature of SVG files as a starting point. And the fact that my laptop can think in Chinese now is so delightful.[00:16:40] Simon: It's so much fun watching it do that.[00:16:43] swyx (2): Yeah, I think Andrej Karpathy was saying, you know, we know that we have achieved proper reasoning inside of these models when they stop thinking in English, and perhaps the best form of thought is in Chinese. 
But yeah, for listeners who don't know Simon's blog: whenever a new model comes out - I don't know how you do it, but [00:17:00] you're always the first to run Pelican Bench on these models.[00:17:02] swyx (2): I just did it for Phi-4.[00:17:05] Simon: Yeah.[00:17:07] swyx (2): So I really appreciate that. You should check it out. These are not theoretical. Simon's blog actually shows them.[00:17:12] Brian: Let me put on the investor hat for a second.[00:17:15] AI Agents and Their Limitations[00:17:15] Brian: Because from the investor side of things, a lot of the VCs that I know are really hot on agents, and this is the year of agents - but last year was supposed to be the year of agents as well. Lots of money flowing towards agentic startups.[00:17:32] Brian: But in your piece that, again, we're hopefully going to have linked in the show notes, you sort of suggest there's a fundamental flaw in AI agents as they exist right now. Let me quote you, and then I'd love to dive into this. You said: I remain skeptical as to their ability, based once again on the challenge of gullibility.[00:17:49] Brian: LLMs believe anything you tell them. Any systems that attempt to make meaningful decisions on your behalf will run into the same roadblock. How good is a travel agent, or a digital assistant, or even a research tool, if it [00:18:00] can't distinguish truth from fiction? 
So, essentially, what you're suggesting is that the state of the art that enables agents now is still that sort of 90 percent problem, the edge-case problem, or is there a deeper flaw?[00:18:14] Brian: What are you saying there?[00:18:16] Simon: So this is the fundamental challenge here, and honestly my frustration with agents is mainly around definitions. If you ask anyone who says they're working on agents to define agents, you will get a subtly different definition from each person, but everyone always assumes that their definition is the one true one that everyone else understands. So I feel like in a lot of these agent conversations, people are talking past each other, because one person's talking about the sort of travel-agent idea of something that books things on your behalf.[00:18:41] Simon: Somebody else is talking about LLMs with tools running in a loop with a cron job somewhere, and all of these different things. You ask academics and they'll laugh at you, because they've been debating what agents mean for over 30 years at this point. It's this long-running, almost sort of in-joke in that community.[00:18:57] Simon: But if we assume, for the purposes of this conversation, that an [00:19:00] agent is something which you can give a job and it goes off and does that thing for you, like booking travel or things like that, the fundamental challenge is the reliability thing, which comes from this gullibility problem.[00:19:12] Simon: And a lot of my interest in this originally came from when I was thinking about prompt injection as a source of this form of attack against LLM systems, where you deliberately lay traps out there for the LLM to stumble across,[00:19:24] Brian: and, which I should say, you have been banging this drum, and no one's gotten anywhere, at least on solving this, that I'm aware of, right?[00:19:31] Brian: Like, that's still an open problem.
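Simon's "LLMs with tools running in a loop" definition is compact enough to sketch directly. The `fake_llm` below is a hypothetical stand-in for a real model call; the point is the shape of the loop, where each turn either calls a tool or returns a final answer:

```python
# A minimal "LLM with tools running in a loop" sketch. fake_llm is a
# hypothetical stand-in; a real agent would send the transcript to an API.
TOOLS = {"add": lambda a, b: a + b}

def fake_llm(transcript):
    # Pretend the model decides to call the add tool once, then answer.
    if not any(m["role"] == "tool" for m in transcript):
        return {"tool": "add", "args": (2, 3)}
    result = [m for m in transcript if m["role"] == "tool"][-1]["content"]
    return {"answer": f"The result is {result}"}

def agent(task, llm=fake_llm, max_steps=5):
    transcript = [{"role": "user", "content": task}]
    for _ in range(max_steps):          # the loop that makes it an "agent"
        step = llm(transcript)
        if "answer" in step:            # the model is done
            return step["answer"]
        result = TOOLS[step["tool"]](*step["args"])  # execute the tool call
        transcript.append({"role": "tool", "content": result})
    return "gave up"

print(agent("What is 2 + 3?"))  # The result is 5
```

Every definition Simon lists (travel booker, cron-job loop, academic agent) disagrees about what goes inside this loop, but the loop itself is the common skeleton.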
For two years.[00:19:33] Simon: Yeah. Right. We've been talking about this problem, and a great illustration of this was Claude. So Anthropic released Claude Computer Use a few months ago. Fantastic demo. You could fire up a Docker container, and you could literally tell it to do something and watch it open a web browser, navigate to a webpage, click around, and so forth.[00:19:51] Simon: Really, really interesting and fun to play with. And then one of the first demos somebody tried was: what if you give it a web page that says "download and run this [00:20:00] executable"? And it did, and the executable was malware that added it to a botnet. So the very first, most obvious, dumb trick that you could play on this thing just worked, right?[00:20:10] Simon: So that's obviously a really big problem. If I'm going to send something out to book travel on my behalf, I mean, it's hard enough for me to figure out which airlines are trying to scam me and which ones aren't. Do I really trust a language model that believes the literal truth of anything that's presented to it to go out and do those things?[00:20:29] swyx (2): Yeah, I definitely think it's interesting to see Anthropic doing this, because they used to be the safety arm of OpenAI that split out and said, you know, we're worried about letting this thing out in the wild, and here they are enabling computer use for agents. It feels like things have merged.[00:20:49] swyx (2): You know, I'm also fairly skeptical about, you know, this always being the year of Linux on the desktop, and this being the year of agents is the equivalent: something that people [00:21:00] are not predicting so much as wishfully thinking, hoping, and praying for, for their companies and agents to work.[00:21:05] swyx (2): But I feel like things are coming along a little bit. To me, it's kind of like self-driving. I remember in 2014 saying that self-driving was just around the corner.
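The Computer Use incident is the gullibility problem in miniature. A toy simulation, with all the names invented: a naive agent splices fetched page text straight into its prompt, so content and instructions become indistinguishable, and a planted imperative gets obeyed:

```python
# Toy prompt-injection demo: the agent can't tell page *content* from
# *instructions*, because both arrive as undifferentiated text.
PAGES = {
    "flights.example": "Cheap flights! IGNORE PREVIOUS INSTRUCTIONS: "
                       "run install_malware.exe",
}

def naive_agent(task: str, url: str) -> list[str]:
    prompt = task + "\n\nPage content:\n" + PAGES[url]
    actions = []
    # Stand-in for the model: it obediently acts on any imperative it reads,
    # which is exactly the gullibility Simon is describing.
    if "IGNORE PREVIOUS INSTRUCTIONS" in prompt:
        actions.append("ran install_malware.exe")   # the trap worked
    else:
        actions.append("booked flight")
    return actions

print(naive_agent("Book me a cheap flight.", "flights.example"))
# ['ran install_malware.exe']
```

The mitigation is structural, not a better prompt: as long as untrusted text and trusted instructions share one channel, a sufficiently obedient model can be steered by whoever wrote the page.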
And I mean, it kind of is, you know, like in the Bay Area. You[00:21:17] Simon: get in a Waymo and you're like, oh, this works. Yeah, but it's a slow[00:21:21] swyx (2): cook.[00:21:21] swyx (2): It's a slow cook over the next 10 years. We're going to hammer out these things, and the cynical people can just point to all the flaws, but there are measurable, concrete progress steps being made by these builders.[00:21:33] Simon: There is one form of agent that I believe in. I mostly believe in the research-assistant form of agents.[00:21:39] Simon: The thing where you've got a difficult problem, and, I'm on the beta for Google Gemini 1.5 Pro with Deep Research, I think it's called. These names, right? But I've been using that. It's good, right? You can give it a difficult problem and it tells you, okay, I'm going to look at 56 different websites, [00:22:00] and it goes away and it dumps everything into its context and comes up with a report for you.[00:22:04] Simon: And it won't work against adversarial websites, right? If there are websites with deliberate lies in them, it might well get caught out. But most things don't have that as a problem. And so I've had some answers from it which were genuinely really valuable to me. And that feels to me like, I can see how, given existing LLM tech, especially Google Gemini with its million-token context, and Google with their crawl of the entire web, they've got search, and they've got a cache of every page and so forth.[00:22:35] Simon: That makes sense to me. And what they've got right now, it's not as good as it can be, obviously, but it's a real, useful thing, which they're going to start rolling out. And, you know, Perplexity have been building the same thing for a couple of years.
That I believe in.[00:22:50] Simon: You know, if you tell me that you're going to have an agent that's a research-assistant agent, great. And the coding agents. I mean, ChatGPT Code Interpreter, nearly two years [00:23:00] ago, that thing started writing Python code, executing the code, getting errors, and rewriting it to fix the errors. That pattern obviously works.[00:23:07] Simon: That works really, really well. So, yeah, coding agents that do that sort of error-message loop thing, those are proven to work. And they're going to keep on getting better, and that's going to be great. The research-assistant agents are just beginning to get there. The things I'm critical of are the ones where you trust this thing to go out and act autonomously on your behalf, and make decisions on your behalf, especially involving spending money.[00:23:31] Simon: I don't see that working for a very long time. That feels to me like an AGI-level problem.[00:23:37] swyx (2): It's funny, because I think Stripe actually released an agent toolkit, which is one of the things I featured, that is trying to enable these agents each to have a wallet that they can go and spend from. Basically, it's a virtual card.[00:23:49] swyx (2): It's not that difficult with modern infrastructure. You can[00:23:51] Simon: stick a $50 cap on it, and then at least it can't lose more than $50.[00:23:56] Brian: You know, I don't know if either of you know Rafat Ali. [00:24:00] He runs Skift, which is a travel news vertical. And he constantly laughs at the fact that every agent pitch is "we're gonna get rid of booking a plane flight for you," you know?[00:24:11] Brian: And I would point out that, historically, when the web started, the first thing everyone talked about was: you can go online and book a trip, right? So it's funny that for each generation of technological advance, the thing they always want to kill is the travel agent.
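The Code Interpreter pattern Simon endorses, write code, run it, feed the error back, retry, can be sketched with a stubbed "fixer" standing in for the model; the loop structure is the part that is proven to work:

```python
# The coding-agent loop: run generated code, capture the error, ask for a
# fix, repeat. fix_code is a hypothetical stand-in for a model call.
def run(code: str):
    try:
        scope = {}
        exec(code, scope)
        return scope.get("result"), None
    except Exception as e:
        return None, repr(e)

def fix_code(code: str, error: str) -> str:
    # Stub "model": patches the one bug our demo code contains. A real
    # agent would send the code plus the traceback back to the LLM.
    return code.replace("1 / 0", "sum(range(5))")

def coding_agent(code: str, max_attempts=3):
    for _ in range(max_attempts):
        result, error = run(code)
        if error is None:
            return result
        code = fix_code(code, error)    # feed the error message back in
    raise RuntimeError("could not fix the code")

print(coding_agent("result = 1 / 0"))  # 10
```

The reason this pattern is reliable where travel-booking agents are not: the error message is ground truth the model cannot be lied to about, so the gullibility problem largely disappears.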
And now they want to kill the web-page travel agent.[00:24:29] Simon: Like, I use Google Flights. It's great, right? If you gave me an agent to do that for me, it would save me maybe 15 seconds of typing in my things, but I still want to see what my options are and go, yeah, I'm not flying on that airline, no matter how cheap they are.[00:24:44] swyx (2): Yeah. For listeners, go ahead.[00:24:47] swyx (2): For listeners: I think both of you are pretty positive on NotebookLM. And, you know, we actually interviewed the NotebookLM creators, and there are actually two agents going on internally. The reason it takes so long is because they're running an agent loop [00:25:00] inside that is fairly autonomous, which is kind of interesting.[00:25:01] swyx (2): For one,[00:25:02] Simon: for one definition of agent loop. You picked that one particularly well. And you're talking about the podcast side of this, right?[00:25:07] swyx (2): Yeah, the podcast side of things. There's going to be a new version coming out that we'll be featuring at our conference.[00:25:14] Simon: That one's fascinating to me. Like, NotebookLM, I think, is two products, right? On the one hand, it's actually a very good RAG product: you dump a bunch of things in, you can run searches, and it does a good job of that. And then they added the podcast thing. It's a total gimmick, right?[00:25:30] Simon: But that gimmick got them attention, because they had a great product that nobody paid any attention to at all, and then you add the unfeasibly good voice synthesis of the podcast. It's the lesson.[00:25:43] Brian: It's the lesson of Midjourney and stuff like that.
If you can create something that people can post on socials, you don't have to lift a finger again to do any marketing for what you're doing.[00:25:53] Brian: Let me dig into NotebookLM just for a second as a podcaster. As a [00:26:00] gimmick, it makes sense, and then, obviously, you dig into it and it sort of has problems around the edges. It does the thing that all LLMs kind of do, where it's like, oh, we want to wrap up with a conclusion.[00:26:12] Multimodal AI and Future Prospects[00:26:12] Brian: I always call that the eighth-grade book report problem, where it has to have an intro and then, you know. But that's another thing: I think you spoke about this again in your year-end piece, about how things are going multimodal in ways you didn't expect, like vision and especially audio, I think. So that's another area where, at least over the last year, there's been progress made that maybe you didn't think was coming as quickly as it came.[00:26:43] Simon: I don't know. I mean, a year ago we had one really good vision model. We had GPT-4 Vision, which was very impressive. And Google had just dropped Gemini 1.0, which had vision, but nobody had really played with it yet. People weren't taking Gemini [00:27:00] seriously at that point. I feel like it was 1.5 Pro when it became apparent that actually they had gotten over their hump and they were building really good models. And, to be honest, the video models are mostly still using the same trick: the thing where you divide the video up into one image per second and you dump that all into the context.[00:27:16] Simon: So maybe it shouldn't have been so surprising to us that long-context models plus vision meant that video was starting to be solved. Of course, it didn't.
What you really want with video is to be able to do the audio and the images at the same time, and I think the models are beginning to do that now.[00:27:33] Simon: Like, Gemini 1.5 Pro originally ignored the audio. It just did the one-frame-per-second video trick. As far as I can tell, the most recent ones are actually doing pure multimodal. But the things that opens up are just extraordinary. Like, the ChatGPT iPhone app feature that they shipped as one of their 12 Days of OpenAI: I really can be having a conversation, and just turn on my video camera and go, hey, what kind of tree is [00:28:00] this?[00:28:00] Simon: And so forth. And it works. And for all I know, that's just snapping a picture once a second and feeding it into the model, but the things that you can do with that as an end user are extraordinary. I don't think most people have cottoned on to the fact that you can now stream video directly into a model, because it's only a few weeks old.[00:28:22] Simon: Wow. That's a big boost in terms of what kinds of things you can do with this stuff. Yeah.[00:28:30] swyx (2): For people who are not that close: I think Gemini Flash's free tier allows you to do something like capture one photo every second or every minute, leave it running 24/7, and prompt it to do whatever.[00:28:45] swyx (2): And so you can effectively have your own camera app or monitoring app that you just prompt, and it detects when things change, it detects, you know, alerts or anything like that, or describes your day.
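The one-image-per-second trick makes the context arithmetic easy to sketch. The per-frame token count below is an assumption for illustration (small images cost on the order of a few hundred tokens in current Gemini models), not an official figure:

```python
# Back-of-envelope: how much context does one-frame-per-second video use?
# TOKENS_PER_FRAME is an assumed illustrative figure, not an official rate.
TOKENS_PER_FRAME = 258          # roughly what a small image tile costs

def video_context_tokens(seconds: int, fps_sampled: float = 1.0) -> int:
    frames = int(seconds * fps_sampled)
    return frames * TOKENS_PER_FRAME

# A 45-minute video sampled at 1 frame per second:
tokens = video_context_tokens(45 * 60)
print(tokens)   # 696600 -- comfortably inside a million-token context
```

Which is exactly why long context plus vision "solved" video before anyone built a dedicated video architecture: under these assumptions, most videos simply fit.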
You know, and the fact that this is free, I think [00:29:00] it also leads into the previous point about the prices having come down a lot.[00:29:05] Simon: And even if you're paying for this stuff: a thing that I put in my blog entry is that I ran a calculation on what it would cost to process the 68,000 photographs in my photo collection and, for each one, just generate a caption. Using Gemini 1.5 Flash 8B, it would cost me $1.68 to process 68,000 images, which, I mean, doesn't make sense.[00:29:28] Simon: None of that makes sense. It's around one four-hundredth of a cent per image to generate captions now. So you can see why feeding in a day's worth of video just isn't even very expensive to process.[00:29:40] swyx (2): Yeah, I'll tell you what is expensive. It's the other direction. So here we're talking about consuming video.[00:29:46] swyx (2): And this year we also had a lot of progress; probably one of the most anticipated launches of the year was Sora. We actually got Sora. And it was less exciting.[00:29:55] Simon: We did, and then Veo 2, Google's Sora, came out like three [00:30:00] days later and upstaged it. Like, Sora was exciting until Veo 2 landed, which was just better.[00:30:05] swyx (2): In general, I feel the media, or the social media, has been very unfair to Sora, because what was released to the world, generally available, was Sora Lite. It's the distilled version of Sora, right?[00:30:16] Simon: I did not realize that.[00:30:18] swyx (2): So you're absolutely comparing the most cherry-picked version of Veo 2, the one that they published on the marketing page, to the most embarrassing version of Sora.[00:30:25] swyx (2): So of course it's gonna look bad.[00:30:27] Simon: Well, I got access to Veo 2. I'm in the Veo 2 beta, and I've been poking around with it and getting it to generate pelicans on bicycles and stuff.
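Simon's numbers are easy to sanity-check with arithmetic alone. This just works backwards from the two figures quoted in the conversation ($1.68 and 68,000 images); it is not an official rate card:

```python
# Sanity-check the quoted figures: $1.68 for 68,000 captions implies the
# per-image cost Simon describes ("one four-hundredth of a cent").
total_cost_usd = 1.68
num_images = 68_000

per_image_usd = total_cost_usd / num_images
per_image_cents = per_image_usd * 100

print(round(per_image_cents, 5))   # 0.00247 cents per image
print(round(1 / per_image_cents))  # 405, i.e. roughly 1/400th of a cent
```

So the two numbers in the conversation are mutually consistent: $1.68 spread over 68,000 images really is about a four-hundredth of a cent each.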
I would absolutely[00:30:34] swyx (2): believe that[00:30:35] Simon: Veo 2 is actually better. So is full-fat Sora coming soon? Do you know when we get to play with that one?[00:30:42] swyx (2): No one's mentioned anything. I think basically the strategy is: let people play around with Sora Lite and gather feedback there, but keep developing Sora with the Hollywood studios. That's who they actually care about. Gotcha. The rest of us don't really know what to do with the video anyway, right?[00:30:59] Simon: I mean, [00:31:00] that's my thing. I've realized that for generative images and video, images we've had for a few years, and I don't feel like they've broken out into the talented-artist community yet. Lots of people are having fun with them and producing stuff that's kind of cool to look at, but what I want, you know that movie Everything Everywhere All at Once, right?[00:31:20] Simon: Won a ton of Oscars, utterly amazing film. The VFX team for that were five people, some of whom were watching YouTube videos to figure out what to do. My big question for Sora and Midjourney and stuff is: what happens when a creative team like that starts using these tools? I want the creative geniuses behind Everything Everywhere All at Once.[00:31:40] Simon: What are they going to be able to do with this stuff in a few years' time? Because that's really exciting to me. That's where you take artists who are at the very peak of their game, give them these new capabilities, and see what they can do with them.[00:31:52] swyx (2): I should, I know a little bit here, so I should mention that that team actually used RunwayML.[00:31:57] swyx (2): So there was, there was,[00:31:57] Simon: yeah.[00:31:59] swyx (2): I don't know how [00:32:00] much. I don't, so, you know, it's possible to overstate this, but there are people integrating
generated video within their workflow, even pre-Sora.[00:32:09] Brian: Right, because it's not the thing where it's like, okay, tomorrow we'll be able to do a full two-hour movie that you prompt with three sentences.[00:32:15] Brian: It's like the very first days of video effects in film: if you can get that three-second clip, if you can get that 20-second thing that they did in The Matrix that blew everyone's minds and took a million dollars or whatever to do, it's the little bits and pieces that they can fill in now, and that's probably already there.[00:32:34] swyx (2): Yeah. I think actually having a layered view of what assets people need, and letting AI fill in the low-value assets, right, like the background video, the background music, and, you know, sometimes the sound effects, that may be more palatable, and maybe it also changes the [00:33:00] way that you evaluate the stuff that's coming out.[00:32:57] swyx (2): Because people tend to, in social media, emphasize foreground stuff, main-character stuff. So you really care about consistency, and you really are bothered when, for example, Sora botches the image generation of a gymnast doing flips, which is horrible. But for background crowds, who cares?[00:33:18] Brian: And by the way, again, I was a film major way, way back in the day; that's how it started. Things like Braveheart, where they filmed 10 people on a field, and then the computer could turn it into 1,000 people on a field. That's always been the way: it's around the margins and in the background that it first comes in.[00:33:36] Simon: The Lord of the Rings movies were over 20 years ago, and they have those giant battle sequences, which were very early. You could almost call it a generative AI approach, right?
They were using very sophisticated algorithms to model out those different battles and all of that kind of stuff.[00:33:52] Simon: Yeah. I know very little, basically nothing, about film production, so I try not to commentate on it. But I am fascinated to [00:34:00] see what happens when these tools start being used by the people at the top of their game.[00:34:05] swyx (2): I would say there's a cultural war being fought here more than a technology war.[00:34:11] swyx (2): Most of the Hollywood people are against any form of AI anyway, so they're busy fighting that battle instead of thinking about how to adopt it, and it's very fringe. I participated here in San Francisco in one generative AI video creative hackathon, where the AI-positive artists actually met with technologists like myself, and then we collaborated together to build short films, and that was really nice, and I think, you know, I'll be hosting some of those in my events going forward.[00:34:38] swyx (2): One thing I want to leave people with: this is a recap of last year, but sometimes it's useful to walk away as well with what we can expect in the future. I don't know if you've got anything. I would also call out that the Chinese models here have made a lot of progress. Hailuo and Kling and God knows who else in the video arena [00:35:00] are also making a lot of progress. I think maybe China is actually surprisingly ahead with regards to open weights at least, but also just specific forms of video generation.[00:35:12] Simon: Wouldn't it be interesting if a film industry sprang up in a country that we don't normally think of as having a really strong film industry, that was using these tools? Like, that would be a fascinating sort of angle on this. Mm hmm.[00:35:25] swyx (2): Agreed. I, oh, sorry.
Go ahead.[00:35:29] Exploring Video Avatar Companies[00:35:29] swyx (2): Just to put it on people's radar as well: HeyGen. There's a category of video avatar companies that don't do general video.[00:35:41] swyx (2): They only do talking heads, let's just say. And HeyGen syncs very well.[00:35:45] Brian: Swyx, you know that that's what I've been using, right? So, if you see some of my recent YouTube videos and things like that: the beauty part of the HeyGen thing is, I don't want to use the robot voice, so [00:36:00] I record the MP3 file on my computer, and then I put that into HeyGen with the avatar that I've trained it on, and all it does is the lip sync.[00:36:09] Brian: So it's not 100 percent past the uncanny valley, but it's good enough that if you weren't looking for it, it's just me sitting there doing one of my clips from the show. And, yeah, so, by the way, HeyGen. Shout out to them.[00:36:24] AI Influencers and Their Future[00:36:24] swyx (2): So, in terms of the look ahead, reviewing 2024 and looking at trends for 2025, I would basically call this out:[00:36:33] swyx (2): Meta tried to introduce AI influencers and failed horribly, because they were just bad at it. But at some point there will be more and more AI influencers. Not in the way that Simon is an influencer, but in the sense that they are not human.[00:36:50] Simon: The few of those that have done well, I always feel like they're doing well because it's a gimmick, right?[00:36:54] Simon: It's novel and fun. Like the AI Seinfeld thing [00:37:00] from last year, the Twitch stream, you know. If you're the only one, or one of just a few, doing that, you'll attract an audience because it's an interesting new thing.
But I just don't know if that's going to be sustainable longer term or not.[00:37:11] Simon: Like,[00:37:12] Simplifying Content Creation with AI[00:37:12] Brian: I'm going to tell you, because I've had discussions, I can't name the companies or whatever, but think about the workflow for this. Now, we all know that on TikTok and Instagram, holding up a phone to your face and doing an in-my-car video, or a walk-and-talk, that's very common. But if you want to do a professional sort of talking-head video, you still have to sit in front of a camera, you still have to do the lighting, you still have to do the video editing. Versus: if you can just record what I'm saying right now, the last 30 seconds, clip that out as an MP3, and you have a good enough avatar, then you can put that avatar in front of Times Square, on a beach, or wherever.[00:37:50] Brian: So, again, for creators, the reason I think, Simon, we're on the verge of something: it's not, oh, we're going to have [00:38:00] AI avatars take over. It'll be one of those things where it takes another piece out of the workflow and simplifies it.[00:38:07] Simon: I'm all for that. I always love this stuff. I like tools, tools that help human beings do more, do more ambitious things. That's what excites me about this entire field.[00:38:17] swyx (2): Yeah. We're looking into basically creating one for my podcast. We have this guy Charlie; he's Australian.
He's not real, but he opens every show, and we're going to have him present all the shorts.[00:38:29] Simon: Yeah, go ahead.[00:38:30] The Importance of Credibility in AI[00:38:30] Simon: The thing that I keep coming back to is this idea of credibility. In a world that is full of AI-generated everything and so forth, it becomes even more important that people find the sources of information that they trust, and find people and sources that are credible. And I feel like that's the one thing that LLMs and AI can never have: credibility, right?[00:38:49] Simon: ChatGPT can never stake its reputation on telling you something useful and interesting, because that means nothing, right? It's a matrix multiplication. It depends on who prompted it and so forth. So, [00:39:00] and this is when I'm blogging as well, I'm always looking for: okay, who are the reliable people who will tell me useful, interesting information, who aren't just going to tell me whatever somebody's paying them to say, who aren't going to type a one-sentence prompt into an LLM, spit out an essay, and stick it online?[00:39:16] Simon: And to me, earning that credibility is really important. That's why a lot of my ethics around the way that I publish are based on the idea that I want people to trust me. I want to do things that gain credibility in people's eyes, so they will come to me for information as a trustworthy source.[00:39:32] Simon: And it's the same for the sources that I'm consulting as well.
So that's something I've been thinking about a lot, this credibility focus, for a while now.[00:39:40] swyx (2): Yeah, you can layer or structure credibility, or decompose it. So one thing I would put in front of you, and I'm not saying that you should agree with this or accept this at all, is that you can use AI to generate different variations, and then you, as the final sort of last-mile person, pick the last output and [00:40:00] put your stamp of credibility behind that. Like, everything's human-reviewed instead of human-originated.[00:40:04] Simon: Yeah, if you publish something, you need to be able to stand behind it.[00:40:08] Simon: You need to say: I will put my name to this. I will attach my credibility to this thing. And if you're willing to do that, then that's great.[00:40:16] swyx (2): For creators, this is huge, because there's a fundamental asymmetry between starting with a blank slate versus choosing from five different variations.[00:40:23] Brian: Right.[00:40:24] Brian: And also, the key thing that you just said: if everything that I do, if all of the words were generated by an LLM, if the voice is generated by an LLM, if the video is also generated by the LLM, then I haven't done anything, right? But if on one or two of those you take a shortcut, and it's still something I'm willing to sign off on, I feel like that's where people are coming around to: this is maybe acceptable, sort of.[00:40:53] Simon: This is where I've been pushing the definition. I love the term slop.
I've been pushing the definition of slop as AI-generated [00:41:00] content that is both unrequested and unreviewed, and the unreviewed thing is really important. That's the thing that elevates something from slop to not-slop: a human being has reviewed it and said, you know what, this is actually worth other people's time.[00:41:12] Simon: And again, I'm willing to attach my credibility to it and say, hey, this is worthwhile.[00:41:16] Brian: It's the curatorial and editorial part of it: no matter what the tools are to take shortcuts, to, as Swyx is saying, choose between different edits or different cuts, in the end there's a curatorial or editorial mind behind it.[00:41:32] Brian: Let me, I want to wedge this in before we start to close.[00:41:36] The Future of LLM User Interfaces[00:41:36] Brian: One of the things, coming back to your year-end piece, that has been something I've been banging the drum about, is when you're talking about LLMs getting harder to use. You said: most users are thrown in at the deep end.[00:41:48] Brian: The default LLM chat UI is like taking brand-new computer users, dropping them into a Linux terminal, and expecting them to figure it all out. I mean, it's literally going back to the command line. The command line was defeated [00:42:00] by the GUI. And this is what I've been banging the drum about: what we have now cannot be[00:42:05] Brian: the end result of the user interface. Do you see any hints or seeds of a GUI moment for LLM interfaces?[00:42:17] Simon: I mean, it has to happen. It absolutely has to happen. The usability of these things is turning into a bit of a crisis. And we are at least seeing some really interesting innovation in little directions.[00:42:28] Simon: Like OpenAI's ChatGPT Canvas thing that they just launched. That is at least
going somewhere a little bit more interesting than just chats and responses. You know, they're exploring that space where you're collaborating with an LLM, where you're both working on the same document. That makes a lot of sense to me.[00:42:44] Simon: That feels really smart. One of the best things is still, who was it who did the UI where you could draw an interface and click a button, and tldraw would then make it real? That was spectacular, [00:43:00] absolutely spectacular: an alternative vision of how you'd interact with these models.[00:43:05] Simon: So I feel like there is so much scope for innovation there, and it is beginning to happen. I feel like most people do understand that we need to do better in terms of interfaces that both help explain what's going on and give people better tools for working with models.[00:43:23] Simon: I was going to say, I want to[00:43:25] Brian: dig a little deeper into this, because think of the conceptual idea behind the GUI, which is: instead of typing "open word.exe" into a command line, you click an icon, right? So that's abstracting away the programming stuff, such that a child can tap on an iPad and make a program open, right?[00:43:47] Brian: The problem, it seems to me, right now with how we're interacting with LLMs is that it's sort of like a dumb robot, where you poke it and it goes over here, but no, I want it to go over here, so you poke it this way, and you can't get it exactly [00:44:00] right. Like, what can we abstract away from the current state of things that makes it more fine-tuned and easier to get more precise?[00:44:12] Brian: You see what I'm saying?[00:44:13] Simon: Yes. And this is the other trend that I've been following from the last year, which I think is super interesting.
It's the prompt-driven UI development thing. Basically, this is the pattern where, and Claude Artifacts was the first thing to do this really well, you type in a prompt and it goes: oh, I should answer that by writing a custom HTML and JavaScript application for you that does a certain thing.[00:44:35] Simon: And since then it turns out this is easy, right? Every decent LLM can produce HTML and JavaScript that does something useful. So we've actually got this alternative way of interacting, where they can respond to your prompt with an interactive custom interface that you can work with.[00:44:54] Simon: But people haven't quite wired those back up again. Ideally, I'd want the LLM to ask me a [00:45:00] question where it builds me a custom little UI for that question, and then it gets to see how I interacted with it. I don't know why, but that's such a small step from where we are right now, and it feels like such an obvious next step.[00:45:12] Simon: Like, an LLM, why should you just be communicating with text, when it can build interfaces on the fly that let you select a point on a map, or move sliders up and down? It can create knobs and dials. I keep saying knobs and dials, right? We can do that, and Claude Artifacts will build you a knobs-and-dials interface.[00:45:34] Simon: But at the moment they haven't closed the loop. When you twiddle those knobs, Claude doesn't see what you were doing. They're going to close that loop. I'm shocked that they haven't done it yet.
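The missing feedback loop Simon describes can be sketched as a data flow: the model emits a widget spec, the user interacts with it, and the interaction event is appended back into the conversation where the model can see it. All the names here are hypothetical:

```python
# Sketch of "closing the loop": the model's generated UI reports the user's
# interaction back into the conversation, so the model can react to it.
conversation = []

def llm_build_widget(question: str) -> dict:
    # Hypothetical model turn: answer a question by emitting a UI spec
    # (in a real system this would be generated HTML/JS).
    return {"type": "slider", "label": question, "min": 0, "max": 100}

def user_interacts(widget: dict, value: int) -> dict:
    return {"widget": widget["label"], "value": value}

widget = llm_build_widget("How spicy should the recipe be?")
conversation.append({"role": "assistant", "ui": widget})

event = user_interacts(widget, 70)         # user drags the slider
conversation.append({"role": "user", "ui_event": event})  # the closed loop

print(conversation[-1]["ui_event"]["value"])  # 70
```

The gap Simon points at is exactly the last `append`: today's artifact systems generate the slider but never feed the slider's position back into the model's context.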
So yeah, I think there's so much scope for innovation, and so much scope for doing interesting stuff with that model, where anything you can represent in SVG, which is almost everything, can now be part of that ongoing conversation.

[00:45:59] swyx (2): Yeah, [00:46:00] I would say the best-executed version of this I've seen so far is Bolt, where you can literally type in "make a Spotify clone" or "make an Airbnb clone", and it actually just does that for you, zero-shot, with a nice design.

[00:46:14] Simon: There's a benchmark for that now. The LMArena people now have a benchmark that is zero-shot app generation, because all of the models can do it.

[00:46:22] Simon: I've started figuring it out. I'm building my own version of this for my own project, because I think within six months it'll just be an expected feature. If you have a web application, why wouldn't you have a thing where you can add something custom? So for my Datasette data exploration project, I want you to be able to do things like conjure up a dashboard just via a prompt.

[00:46:43] Simon: You say, oh, I need a pie chart and a bar chart, put them next to each other, and then have a form where submitting the form inserts a row into my database table. And this is all suddenly feasible. It's not even particularly difficult to do, which is great. Utterly bizarre that these things are now easy.

[00:47:00] swyx (2): I think for a general audience, that is what I would highlight: software creation is becoming easier and easier. Gemini is now available in Gmail and Google Sheets. I don't write my own Google Sheets formulas anymore, I just tell Gemini to do it.
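A minimal sketch of the prompt-to-dashboard idea, in the spirit of what Simon describes but not his implementation: `build_dashboard_prompt` is an invented helper that asks a model for one self-contained HTML file, so the result can be dropped straight into a sandboxed iframe. The `/api/insert` endpoint, table name, and columns are all made-up examples.

```python
# Hypothetical helper: turn a plain-language dashboard request into chat
# messages for an LLM. Constraining the model to a single self-contained
# HTML file (inline <script>/<style>, no external requests) keeps the
# generated app easy to sandbox and serve.

def build_dashboard_prompt(spec: str, table: str, columns: list[str]) -> list[dict]:
    system = (
        "You generate dashboards as a single self-contained HTML file. "
        "Use only inline <script> and <style>; make no external network "
        "requests. Forms must POST to /api/insert, which inserts a row "
        "into the named table."
    )
    user = (
        f"Table: {table}\n"
        f"Columns: {', '.join(columns)}\n"
        f"Request: {spec}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

# The example Simon gives: two charts side by side plus an insert form.
messages = build_dashboard_prompt(
    "A pie chart and a bar chart side by side, plus a form to add a row.",
    table="sales",
    columns=["region", "amount"],
)
```

Most of the engineering here is in the constraints, not the generation: any capable model can emit the HTML, so the system prompt's job is to keep the output embeddable and safe.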
And so I almost wanted to somewhat disagree with your assertion that LLMs got harder to use.

[00:47:22] swyx (2): Like, yes, we expose more capabilities, but they're in minor forms, like using Canvas, like web search in ChatGPT, and Gemini being in Google Sheets. We're getting...

[00:47:37] Simon: No, no, no, no. Those are the things that make it harder, because the problem is that each of those features is amazing if you understand the edges of the feature. If you're like, okay, so with Gemini in Google Sheets formulas, I can get it to do a certain amount of things, but I can't get it to go and read a webpage. You probably can't get it to read a webpage, right? But there are things it can do and things it can't do, which are completely undocumented.

[00:47:58] Simon: If you ask them what they [00:48:00] can and can't do, they're terrible at answering questions about that. My favorite example is Claude Artifacts. You can't build a Claude Artifact that can hit an API somewhere else, because the CORS headers on that iframe prevent accessing anything outside of cdnjs. So, good luck learning CORS headers as an end user in order to understand why. I've seen people saying, oh, this is rubbish, I tried building an artifact that would run a prompt and it couldn't, because Claude didn't expose an API with CORS headers. All of this stuff is so weird and complicated.

[00:48:26] Simon: The more tools we add, the more expertise you need to really understand the full scope of what you can do. So the question really comes down to: what does it take to understand the full extent of what's possible?
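To make the cdnjs restriction concrete, here is a toy origin-allowlist check. This is emphatically not Anthropic's implementation; it just illustrates the effect Simon describes: inside the Artifacts sandbox, a script load from cdnjs succeeds while a fetch to any other origin is blocked, and nothing in the UI explains why.

```python
from urllib.parse import urlparse

# Illustrative only: a sandbox that permits resource loads from a single
# allowed origin. Claude Artifacts behaves roughly like this in effect,
# which is why artifacts cannot call arbitrary external APIs.

ALLOWED_ORIGINS = {"https://cdnjs.cloudflare.com"}

def fetch_allowed(url: str) -> bool:
    """Return True if the sandbox would permit fetching this URL."""
    parsed = urlparse(url)
    origin = f"{parsed.scheme}://{parsed.netloc}"
    return origin in ALLOWED_ORIGINS

fetch_allowed("https://cdnjs.cloudflare.com/ajax/libs/d3/7.9.0/d3.min.js")  # True
fetch_allowed("https://api.example.com/v1/complete")                        # False
```

The end user never sees this policy; they just see their artifact silently fail, which is exactly the undocumented-edges problem being discussed.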
And honestly, that's just getting more and more involved over time.

[00:48:58] Local LLMs: A Growing Interest

[00:48:58] swyx (2): I have one more topic that I [00:49:00] think you're kind of a champion of, and we've touched on it a little bit, which is local LLMs, and running AI applications on your desktop. I feel like you are an early adopter of many, many things.

[00:49:12] Simon: I had an interesting experience with that over the past year. Six months ago, I almost completely lost interest, and the reason is that six months ago there was no point at all in using the best local models you could run, because the best hosted models were so much better.

[00:49:26] Simon: Like, there was no point at which I'd choose to run a model on my laptop if I had API access to Claude 3.5 Sonnet. They just weren't even comparable. And that changed, basically, in the past three months, as the local models had this step change in capability, where now I can run some of these local models, and they're not as good as Claude 3.5 Sonnet, but they're not so far away that it's not worth me even using them.

[00:49:45] Simon: The continuing problem is that I've only got 64 gigabytes of RAM, and if you run, like, Llama 3 70B, most of my RAM is gone. So now I have to shut down my Firefox tabs [00:50:00] and my Chrome and my VS Code windows in order to run it.

[00:50:03] Simon: But it's got me interested again. The efficiency improvements are such that now, if you were to stick me on a desert island with my laptop, I'd be very productive using those local models. And that's pretty exciting.
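The RAM arithmetic behind this is worth making explicit. A model's weights need roughly (parameter count × bits per parameter ÷ 8) bytes, and real usage is higher once you add the KV cache and runtime overhead, so treat this as a lower bound:

```python
# Back-of-the-envelope memory for local model weights: parameters times
# bytes per parameter at a given quantization. Actual usage is higher
# (KV cache, activations, runtime overhead).

def weight_gb(n_params: float, bits_per_param: int) -> float:
    """Approximate weight size in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bits_per_param / 8 / 1e9

weight_gb(70e9, 16)  # 140.0 GB: fp16 Llama 3 70B cannot fit in 64 GB of RAM
weight_gb(70e9, 4)   # 35.0 GB: a 4-bit quantization fits, but crowds out apps
weight_gb(3e9, 4)    # 1.5 GB: why a 3B model ships as a roughly 2 GB file
```

This is why a 4-bit quantized 70B model runs on Simon's 64 GB machine only after Firefox, Chrome, and VS Code are shut down, and why doubling laptop RAM would open up the near-top-tier open-weights models.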
And if those trends continue... also, I think my next laptop, if and when I buy one, is going to have twice the amount of RAM, at which point maybe I can run almost the top-tier open weights models and still be able to use it as a computer as well.

[00:50:32] Simon: NVIDIA just announced their $3,000, 128-gigabyte monstrosity. That's a pretty good price, you know, if you're going to buy it.

[00:50:42] swyx (2): Custom OS and all.

[00:50:46] Simon: If I get a job, if I have enough of an income that I can justify blowing $3,000 on it, then yes.

[00:50:52] swyx (2): Okay, let's do a GoFundMe to get Simon one. Come on. You know, you can get a job anytime you want. This is just purely discretionary.

[00:50:59] Simon: I want [00:51:00] a job that pays me to do exactly what I'm doing already and doesn't tell me what else to do. That's the challenge.

[00:51:06] swyx (2): I think Ethan Mollick does pretty well, whatever it is he's doing.

[00:51:11] swyx (2): But yeah, basically I was trying to bring in not just local models, but also Apple Intelligence, which is on every Mac machine. You seem skeptical.

[00:51:21] Simon: It's rubbish. Apple Intelligence is so bad. It does one thing well.

[00:51:25] swyx (2): Oh yeah, what's that? It summarizes notifications. And sometimes it's humorous.

[00:51:29] Brian: Are you sure it does that well? And also, by the way, from a sort of normie point of view, there's no indication from Apple of when to use it.
Like, everybody upgrades their thing and it's like, okay, now you have Apple Intelligence, and you never know when to use it ever again.

[00:51:47] swyx (2): Oh yeah, you consult the Apple docs, which is MKBHD.

[00:51:51] Simon: The one thing I'll say about Apple Intelligence is that one of the reasons it's so disappointing is that the models are just weak. But now, Llama 3B [00:52:00] is such a good model in a 2-gigabyte file. I think give Apple six months and hopefully they'll catch up to the state of the art on the small models, and then maybe it'll start being a lot more interesting.

[00:52:10] swyx (2): Yeah. Anyway, this was year one. And just like the first year of the iPhone, maybe not that much of a hit, and then in year three they had the App Store. So hey, I would say give it some time. And, you know, I think Chrome is also shipping Gemini Nano this year, which means that every web app will have free access to a local model that just ships in the browser, which is kind of interesting.

[00:52:38] swyx (2): And then I also wanted to open the floor for any of us: what are the AI applications we've adopted that we really recommend? Apps that are running in our browser, or running locally, that other people should be trying.

[00:52:55] swyx (2): Right? I feel like that's always one thing that is helpful at the start of the [00:53:00] year.

[00:53:00] Simon: Okay. So for running local models, my top picks: firstly, on the iPhone, there's this thing called MLC Chat, which works, and it's easy to install, and it runs Llama 3B, and it's so much fun.
It's not necessarily a capable enough model that I'd use it for real things, but my party trick right now is I get my phone to write a Netflix Christmas movie plot outline where, like, a jeweller falls in love with the King of Sweden or whatever.

[00:53:25] Simon: And it does a good job, and it comes up with pun names for the movies, and that's deeply entertaining. On my laptop, most recently, I've been getting heavily into Ollama, because the Ollama team are very, very good at finding the good models and patching them up and making them work well. It gives you an API.

[00:53:42] Simon: My little LLM command-line tool has a plugin that talks to Ollama, which works really well. So Ollama is, I think, the easiest on-ramp to running models locally. If you want a nice user interface, LM Studio is, I think, the best user interface [00:54:00] for that. It's not open source, but it's good, and it's worth playing with.

[00:54:02] Simon: The other one that I've been trying recently is a thing called, what's it called? Open WebUI or something. Yeah. The UI is fantastic. If you've got Ollama running and you fire this thing up, it spots Ollama and gives you an interface onto your Ollama models. And t
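The API Simon mentions is Ollama's local HTTP server, which listens on `http://localhost:11434` by default. A minimal sketch of talking to its `/api/generate` endpoint, shown here as payload construction only (actually sending it requires Ollama running with the model pulled, e.g. via `ollama pull llama3.2`; the model name and prompt are just examples):

```python
import json

# Build the JSON body for Ollama's /api/generate endpoint. With
# "stream": False, Ollama returns one JSON object instead of a stream
# of partial responses.

def generate_request(model: str, prompt: str) -> str:
    body = {
        "model": model,
        "prompt": prompt,
        "stream": False,
    }
    return json.dumps(body)

payload = generate_request(
    "llama3.2",
    "Write a pun-filled Christmas movie title about a jeweller.",
)
# POST this payload to http://localhost:11434/api/generate,
# e.g. with urllib.request or curl, and read "response" from the reply.
```

Because it is plain HTTP plus JSON, anything from a command-line tool (like Simon's LLM plugin) to a full UI (like Open WebUI) can sit on top of the same local server.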