Compiler front-end
POPULARITY
In this episode, Shannon discusses the profound and challenging nature of biblical love as outlined in 1 Corinthians 13. The conversation emphasizes that true love is not about feelings but about becoming more like Christ. It critiques the current state of love among believers, highlighting the need for a return to sacrificial, agape love that reflects God's character. The host explores the importance of love in spiritual gifts, the futility of sacrifice without love, and practical applications of love in everyday life. The episode concludes with a call to action for listeners to embody love in their interactions and relationships.takeawaysBiblical love challenges and stretches us.Love is about who we are becoming, not just feelings.The world is watching how we respond to conflict.We need to repent for weaponizing our words.Agape love is selfless, sacrificial, and unconditional.Without love, our gifts are just noise.True impact happens when God's love flows through us.Love shifts the focus from us to God and others.What does love require of me? is a guiding question.Love is the priority; without it, everything else is meaningless.Chapters00:00 | Introduction to The Better Way03:49 | The Challenge of Biblical Love07:12 | Understanding Agape Love10:01 | The Importance of Love in Spiritual Gifts12:58 | Sacrifice Without Love is Meaningless16:12 | The Heart of Love in Action20:54 | Practical Applications of Love23:44 | Conclusion and Next Steps
2025 is off to a terrible start, not just in terms of current world events but because in this episode Daryl follows up on his previous review of the 1997 Berserk TV series with this review of the decades-anticipated 2016 Berserk TV series. Oh boy. Visit www.animeworldorder.com for full show notes and supplemental links.
Clive Langer in conversation with David Eastaugh https://newclang.bandcamp.com/album/new-clang Best known as one of the UK's most successful record producers with a string of high-profile credits in his portfolio, CLIVE LANGER returns in the new year with a second album from his band project, THE CLANG GROUP.A belated follow-up to 2016's Practice, the Group's maiden outing for Domino Records, New Clang was recorded with Deaf School co-conspirators John Wood (aka Max Ripple) and Gregg Braden, along with former Klaxons bassist Jamie Reynolds. Written and recorded in the aftermath of Clive's 70th birthday, New Clang is adeeply personal but incredibly vibrant album; catching Clive in reflective mode, the songs address the process of ageing and the state of the world,as well as confronting his own addiction to alcohol.“ After the pandemic, the dust settled, it felt like it was time, a new time, to play again,” he explains. “Not to revisit but to write and rehearse with my Clang Group mates. We were missing a bass player and fortuitously I met Jamie Reynolds and he filled the vacancy. The songs started to flow, we were back in the groove!”“The new album is the first sober songwriting I think I've done in almost 50 years,” he adds. “I've known and accepted that I was an addict for decades... I just didn't do anything about it. I thought I could live with it, I still enjoyed it. Someone once asked me ‘What do you do?' I replied ‘I drink'. Anyway, making an album sober was like making an album drunk except I wassober!!” Packaged in spare black-on-white, suggesting a tabula rasaof sorts, New Clang's distinctive sleeve artis the work of British artist Edwin Burdis, whom Clive met during his time with Domino Records. “Clive asked me to a studio in London to listen to his new album, still a work in progress,” recalls Edwin. “I was struck by the contrast between the upbeat music and its underlying melancholy, evoking clowns and cartoon characters and a nostalgia for London's recent past. At the time, I had been drawing simple cartoon motifs that aligned perfectly with Clive's songs. I wanted the campaign to be cohesive—black-and-white graphics that blend humour with a sense of tragedy and sadness.” A founder member of pioneering Liverpool art-rockers Deaf School, Langer is noted for a string of production credits (usually in collaboration with Alan Winstanley) on hits forthe likes of Dexys Midnight Runners (the no.1 single and album ‘Come On Eileen' and Too-Rye-Ay)andDavid Bowie(‘Absolute Beginners') plus numerous landmark releasesfor Elvis Costello, Madness, Morrissey, The Teardrop Explodes, China Crisis, Bush, They Might Be Giants, The Rockingbirds and, more recently, Fat White Family.
Send us a textDANCER IN THE DARK (2000)Broadcasting live from 1964, and entirely in song, this week's very special episode of TGTPTU covers Lars von Trier's sixth film (but only our second of his covered this 4x4): DANCER IN THE DARK (2000). It's been over a hundred episodes, since Season 1's Paint Yer Hereafter ep during our Clint Eastwood coverage, that TGTPTU has covered a musical. Dancer in the Dark, the third entry into Lars von Trier's Golden Heart trilogy, follows LVT's preceding two film both in being shot à la the Dane's handheld style developed during TV show The Kingdom and in their general plot of a woman who sacrifices more than most would believe conscionable. And starring in Dancer as that woman, an immigrant named Selma with diminishing eyesight who takes on extra shifts at the factory and side work to finance her son's secret surgery and slips into worlds of musical fantasy, is Björk. At perhaps the height of her stardom (and somehow choosing to be in a relationship with TGTPTU's previously discussed avant-garde director Matthew Barney), Björk in her first major movie role had a stake in the production and her own interpretation of Selma, which caused friction on set with the notoriously controlling Danish director, but likely contributed to her winning Best Actress at Cannes and the film the Palme d'Or. That friction may have been caused by her taking on an emotionally fraught role, especially in the second half of the film as Selma faces execution for a murder she did not intend for reasons she cannot share or else risk the wellbeing of her son. The situation onset may have also not been helped by alleged events that came out during the #MeToo, which while referenced in the episode can be found more fully here: https://www.nme.com/news/music/bjork-lends-voice-metoo-campaign-detail-sexual-harassment-hands-danish-director-lars-von-trier-2150898 As to that handheld camera style, often held by LVT himself, its digital video and potentially jarring, anti-Hollywood time cuts are complimented with a second camera aesthetic reserved for the musical moments, called “100 cameras.” This technique involved using a hundred stationary DV cameras of lesser quality than the one used for handheld footage. The hope for this multitude of cameras was for them to capture a single take of a performance without different setups. These cameras were remotely operated on ten monitors hardwired with a toggle switch inside a special construction trailer hidden in the background of the shot. Alas, this hope, unrealized, for the capture of movement to allow smoother cutting than the time cuts LVT used for the handhold was not to be. Yet the hundred camera experiment would still allow for a different feel and aesthetic from the handheld footage, especially when their transfer to film used cathode ray tube (verses the sharper laser transfer for main handheld DV camera). So tune in on your home system or your crystal radio on the a.m. dial, close your eyes, and let the dulcet voices of our four hosts' song set against industrial percussion transport you up through your ceiling and into cinema heaven. Clang! Bang! Clatter, crash, clack! THEME SONG BY: WEIRD A.I.Email: thegoodthepodandtheugly@gmail.comFacebook: https://m.facebook.com/TGTPTUInstagram: https://instagram.com/thegoodthepodandtheugly?igshid=um92md09kjg0Bluesky: @mrkoral.bsky.socialYouTube: https://www.youtube.com/channel/UC6mI2plrgJu-TB95bbJCW-gBuzzsprout: https://thegoodthepodandtheugly.buzzsprout.com/Letterboxd (follow us!):Ken: Ken KoralRyan: Ryan Tobias
Throwback Thursday time!Clang, clang, clang went the trolley!Yes, this week, we're taking you back to one of our episodes covering a classic movie, it's legacy and whether or not it deserves to be labelled a classic.It's Meet Me in St. LouisGive it a listen!Original air date: 19/01/2021· Please note, this is an un-altered release of the original episode. All references and content are accurate and relevant as of the original release date but may now be out of date.All the usual links below:Apple/iPhone:https://podcasts.apple.com/gb/podcast/films-n-that/id1470141261Spotify:https://open.spotify.com/show/1C4LiOrMZTD90e9tbB5EQOAcast:https://feeds.acast.com/public/shows/6071ac061216e55e7a95b11bYouTube: https://www.youtube.com/channel/UCIZopXPQHmlSnpgwtr2_ROQIf you'd like to get in touch, then the email is filmsandthatpod@gmail.com and we're on all the usual social media platforms if just search for Just Films & that and you should find us!Our Website ishttps://www.justfilmsandthatpod.com/Our Patreon is:https://www.patreon.com/justfilmsandthatCheers!The Just Films & That team Get bonus content on Patreon Hosted on Acast. See acast.com/privacy for more information.
“Where shall I begin?” Subscribe here to be notified when the postman's come. James Cavell played by Darren Brown Douglas Kelly, Edward Tenlinger and Thomas the Hotel Footman played by Micah Stock Pamela Kelly played by Christine Brunner Adapted from the novel 'Dearest' by Michael London Production, Editing, & Sound Design by George Drake, Jr. Music Composition by Mustafa Shaheen This series was made possible by a generous grant from the Montgomery County Arts & Cultural District with assistance from Culture Works. This episode of Dearest features the following sounds from Freesound.org: Large heavy door by SonicRealityFX, Small Crowd pre-concert talking party bar walla t... by JohnsonBrandEditing, Fire.WAV by inchadney, Fire Iron, Poking Wood, Fireplace, Indoors, Clang... by ninjaotter, Wood fire in a fireplace / living room by flwrpwr, People Having Dinner v2.wav by JiggleSticks, Party Pack, Match, Ignite, 01-01.wav by Inspector J
Benjamin Summerton joins Timur and Phil. Ben talks to us about what led him to benchmark the impact of the final and noexcept keywords, how to interpret his results, and the project that inspired him to do so in the first place. Show Notes News Boost 1.86 released RealtimeSanitizer - new real-time safety testing tool for C and C++ projects that comes with Clang 20 "Honey, I shrunk {fmt}: bringing binary size to 14k and ditching the C++ runtime" Links Previous episodes covering std lib implementations: Stephan T. Lavavej (MSVC) Stephan T. Lavavej and Sy Brand (MSVC) Billy O'Neil (MSVC) Marshall Clow (libc++) Eric Fiselier (libc++) "noexcept affects libstdc++'s unordered_set" - Arthur O'Dwyer Episode with Martin Hořeňovský, discussing non-portal random distribution Episode with Frances Buontempo, also mentioning random numbers and the portable distribution issue "Free Your Functions" (video) - Klaus Iglberger (timed link to the bit that talks about performance) Ben's PSRayTracing repo
Swoosh! Clang! Welcome to Swashbuckling September! We kick it off with one of the greats, The Adventures of Robin Hood from 1938! We discuss what we did on our summer vacations, the incredible display of hats in this film, Errol Flynn: despicable person and we discuss our experiences with firearms that tint our movie watching experience. Check it out! Register for our upcoming FREE live show Roger Ebert's review For all of our bonus episodes check out our Patreon Patreon supporters help pick episodes, monthly themes and get access to all of our additional shows and our Patron exclusive Discord. It's only the price of a single cup of coffee ($5 a month!) Visit our website and send us an email! Follow Movie Friends on Twitter and Instagram You scrolled this far? That's impressive.
APP OF THE DAY - CLANG SOUNDBOARD by 101.9POR
Episode 58: Georgie JonesGeorgie Jones is a writer and performer known for her humour and heart. She was a Roundhouse Poetry Slam finalist, before joining the Roundhouse Poetry Collective and subsequently spent a year as Roundhouse Resident Artist.It's all a bit Roundhouse centric this, isn't it? But WAIT, there's MORE! Georgie's one-woman shows combine spoken word, stand up and storytelling and have been commissioned by China Plate, MAC, Warwick Arts Centre, In Good Company and generously supported by the Arts Council.She has performed in theatres, at festivals and on radio stations all over the UK. She is often praised for her warmth and wit, so took her poetry to social media in search of other, more varied, compliments. Georgie has amassed over 140k followers, over 5.8 million views and her work has been shared by Malorie Blackman, Adrian Lester and British pop-rock band Bastille who said her poetry was ‘fucking brilliant'. Clang.When she's not peddling her own work, Georgie loves a bit of collaboration. She has worked with numerous brands to create bespoke poetry commissions, ranging from independent body-positive slow-fashion brand LAW Design Studio to the Woodland Trust and everything in between. She combines her love of writing with her love of love through offering a bespoke ghostwriting service for weddings, speeches and events.Georgie also writes and performs with Chortle Award nominated sketch group Just These Please who have sold out runs at Edinburgh Fringe and Soho Theatre respectively.#hygystpod #GeorgieJones #Roundhouse #Chortle #Fringe #SohoTheatre #LAWDDesignStudio Have You Got Your Sh*t Together? with Caitlin O'Ryan, is a podcast that celebrates not having your sh*t together! In each episode, Caitlin interviews guests who seemingly “have their sh*t together” - be that in life/love/work/hobbies. Throughout the conversation, the questions unveil whether they actually do, or whether the whole concept is a lie! With a mix of guests from various backgrounds, the podcast is sure to be relatable, honest, and an antidote to Instagram culture. Producer - Ant Hickman (www.ahickman.uk)Artwork - Tim Saunders (www.instagram.com/timsaunders.design)Photography - Patch Bell (www.patchstudio.uk)Music - Cassia - 'Slow' (www.wearecassia.com)Web: www.hygystpod.comInsta: www.instgram.com/hygystpodEmail: hygystpod@gmail.comRSS: https://feeds.acast.com/public/shows/644a8e8eadac0f0010542d86 Hosted on Acast. See acast.com/privacy for more information.
In this episode, Conor and Bryce chat with Doug Gregor from Apple about the Swift programming language!Link to Episode 184 on WebsiteDiscuss this episode, leave a comment, or ask a question (on GitHub)TwitterADSP: The PodcastConor HoekstraBryce Adelstein LelbachAbout the Guest:Douglas Gregor is is a Distinguished Engineer at Apple working on the Swift programming language, compiler, and related libraries and tools. He is code owner emeritus of the Clang compiler (part of the LLVM project), a former member of the ISO C++ committee, and a co-author on the second edition of C++ Templates: The Complete Guide. He holds a Ph.D. in computer science from Rensselaer Polytechnic Institute.Show NotesDate Recorded: 2024-04-29Date Released: 2024-05-31Swift Programming LanguageSwift ActorsD Programming LanguageRust Programming LanguageFearless Concurrency? Understanding Concurrent Programming Safety in Real-World Rust SoftwareSwift Protocols2022 LLVM Dev Mtg: Implementing Language Support for ABI-Stable Software Evolution in Swift and LLVMOxide Episode - Discovering the XZ Backdoor with Andres FreundSwift Algorithms LibraryIntro Song InfoMiss You by Sarah Jansen https://soundcloud.com/sarahjansenmusicCreative Commons — Attribution 3.0 Unported — CC BY 3.0Free Download / Stream: http://bit.ly/l-miss-youMusic promoted by Audio Library https://youtu.be/iYYxnasvfx8
It's hogs all the way down this week as we're joined by Alex Paul, guitarist in Girih and design and manufacturing brain behind Robot Graves Industries. You might think we talk about aluminum guitar necks, but we get too busy asking the hogline about their favorite Electro Harmonix pedals, then dive deep on Alex's favorite: the HOG Harmonic Octave Generator V1. We talk about the utility of six footswitches vs a suite of MIDI functionality, and the je ne sais quoi of sound quality across pedal versions. We also talk a lot about what it means to perform music, and the way technology can both help and hinder the interaction between artist and audience.Check out Robot Graves: https://robotgraves.com/Listen to Girih: https://girih.bandcamp.com/Buy Old Blood pedals: http://www.oldbloodnoise.comJoin the conversation in Discord: https://discord.com/invite/PhpA5MbN5uFollow us on the socials: @robotgraves, @oldbloodnoise, @andyothling, @danfromdsfLeave us a voicemail at 505-633-4647!
In this episode, Conor and Bryce chat with Doug Gregor from Apple about the Swift programming language!Link to Episode 183 on WebsiteDiscuss this episode, leave a comment, or ask a question (on GitHub)TwitterADSP: The PodcastConor HoekstraBryce Adelstein LelbachAbout the Guest:Douglas Gregor is is a Distinguished Engineer at Apple working on the Swift programming language, compiler, and related libraries and tools. He is code owner emeritus of the Clang compiler (part of the LLVM project), a former member of the ISO C++ committee, and a co-author on the second edition of C++ Templates: The Complete Guide. He holds a Ph.D. in computer science from Rensselaer Polytechnic Institute.Show NotesDate Recorded: 2024-04-29Date Released: 2024-05-24Swift Programming LanguageWWDC 2014 Swift AnnouncementSwift on LanguishIntro Song InfoMiss You by Sarah Jansen https://soundcloud.com/sarahjansenmusicCreative Commons — Attribution 3.0 Unported — CC BY 3.0Free Download / Stream: http://bit.ly/l-miss-youMusic promoted by Audio Library https://youtu.be/iYYxnasvfx8
In this episode, Conor and Bryce chat with Doug Gregor from Apple about C++11 Variadic Templates, C++11 std::tuple, C++17 std::variant, Swift and more!Link to Episode 182 on WebsiteDiscuss this episode, leave a comment, or ask a question (on GitHub)TwitterADSP: The PodcastConor HoekstraBryce Adelstein LelbachAbout the Guest:Douglas Gregor is is a Distinguished Engineer at Apple working on the Swift programming language, compiler, and related libraries and tools. He is code owner emeritus of the Clang compiler (part of the LLVM project), a former member of the ISO C++ committee, and a co-author on the second edition of C++ Templates: The Complete Guide. He holds a Ph.D. in computer science from Rensselaer Polytechnic Institute.Show NotesDate Recorded: 2024-04-29Date Released: 2024-05-17C++11 Variadic Templates / Parameter Packs / ExpansionC++26 Pack IndexingC++11 std::tupleC++17 std::variantC++11 Digit SeparatorsSwift Programming LanguageHPX (High Performance ParalleX)Intro Song InfoMiss You by Sarah Jansen https://soundcloud.com/sarahjansenmusicCreative Commons — Attribution 3.0 Unported — CC BY 3.0Free Download / Stream: http://bit.ly/l-miss-youMusic promoted by Audio Library https://youtu.be/iYYxnasvfx8
In this episode, Conor and Bryce chat with Doug Gregor from Apple about the history of C++0x Concepts (part 2).Link to Episode 181 on WebsiteDiscuss this episode, leave a comment, or ask a question (on GitHub)TwitterADSP: The PodcastConor HoekstraBryce Adelstein LelbachAbout the Guest:Douglas Gregor is is a Distinguished Engineer at Apple working on the Swift programming language, compiler, and related libraries and tools. He is code owner emeritus of the Clang compiler (part of the LLVM project), a former member of the ISO C++ committee, and a co-author on the second edition of C++ Templates: The Complete Guide. He holds a Ph.D. in computer science from Rensselaer Polytechnic Institute.Show NotesDate Recorded: 2024-04-29Date Released: 2024-05-10C++20 ConceptsSwift Programming LanguageElements of ProgrammingTecton: A Language for Manipulating Generic ObjectsGeneric Programming by David Musser and Alexander StepanovOriginal paper on concepts for C++0x (Stroustrup and Dos Reis)C++ Concepts vs Rust Traits vs Haskell Typeclasses vs Swift Protocols - Conor Hoekstra - ACCU 2021Paper on the implementation of concepts in ConceptGCC (Gregor, Siek)C++0x Concepts proposal that explains the model (Gregor, Stroustrup)Language wording for concepts that went into C++0xDoug's last-ditch effort to bring back a simpler C++0x Concepts model using archetypes for type checkingJeremy Siek's extensive C++0x Concepts writeupType-Soundness and Optimization in the Concepts ProposalIntro Song InfoMiss You by Sarah Jansen https://soundcloud.com/sarahjansenmusicCreative Commons — Attribution 3.0 Unported — CC BY 3.0Free Download / Stream: http://bit.ly/l-miss-youMusic promoted by Audio Library https://youtu.be/iYYxnasvfx8
In this episode, Conor and Bryce chat with Doug Gregor from Apple about the history of C++0x Concepts.Link to Episode 180 on WebsiteDiscuss this episode, leave a comment, or ask a question (on GitHub)TwitterADSP: The PodcastConor HoekstraBryce Adelstein LelbachAbout the Guest:Douglas Gregor is is a Distinguished Engineer at Apple working on the Swift programming language, compiler, and related libraries and tools. He is code owner emeritus of the Clang compiler (part of the LLVM project), a former member of the ISO C++ committee, and a co-author on the second edition of C++ Templates: The Complete Guide. He holds a Ph.D. in computer science from Rensselaer Polytechnic Institute.Show NotesDate Recorded: 2024-04-29Date Released: 2024-05-03C++20 ConceptsSwift Programming LanguageElements of ProgrammingTecton: A Language for Manipulating Generic ObjectsGeneric Programming by David Musser and Alexander StepanovOriginal paper on concepts for C++0x (Stroustrup and Dos Reis)C++ Concepts vs Rust Traits vs Haskell Typeclasses vs Swift Protocols - Conor Hoekstra - ACCU 2021Paper on the implementation of concepts in ConceptGCC (Gregor, Siek)C++0x Concepts proposal that explains the model (Gregor, Stroustrup)Language wording for concepts that went into C++0xDoug's last-ditch effort to bring back a simpler C++0x Concepts model using archetypes for type checkingJeremy Siek's extensive C++0x Concepts writeupIntro Song InfoMiss You by Sarah Jansen https://soundcloud.com/sarahjansenmusicCreative Commons — Attribution 3.0 Unported — CC BY 3.0Free Download / Stream: http://bit.ly/l-miss-youMusic promoted by Audio Library https://youtu.be/iYYxnasvfx8
Do you know slang?HighkeyiykykBet!FlexFell off... and so many others To subscribe to The Pete McMurray Show Podcast just click here
This week on the show, The story of SSH getting port 22, GGC using Clang, AuxRunner, Stabweek, Using a Kensington SlimBladePro on OpenBSD, and more... NOTES This episode of BSDNow is brought to you by Tarsnap (https://www.tarsnap.com/bsdnow) and the BSDNow Patreon (https://www.patreon.com/bsdnow) Headlines The story of getting SSH port 22 (https://www.ssh.com/academy/ssh/port#the-story-of-getting-ssh-port-22) Can GCC use Clang as its assembler? (https://briancallahan.net/blog/20240122.html) News Roundup AUXrunner: a macOS QEMU-based app for running A/UX (https://mendelson.org/auxrunner.html) Stabweek (https://lists.freebsd.org/archives/freebsd-current/2024-February/005657.html) Using the Kensington SlimBlade Pro TrackBall with OpenBSD (https://www.tumfatig.net/2024/using-the-kensington-slimblade-pro-trackball-with-openbsd/) Running 9front on an emulated SGI Indy via MAME (https://posixcafe.org/blogs/2024/01/01/0/) Beastie Bits Huffman Codes – How Do They Work? (https://two-wrongs.com/huffman-codes-how-do-they-work) NetBSD 10.0_RC5 (https://mail-index.netbsd.org/source-changes/2024/02/27/msg150156.html) New code for SIGILL faults help identify misbranches (https://www.undeadly.org/cgi?action=article;sid=20240222183703) New Illumos telegram channel (https://t.me/illumosDistroes) The Jan Feb issues of the FreeBSD Journal is here (https://freebsdfoundation.org/blog/the-january-february-2024-issue-of-the-freebsd-journal-is-here/) Tarsnap This weeks episode of BSDNow was sponsored by our friends at Tarsnap, the only secure online backup you can trust your data to. Even paranoids need backups. Feedback/Questions Send questions, comments, show ideas/topics, or stories you want mentioned on the show to feedback@bsdnow.tv (mailto:feedback@bsdnow.tv) Join us and other BSD Fans in our BSD Now Telegram channel (https://t.me/bsdnow)
Despite the San Diego Gulls getting outplayed for the most part in their past two games, they managed to earn a split thanks to a stellar performance from Calle Clang. JD Hernandez is back to talk about the similarities in (a lack of) defense between the Ducks and the Gulls. While the Gulls couldn't make it two consecutive victories against the first-place Firebirds, they did earn an important two points against their biggest rival, the Ontario Reign. Finally, Calle Clang was spectacular single-handedly getting those two points for San Diego... but are those two points enough to give the Gulls a shot at the playoffs?IbottaIbotta is a free app that gives you the most cash back every time you shop on hundreds of items from groceries to beauty supplies to toys. Right now, Ibotta is offering our listeners $5 just for trying Ibotta by using the code LOCKEDONNHL when you download the free app in the App Store or Google Play store. RobinhoodRobinhood has the only IRA that gives you a 3% boost on every dollar you contribute when you subscribe to Robinhood Gold. Now through April 30th, Robinhood is even boosting every single dollar you transfer in from other retirement accounts with a 3% match. Available to U.S. customers in good standing. Robinhood Financial LLC (member SIPC), is a registered broker dealer.
James William Moore is not only a much-sought after and admired educator, but he is also an international lens-based artist known for his use of camp and kitsch aesthetics to create surreal and thought-provoking cinematic experiences through photography, video, projection mapping, and installations. Through his work, Moore appropriates politics, American pop culture, and everyday life creating a visual language that is both humorous and deeply meaningful. He has always been drawn to the power of visual storytelling, as seen in his series Tilting at Windmills, Get a Clue, and Portrait of a Teller's Fortune as he brings his imagination to life by combining fact and fantasy.To James, appropriation is not a dirty word. To him it is a word that has been much maligned over the years. “However, when we aren't appropriating cultures, the power of appropriation is limitless, “ said James. “ To be honest, as much as kitsch and camp form the heart of my storytelling art practice, the soul of my artwork is appropriation. Appropriation refers to taking something of someone else's and making it our own. When I look back over my work, I see a heavy influence coming from artists like Cindy Sherman, René Magritte, Philip-Lorca diCorcia, Andy Warhol, and Edward Hopper. Whether it's a subconscious passion for architecture and mundane of Hopper, the vibrant storytelling with saturated imagery of diCorcia, or the sheer gaudiness of Warhol – I see the influence of these masters on my work. Through my reverent appreciation of these artists, I appropriate their style, subject matter, visual composition, and techniques.”Moore completed his Master of Fine Art, with a concentration in photography, at San José State University. He has taught photography at SJSU and Gavilan College. He also led a workshop on Adobe Photoshop to assist with the City of San Jose's Cultural History and Postcard Public Art Project. His work has been seen in group shows, with highlights including: Clang, Clang Clang went the Trolley at Rayko Galleries' SHOWCASE (2012, San Francisco), Spin Me ‘Round at Pacific Art League's Carnevale (2010, Palo Alto), Alone with Dino at 1650 Gallery's Dudes, Bros, & Gentlemen (2016, Los Angeles), Dances at Windmills at JJ&A PopUP Gallery's Unconventional Urban Ballet (2014, Palm Springs) and selected images from Madame B's Tarot Readings appearing at FotoNostrum (2023, Barcelona). Moore's solo show highlights include: 40 at Paragon Restaurant, Carnevale & Kimonos at Read Brown Salon (2016, Palm Springs), Get a Clue at San Jose State University's Black Gallery (2020), and Madame B's Tarot Readings at Jo Farb Hernandez Gallery. His public art participation includes Red Obi from the Obon series in the Japantown Mural Project (2013, San Jose) and Judgement from the Madame B's Tarot Readings series in Expo Metro's Billboard Art Project (2023, Barcelona).
This is a recap of the top 10 posts on Hacker News on December 2nd, 2023.This podcast was generated by wondercraft.ai(00:36): Not a real engineer (2019)Original post: https://news.ycombinator.com/item?id=38503486&utm_source=wondercraft_ai(02:19): Infants understand language via rhythm and tone rather than individual soundsOriginal post: https://news.ycombinator.com/item?id=38500906&utm_source=wondercraft_ai(04:11): Clang now makes binaries an original Pi B+ can't runOriginal post: https://news.ycombinator.com/item?id=38504134&utm_source=wondercraft_ai(06:08): GQL – Git Query LanguageOriginal post: https://news.ycombinator.com/item?id=38498688&utm_source=wondercraft_ai(08:06): UniFi ExpressOriginal post: https://news.ycombinator.com/item?id=38504027&utm_source=wondercraft_ai(09:44): Cicadas are so loud, fiber optic cables can ‘hear' themOriginal post: https://news.ycombinator.com/item?id=38500065&utm_source=wondercraft_ai(11:45): Is Ada safer than Rust?Original post: https://news.ycombinator.com/item?id=38498775&utm_source=wondercraft_ai(13:45): Can't sign in with FIDO2 key on office.comOriginal post: https://news.ycombinator.com/item?id=38502340&utm_source=wondercraft_ai(15:18): Open-source drawing tool – ExcalidrawOriginal post: https://news.ycombinator.com/item?id=38499375&utm_source=wondercraft_ai(16:55): Mundane emotions: Losing yourself in boredom, time and technology (2022)Original post: https://news.ycombinator.com/item?id=38500681&utm_source=wondercraft_aiThis is a third-party project, independent from HN and YC. Text and audio generated using AI, by wondercraft.ai. Create your own studio quality podcast with text as the only input in seconds at app.wondercraft.ai. Issues or feedback? We'd love to hear from you: team@wondercraft.ai
FreeBSD 14 has been released, Reading your RSS feed on FreeBSD, Manipulate PDF files easily with pdftk, clang(1)/llvm updated to version 16 in OpenBSD, NetBSD Security Advisory: multiple vulnerabilities in ftpd(8), and more NOTES This episode of BSDNow is brought to you by Tarsnap (https://www.tarsnap.com/bsdnow) and the BSDNow Patreon (https://www.patreon.com/bsdnow) Headlines FreeBSD 14 (https://www.freebsd.org/releases/14.0R/relnotes/) • [Quick update](https://www.daemonology.net/blog/2023-11-21-late-breaking-FreeBSD-14-breakage.html) • [Vermaden's FreeBSD 14 valuable news] (https://vermaden.wordpress.com/2023/11/17/valuable-freebsd-14-0-release-updates) News Roundup Reading your RSS feed on FreeBSD (https://www.ncartron.org/reading-your-rss-feed-on-freebsd.html) Manipulate PDF files easily with pdftk (https://dataswamp.org/~solene/2023-08-19-pdftk-guide.html) clang(1)/llvm updated to version 16 (https://www.undeadly.org/cgi?action=article;sid=20231113160314&utm_source=bsdweekly) NetBSD Security Advisory 2023-007: multiple vulnerabilities in ftpd(8) (https://bsdsec.net/articles/netbsd-security-advisory-2023-007-multiple-vulnerabilities-in-ftpd-8) Tarsnap This weeks episode of BSDNow was sponsored by our friends at Tarsnap, the only secure online backup you can trust your data to. Even paranoids need backups. Feedback/Questions Brad - zpool disk allocation questions (https://github.com/BSDNow/bsdnow.tv/blob/master/episodes/535/feedback/Brad%20-%20zpool%20disk%20allocation%20questions.md) Kevin - shell question (https://github.com/BSDNow/bsdnow.tv/blob/master/episodes/535/feedback/Kevin%20-%20shell%20question.md) Send questions, comments, show ideas/topics, or stories you want mentioned on the show to feedback@bsdnow.tv (mailto:feedback@bsdnow.tv) Join us and other BSD Fans in our BSD Now Telegram channel (https://t.me/bsdnow)
In this episode of "Locked On Hornets," host Doug Branson welcomes special guest David Walker to discuss the latest buzz surrounding the Charlotte Hornets. All the news out of training is overwhelmingly positive. Is this business as usual or is this team really built different than previous seasons? Head Coach Steve Clifford is getting rave reviews from his players. He seems to be the right coach for the moment but what about long term? LaMelo Ball is names one of Zach Lowe's 5 most intriguing players of the NBA season and we do another edition of Stang'n or Clang'n with sneakers, weight gain and team governor's awkwardly shooting basketballs.Follow & Subscribe on all Podcast platforms…
In this episode of "Locked On Hornets," host Doug Branson welcomes special guest David Walker to discuss the latest buzz surrounding the Charlotte Hornets. All the news out of training is overwhelmingly positive. Is this business as usual or is this team really built different than previous seasons? Head Coach Steve Clifford is getting rave reviews from his players. He seems to be the right coach for the moment but what about long term? LaMelo Ball is names one of Zach Lowe's 5 most intriguing players of the NBA season and we do another edition of Stang'n or Clang'n with sneakers, weight gain and team governor's awkwardly shooting basketballs. Follow & Subscribe on all Podcast platforms…
How does Brandon Miller ascend to every night starter and Hornets star as quickly as possible? Doug has the one thing he should focus on and the stats to back it up. Plus, key injury updates for Cody Martin and Bryce McGowens and how their potential absences could affect who gets a look at training camp. Finally, Doug brings back Stang'n or Clang'n to tell you what is hot or not in the Hornets universe right now.Follow & Subscribe on all Podcast platforms…
How does Brandon Miller ascend to every night starter and Hornets star as quickly as possible? Doug has the one thing he should focus on and the stats to back it up. Plus, key injury updates for Cody Martin and Bryce McGowens and how their potential absences could affect who gets a look at training camp. Finally, Doug brings back Stang'n or Clang'n to tell you what is hot or not in the Hornets universe right now. Follow & Subscribe on all Podcast platforms…
Want to help define the AI Engineer stack? Have opinions on the top tools, communities and builders? We're collaborating with friends at Amplify to launch the first State of AI Engineering survey! Please fill it out (and tell your friends)!If AI is so important, why is its software so bad?This was the motivating question for Chris Lattner as he reconnected with his product counterpart on Tensorflow, Tim Davis, and started working on a modular solution to the problem of sprawling, monolithic, fragmented platforms in AI development. They announced a $30m seed in 2022 and, following their successful double launch of Modular/Mojo
It's Below Deck Down Under S2, Ep10 Kiss, Kiss, Clang, ClangIn this episode:Chef is (rightly so) pissed at Culver and Jaimee It's a one day charter with a lovely family vibe ANDThe deck crew make ANOTHER costly mistake… Support the showClick the link above to head over to Patreon where you can join our community & access ad-free, early release episodes from $2 per month (USD) or for $5 USD per month enjoy all the above perks AND our weekly bonus episode 'The Wrap Up' for Patreon subscribers only! You can also support us by clicking the link below to purchase a 'virtual coffee'! https://www.buymeacoffee.com/theaftdeckpodAs an entirely independent podcast, we appreciate ALL of your support!!! Ask us questions, give us your thoughts on the show or tell us what you'd like us to cover in future episodes at theaftdeckpod@gmail.com or DM us @theaftdeck.podMusic by: AudioCoffee (Denys Kyshchuk)
Jeremy and Philip react to chapters 57-76 of Light Bringer, sharing their first impressions and big takeaways. (There are no Light Bringer spoilers beyond chapter 76. Clang, clang, clang)Warning: All episodes contain spoilers for the Red Rising book seriesEmail hailreaperpod@gmail.comSubscribe to our YouTube ChannelInstagram & Twitter @hailreaperpodArt by Jeff HalseyAdditional production by Tim MountHail Reaper is a production of Deepgrave StudiosThanks to the Howlers that made this episode possible:Affable Batty, Data, Fury Caesura, Maestro Tracker, SushiWestern, The Scepter, Thorn, and WitFlash.
Mature shops should be looking to a security architecture process to help scale their systems and embrace security by design. We talk about what it means to create a security architecture process, why it's not just another security review, and why it requires security to dig into engineering. Segment Resources: - https://www.lacework.com/ciso-boardbook/ciso/merritt-baer Zap gets a jolt of new support, using Clang for security research, LLM attacks learn models, Rust visualizes dependencies, a National Cyber Workforce and Education Strategy, and more! Visit https://securityweekly.com/asw for all the latest episodes! Follow us on Twitter: https://www.twitter.com/secweekly Like us on Facebook: https://www.facebook.com/secweekly Show Notes: https://securityweekly.com/asw-250
Zap gets a jolt of new support, using Clang for security research, LLM attacks learn models, Rust visualizes dependencies, a National Cyber Workforce and Education Strategy, and more! Visit https://www.securityweekly.com/asw for all the latest episodes! Show Notes: https://securityweekly.com/asw-250
Mature shops should be looking to a security architecture process to help scale their systems and embrace security by design. We talk about what it means to create a security architecture process, why it's not just another security review, and why it requires security to dig into engineering. Segment Resources: - https://www.lacework.com/ciso-boardbook/ciso/merritt-baer Zap gets a jolt of new support, using Clang for security research, LLM attacks learn models, Rust visualizes dependencies, a National Cyber Workforce and Education Strategy, and more! Visit https://securityweekly.com/asw for all the latest episodes! Follow us on Twitter: https://www.twitter.com/secweekly Like us on Facebook: https://www.facebook.com/secweekly Show Notes: https://securityweekly.com/asw-250
Clang! Blong!! Flenk!!! and other pinball noises. Stuart Gipp, Diamond Feit and Ralph Barbagallo (Demon's Tilt, Xenotilt) bring their multi-balls to this fast-paced discussion of the fine sport of pinball on the television. Tilt!
This week, I spoke to percussionist, Scotty Irving, via Skype on two separate days due to technical difficulties and crossed wires! Scotty is best known as the one man behind the Christian Harsh Noise act, Clang Quartet, but we also talk about his time behind the drums in Geezer Lake, Benj-O-Matic, and Spirit of Hamlet, as well as school band, but the main focus of this one is Scotty's faith. Clang Quartet: https://www.facebook.com/clangquartetyoutube.com/clangquartet1967Armor of God: https://youtu.be/dwNBDz1w1tgGeezer. Lake: https://www.facebook.com/geezerlakeSpirit of Hamlet: https://spiritofhamlet.bandcamp.com/https://www.facebook.com/profile.php?id=100088661322401Benjy Johnson: http://www.benjyjohnsonmusic.com/Songs for GGHT79:Geezer Lake - Field BlisterGeezer Lake - My Ugly Body TempleClang Quartet - With Weariness and Heavy Heart (https://norentrecords.bandcamp.com/album/a-slow-death-for-the-peacemaker-nrr163) Clang Quartet Live at Tulsa Noisefest (https://youtu.be/PU61tjOdVDY)Clang Quartet. - Love Thy Neighbor (https://cruelsymphonies.bandcamp.com/album/judge-thy-neighbor-love-thyself)Benj-O-Matic - Shoot the Pig (http://www.benjyjohnsonmusic.com/)Spirit of Hamlet - Strike it Rich (https://spiritofhamlet.bandcamp.com/)Visit Gabba Gabba Huh? Records & Vintage Goods! Located on the first floor of Eastridge Mall in Gastonia, NC, near Dillards!Visit Hobo Wolfman Records, located inside Junky Monkey 3041 Kerr Ave, Wilmington, NC!Support the show
Chris Lattner is a legendary software and hardware engineer, leading projects at Apple, Tesla, Google, SiFive, and Modular AI, including the development of Swift, LLVM, Clang, MLIR, CIRCT, TPUs, and Mojo. Please support this podcast by checking out our sponsors: - iHerb: https://lexfridman.com/iherb and use code LEX to get 22% off your order - Numerai: https://numer.ai/lex - InsideTracker: https://insidetracker.com/lex to get 20% off EPISODE LINKS: Chris's Twitter: https://twitter.com/clattner_llvm Chris's Website: http://nondot.org/sabre/ Mojo programming language: https://www.modular.com/mojo Modular AI: https://modular.com/ PODCAST INFO: Podcast website: https://lexfridman.com/podcast Apple Podcasts: https://apple.co/2lwqZIr Spotify: https://spoti.fi/2nEwCF8 RSS: https://lexfridman.com/feed/podcast/ YouTube Full Episodes: https://youtube.com/lexfridman YouTube Clips: https://youtube.com/lexclips SUPPORT & CONNECT: - Check out the sponsors above, it's the best way to support this podcast - Support on Patreon: https://www.patreon.com/lexfridman - Twitter: https://twitter.com/lexfridman - Instagram: https://www.instagram.com/lexfridman - LinkedIn: https://www.linkedin.com/in/lexfridman - Facebook: https://www.facebook.com/lexfridman - Medium: https://medium.com/@lexfridman OUTLINE: Here's the timestamps for the episode. On some podcast players you should be able to click the timestamp to jump to that time. (00:00) - Introduction (06:38) - Mojo programming language (16:55) - Code indentation (25:22) - The power of autotuning (35:12) - Typed programming languages (51:56) - Immutability (1:04:14) - Distributed deployment (1:38:41) - Mojo vs CPython (1:54:30) - Guido van Rossum (2:01:31) - Mojo vs PyTorch vs TensorFlow (2:04:55) - Swift programming language (2:10:27) - Julia programming language (2:15:32) - Switching programming languages (2:24:58) - Mojo playground (2:29:48) - Jeremy Howard (2:40:34) - Function overloading (2:48:59) - Error vs Exception (2:56:39) - Mojo roadmap (3:09:41) - Building a company (3:21:27) - ChatGPT (3:27:50) - Danger of AI (3:31:44) - Future of programming (3:35:01) - Advice for young people
Check out Niall's training company Feabhas.Feabhas training courseshttps://www.feabhas.com/course-listblog (over 10 years of content)https://blog.feabhas.comYouTube list:Is C+ a Safer C?https://www.youtube.com/watch?v=1lrN5bti-b4A brief introduction to concepts in Modern C++https://www.youtube.com/watch?v=Vdr-q-uEBjIWhy Rust won't replace C (just yet anyway)https://www.youtube.com/watch?v=ojEXMM_1bVAOther shownotes from things mentioned during the podcast:Doctest: https://github.com/doctest/doctestDev Containers for VSCode: https://microsoft.github.io/code-with-engineering-playbook/developer-experience/devcontainers/GodBolt compiler explorer: https://godbolt.org/CPP Insights: https://cppinsights.io/Catch2 unit testing framework: https://github.com/catchorg/Catch2Clang-tidy: https://clang.llvm.org/extra/clang-tidy/Clang-format: https://clang.llvm.org/docs/ClangFormat.htmlMbed OS for ARM: https://os.mbed.com/
It's now almost 6 months since Google declared Code Red, and the results — Jeff Dean's recap of 2022 achievements and a mass exodus of the top research talent that contributed to it in January, Bard's rushed launch in Feb, a slick video showing Google Workspace AI features and confusing doubly linked blogposts about PaLM API in March, and merging Google Brain and DeepMind in April — have not been inspiring. Google's internal panic is in full display now with the surfacing of a well written memo, written by software engineer Luke Sernau written in early April, revealing internal distress not seen since Steve Yegge's infamous Google Platforms Rant. Similar to 2011, the company's response to an external challenge has been to mobilize the entire company to go all-in on a (from the outside) vague vision.Google's misfortunes are well understood by now, but the last paragraph of the memo: “We have no moat, and neither does OpenAI”, was a banger of a mic drop.Combine this with news this morning that OpenAI lost $540m last year and will need as much as $100b more funding (after the complex $10b Microsoft deal in Jan), and the memo's assertion that both Google and OpenAI have “no moat” against the mighty open source horde have gained some credibility in the past 24 hours.Many are criticising this memo privately:* A CEO commented to me yesterday that Luke Sernau does not seem to work in AI related parts of Google and “software engineers don't understand moats”. * Emad Mostaque, himself a perma-champion of open source and open models, has repeatedly stated that “Closed models will always outperform open models” because closed models can just wrap open ones.* Emad has also commented on the moats he does see: “Unique usage data, Unique content, Unique talent, Unique product, Unique business model”, most of which Google does have, and OpenAI less so (though it is winning on the talent front)* Sam Altman famously said that “very few to no one is Silicon Valley has a moat - not even Facebook” (implying that moats don't actually matter, and you should spend your time thinking about more important things)* It is not actually clear what race the memo thinks Google and OpenAI are in vs Open Source. Neither are particularly concerned about running models locally on phones, and they are perfectly happy to let “a crazy European alpha male” run the last mile for them while they build actually monetizable cloud infrastructure.However moats are of intense interest by everybody keen on productized AI, cropping up in every Harvey, Jasper, and general AI startup vs incumbent debate. It is also interesting to take the memo at face value and discuss the searing hot pace of AI progress in open source. We hosted this discussion yesterday with Simon Willison, who apart from being an incredible communicator also wrote a great recap of the No Moat memo. 2,800 have now tuned in on Twitter Spaces, but we have taken the audio and cleaned it up here. Enjoy!Timestamps* [00:00:00] Introducing the Google Memo* [00:02:48] Open Source > Closed?* [00:05:51] Running Models On Device* [00:07:52] LoRA part 1* [00:08:42] On Moats - Size, Data* [00:11:34] Open Source Models are Comparable on Data* [00:13:04] Stackable LoRA* [00:19:44] The Need for Special Purpose Optimized Models* [00:21:12] Modular - Mojo from Chris Lattner* [00:23:33] The Promise of Language Supersets* [00:28:44] Google AI Strategy* [00:29:58] Zuck Releasing LLaMA* [00:30:42] Google Origin Confirmed* [00:30:57] Google's existential threat* [00:32:24] Non-Fiction AI Safety ("y-risk")* [00:35:17] Prompt Injection* [00:36:00] Google vs OpenAI* [00:41:04] Personal plugs: Simon and TravisTranscripts[00:00:00] Introducing the Google Memo[00:00:00] Simon Willison: So, yeah, this is a document, which Kate, which I first saw at three o'clock this morning, I think. It claims to be leaked from Google. There's good reasons to believe it is leaked from Google, and to be honest, if it's not, it doesn't actually matter because the quality of the analysis, I think stands alone.[00:00:15] If this was just a document by some anonymous person, I'd still think it was interesting and worth discussing. And the title of the document is We Have No Moat and neither does Open ai. And the argument it makes is that while Google and OpenAI have been competing on training bigger and bigger language models, the open source community is already starting to outrun them, given only a couple of months of really like really, really serious activity.[00:00:41] You know, Facebook lama was the thing that really kicked us off. There were open source language models like Bloom before that some G P T J, and they weren't very impressive. Like nobody was really thinking that they were. Chat. G P T equivalent Facebook Lama came out in March, I think March 15th. And was the first one that really sort of showed signs of being as capable maybe as chat G P T.[00:01:04] My, I don't, I think all of these models, they've been, the analysis of them has tend to be a bit hyped. Like I don't think any of them are even quite up to GT 3.5 standards yet, but they're within spitting distance in some respects. So anyway, Lama came out and then, Two weeks later Stanford Alpaca came out, which was fine tuned on top of Lama and was a massive leap forward in terms of quality.[00:01:27] And then a week after that Vicuna came out, which is to this date, the the best model I've been able to run on my own hardware. I, on my mobile phone now, like, it's astonishing how little resources you need to run these things. But anyway, the the argument that this paper made, which I found very convincing is it only took open source two months to get this far.[00:01:47] It's now every researcher in the world is kicking it on new, new things, but it feels like they're being there. There are problems that Google has been trying to solve that the open source models are already addressing, and really how do you compete with that, like with your, it's closed ecosystem, how are you going to beat these open models with all of this innovation going on?[00:02:04] But then the most interesting argument in there is it talks about the size of models and says that maybe large isn't a competitive advantage, maybe actually a smaller model. With lots of like different people fine tuning it and having these sort of, these LoRA l o r a stackable fine tuning innovations on top of it, maybe those can move faster.[00:02:23] And actually having to retrain your giant model every few months from scratch is, is way less useful than having small models that you can tr you can fine tune in a couple of hours on laptop. So it's, it's fascinating. I basically, if you haven't read this thing, you should read every word of it. It's not very long.[00:02:40] It's beautifully written. Like it's, it's, I mean, If you try and find the quotable lines in it, almost every line of it's quotable. Yeah. So, yeah, that's that, that, that's the status of this[00:02:48] Open Source > Closed?[00:02:48] swyx: thing. That's a wonderful summary, Simon. Yeah, there, there's so many angles we can take to this. I, I'll just observe one, one thing which if you think about the open versus closed narrative, Ima Mok, who is the CEO of Stability, has always been that open will trail behind closed, because the closed alternatives can always take.[00:03:08] Learnings and lessons from open source. And this is the first highly credible statement that is basically saying the exact opposite, that open source is moving than, than, than closed source. And they are scared. They seem to be scared. Which is interesting,[00:03:22] Travis Fischer: Travis. Yeah, the, the, the, a few things that, that I'll, I'll, I'll say the only thing which can keep up with the pace of AI these days is open source.[00:03:32] I think we're, we're seeing that unfold in real time before our eyes. And. You know, I, I think the other interesting angle of this is to some degree LLMs are they, they don't really have switching costs. They are going to be, become commoditized. At least that's, that's what a lot of, a lot of people kind of think to, to what extent is it Is it a, a rate in terms of, of pricing of these things?[00:03:55] , and they all kind of become roughly the, the, the same in, in terms of their, their underlying abilities. And, and open source is gonna, gonna be actively pushing, pushing that forward. And, and then this is kind of coming from, if it is to be believed the kind of Google or an insider type type mentality around you know, where is the actual competitive advantage?[00:04:14] What should they be focusing on? How can they get back in into the game? When you know, when, when, when, when currently the, the, the external view of, of Google is that they're kind of spinning their wheels and they have this code red,, and it's like they're, they're playing catch up already.[00:04:28] Like how could they use the open source community and work with them, which is gonna be really, really hard you know, from a structural perspective given Google's place in the ecosystem. But a, a lot, lot, a lot of jumping off points there.[00:04:42] Alessio Fanelli: I was gonna say, I think the Post is really focused on how do we get the best model, but it's not focused on like, how do we build the best product around it.[00:04:50] A lot of these models are limited by how many GPUs you can get to run them and we've seen on traditional open source, like everybody can use some of these projects like Kafka and like Alaska for free. But the reality is that not everybody can afford to run the infrastructure needed for it.[00:05:05] So I, I think like the main takeaway that I have from this is like, A lot of the moats are probably around just getting the, the sand, so to speak, and having the GPUs to actually serve these models. Because even if the best model is open source, like running it at large scale for an end is not easy and like, it's not super convenient to get a lot, a lot of the infrastructure.[00:05:27] And we've seen that model work in open source where you have. The opensource project, and then you have a enterprise cloud hosted version for it. I think that's gonna look really different in opensource models because just hosting a model doesn't have a lot of value. So I'm curious to hear how people end up getting rewarded to do opensource.[00:05:46] You know, it's, we figured that out in infrastructure, but we haven't figured it out in in Alans[00:05:51] Running Models On Device[00:05:51] Simon Willison: yet. I mean, one thing I'll say is that the the models that you can run on your own devices are so far ahead of what I ever dreamed they would be at this point. Like Vicuna 13 b i i, I, I think is the current best available open mo model that I've played with.[00:06:08] It's derived from Facebook Lama, so you can't use it for commercial purposes yet. But the point about MCK 13 B is it runs in the browser directly on web gpu. There's this amazing web l l M project where you literally, your browser downloaded a two gigabyte file. And it fires up a chat g D style interface and it's quite good.[00:06:27] It can do rap battles between different animals and all of the kind of fun stuff that you'd expect to be able to do the language model running entirely in Chrome canary. It's shocking to me that that's even possible, but that kind of shows that once, once you get to inference, if you can shrink the model down and the techniques for shrinking these models, the, the first one was the the quantization.[00:06:48] Which the Lama CPP project really sort of popularized Matt can by using four bits instead of 16 bit floating point numbers, you can shrink it down quite a lot. And then there was a paper that came out days ago suggesting that you can prune the models and ditch half the model and maintain the same level of quality.[00:07:05] So with, with things like that, with all of these tricks coming together, it's really astonishing how much you can get done on hardware that people actually have in their pockets even.[00:07:15] swyx: Just for completion I've been following all of your posts. Oh, sorry. Yes. I just wanna follow up, Simon. You're, you said you're running a model on your phone. Which model is it? And I don't think you've written it up.[00:07:27] Simon Willison: Yeah, that one's vina. I did, did I write it up? I did. I've got a blog post about how it it, it, it knows who I am, sort of, but it said that I invented a, a, a pattern for living called bear or bunny pattern, which I definitely didn't, but I loved that my phone decided that I did.[00:07:44] swyx: I will hunt for that because I'm not yet running Vic on my phone and I feel like I should and, and as like a very base thing, but I'll, okay.[00:07:52] Stackable LoRA Modules[00:07:52] swyx: Also, I'll follow up two things, right? Like one I'm very interesting and let's, let's talk about that a little bit more because this concept of stackable improvements to models I think is extremely interesting.[00:08:00] Like, I would love to MPM install abilities onto my models, right? Which is really awesome. But the, the first thing thing is under-discussed is I don't get the panic. Like, honestly, like Google has the most moats. I I, I was arguing maybe like three months ago on my blog. Like Google has the most mote out of a lot of people because, hey, we have your calendar.[00:08:21] Hey, we have your email. Hey, we have your you know, Google Docs. Like, isn't that a, a sufficient mode? Like, why are these guys panicking so much? I don't, I still don't get it. Like, Sure open source is running ahead and like, it's, it's on device and whatev, what have you, but they have so much more mode.[00:08:36] Like, what are we talking about here? There's many dimensions to compete on.[00:08:42] On Moats - Size, Data[00:08:42] Travis Fischer: Yeah, there's like one of, one of the, the things that, that the author you know, mentions in, in here is when, when you start to, to, to have the feeling of what we're trailing behind, then you're, you're, you're, you're brightest researchers jump ship and go to OpenAI or go to work at, at, at academia or, or whatever.[00:09:00] And like the talent drain. At the, the level of the, the senior AI researchers that are pushing these things ahead within Google, I think is a serious, serious concern. And my, my take on it's a good point, right? Like, like, like, like what Google has modes. They, they, they're not running outta money anytime soon.[00:09:16] You know, I think they, they do see the level of the, the defensibility and, and the fact that they want to be, I'll chime in the, the leader around pretty much anything. Tech first. There's definitely ha ha have lost that, that, that feeling. Right? , and to what degree they can, they can with the, the open source community to, to get that back and, and help drive that.[00:09:38] You know all of the llama subset of models with, with alpaca and Vicuna, et cetera, that all came from, from meta. Right. Like that. Yeah. Like it's not licensed in an open way where you can build a company on top of it, but is now kind of driving this family of, of models, like there's a tree of models that, that they're, they're leading.[00:09:54] And where is Google in that, in that playbook? Like for a long time they were the one releasing those models being super open and, and now it's just they, they've seem to be trailing and there's, there's people jumping ship and to what degree can they, can they, can they. Close off those wounds and, and focus on, on where, where they, they have unique ability to, to gain momentum.[00:10:15] I think is a core part of my takeaway from this. Yeah.[00:10:19] Alessio Fanelli: And think another big thing in the post is, oh, as long as you have high quality data, like you don't need that much data, you can just use that. The first party data loops are probably gonna be the most important going forward if we do believe that this is true.[00:10:32] So, Databricks. We have Mike Conover from Databricks on the podcast, and they talked about how they came up with the training set for Dolly, which they basically had Databricks employees write down very good questions and very good answers for it. Not every company as the scale to do that. And I think products like Google, they have millions of people writing Google Docs.[00:10:54] They have millions of people using Google Sheets, then millions of people writing stuff, creating content on YouTube. The question is, if you wanna compete against these companies, maybe the model is not what you're gonna do it with because the open source kind of commoditizes it. But how do you build even better data?[00:11:12] First party loops. And that's kind of the hardest thing for startups, right? Like even if we open up the, the models to everybody and everybody can just go on GitHub and. Or hugging face and get the waste to the best model, but get enough people to generate data for me so that I can still make it good. That's, that's what I would be worried about if I was a, a new company.[00:11:31] How do I make that happen[00:11:32] Simon Willison: really quickly?[00:11:34] Open Source Models are Comparable on Data[00:11:34] Simon Willison: I'm not convinced that the data is that big a challenge. So there's this PO project. So the problem with Facebook LAMA is that it's not available for, for commercial use. So people are now trying to train a alternative to LAMA that's entirely on openly licensed data.[00:11:48] And that the biggest project around that is this red pajama project, which They released their training data a few weeks ago and it was 2.7 terabytes. Right? So actually tiny, right? You can buy a laptop that you can fit 2.7 terabytes on. Got it. But it was the same exact data that Facebook, the same thing that Facebook Lamb had been trained on.[00:12:06] Cuz for your base model. You're not really trying to teach it fact about the world. You're just trying to teach it how English and other languages work, how they fit together. And then the real magic is when you fine tune on top of that. That's what Alpaca did on top of Lama and so on. And the fine tuning sets, it looks like, like tens of thousands of examples to kick one of these role models into shape.[00:12:26] And tens of thousands of examples like Databricks spent a month and got the 2000 employees of their company to help kick in and it worked. You've got the open assistant project of crowdsourcing this stuff now as well. So it's achievable[00:12:40] swyx: sore throat. I agree. I think it's a fa fascinating point. Actually, so I've heard through the grapevine then red pajamas model.[00:12:47] Trained on the, the data that they release is gonna be releasing tomorrow. And it's, it's this very exciting time because the, the, there, there's a, there's a couple more models that are coming down the pike, which independently we produced. And so yeah, that we, everyone is challenging all these assumptions from, from first principles, which is fascinating.[00:13:04] Stackable LoRA[00:13:04] swyx: I, I did, I did wanted to, to like try to get a little bit more technical in terms of like the, the, the, the specific points race. Cuz this doc, this doc was just amazing. Can we talk about LoRA. I, I, I'll open up to Simon again if he's back.[00:13:16] Simon Willison: I'd rather someone else take on. LoRA, I've, I, I know as much as I've read in that paper, but not much more than that.[00:13:21] swyx: So I thought it was this kind of like an optimization technique. So LoRA stands for lower rank adaptation. But this is the first mention of LoRA as a form of stackable improvements. Where he I forget what, let, just, let me just kind of Google this. But obviously anyone's more knowledgeable please.[00:13:39] So come on in.[00:13:40] Alessio Fanelli: I, all of Lauren is through GTS Man, about 20 minutes on GT four, trying to figure out word. It was I study computer science, but this is not this is not my area of expertise. What I got from it is that basically instead of having to retrain the whole model you can just pick one of the ranks and you take.[00:13:58] One of like the, the weight matrix tests and like make two smaller matrixes from it and then just two to be retrained and training the whole model. So[00:14:08] swyx: it save a lot of Yeah. You freeze part of the thing and then you just train the smaller part like that. Exactly. That seems to be a area of a lot of fruitful research.[00:14:15] Yeah. I think Mini GT four recently did something similar as well. And then there's, there's, there's a, there's a Spark Model people out today that also did the same thing.[00:14:23] Simon Willison: So I've seen a lot of LoRA stable, the stable diffusion community has been using LoRA a lot. So they, in that case, they had a, I, the thing I've seen is people releasing LoRA's that are like you, you train a concept like a, a a particular person's face or something you release.[00:14:38] And the, the LoRA version of this end up being megabytes of data, like, which is, it's. You know, it's small enough that you can just trade those around and you can effectively load multiple of those into the model. But what I haven't realized is that you can use the same trick on, on language models. That was one of the big new things for me in reading the the leaks Google paper today.[00:14:56] Alessio Fanelli: Yeah, and I think the point to make around on the infrastructure, so what tragedy has told me is that when you're figuring out what rank you actually wanna do this fine tuning at you can have either go too low and like the model doesn't actually learn it. Or you can go too high and the model overfit those learnings.[00:15:14] So if you have a base model that everybody agrees on, then all the subsequent like LoRA work is done around the same rank, which gives you an advantage. And the point they made in the, that, since Lama has been the base for a lot of this LoRA work like they own. The, the mind share of the community.[00:15:32] So everything that they're building is compatible with their architecture. But if Google Opensources their own model the rank that they chose For LoRA on Lama might not work on the Google model. So all of the existing work is not portable. So[00:15:46] Simon Willison: the impression I got is that one of the challenges with LoRA is that you train all these LoRAs on top of your model, but then if you retrain that base model as LoRA's becoming invalid, right?[00:15:55] They're essentially, they're, they're, they're built for an exact model version. So this means that being the big company with all of the GPUs that can afford to retrain a model every three months. That's suddenly not nearly as valuable as it used to be because now maybe there's an open source model that's five years old at this point and has like multiple, multiple stacks of LoRA's trained all over the world on top of it, which can outperform your brand new model just because there's been so much more iteration on that base.[00:16:20] swyx: I, I think it's, I think it's fascinating. It's I think Jim Fan from Envidia was recently making this argument for transformers. Like even if we do come up with a better. Architecture, then transformers, they're the sheer hundreds and millions of dollars that have been invested on top of transformers.[00:16:34] Make it actually there is some switching costs and it's not exactly obvious that better architecture. Equals equals we should all switch immediately tomorrow. It's, it's, it's[00:16:44] Simon Willison: kinda like the, the difficulty of launching a new programming language today Yes. Is that pipeline and JavaScript have a million packages.[00:16:51] So no matter how good your new language is, if it can't tap into those existing package libraries, it's, it's not gonna be useful for, which is why Moji is so clever, because they did build on top of Pips. They get all of that existing infrastructure, all of that existing code working already.[00:17:05] swyx: I mean, what, what thought you, since you co-create JAO and all that do, do we wanna take a diversion into mojo?[00:17:10] No, no. I[00:17:11] Travis Fischer: would, I, I'd be happy to, to, to jump in, and get Simon's take on, on Mojo. 1, 1, 1 small, small point on LoRA is I, I, I just think. If you think about at a high level, what the, the major down downsides are of these, these large language models. It's the fact that they well they're, they're, they're difficult to, to train, right?[00:17:32] They, they tend to hallucinate and they are, have, have a static, like, like they were trained at a certain date, right? And with, with LoRA, I think it makes it a lot more amenable to Training new, new updates on top of that, that like base model on the fly where you can incorporate new, new data and in a way that is, is, is an interesting and potentially more optimal alternative than Doing the kind of in context generation cuz, cuz most of like who at perplexity AI or, or any of these, these approaches currently, it's like all based off of doing real-time searches and then injecting as much into the, the, the local context window as possible so that you, you try to ground your, your, your, your language model.[00:18:16] Both in terms of the, the information it has access to that, that, that helps to reduce hallucinations. It can't reduce it, but helps to reduce it and then also gives it access to up-to-date information that wasn't around for that, that massive like, like pre-training step. And I think LoRA in, in, in mine really makes it more, more amenable to having.[00:18:36] Having constantly shifting lightweight pre-training on top of it that scales better than than normal. Pre I'm sorry. Fine tune, fine tuning. Yeah, that, that was just kinda my one takeaway[00:18:45] Simon Willison: there. I mean, for me, I've never been, I want to run models on my own hard, I don't actually care about their factual content.[00:18:52] Like I don't need a model that's been, that's trained on the most upstate things. What I need is a model that can do the bing and bar trick, right? That can tell when it needs to run a search. And then go and run a search to get extra information and, and bring that context in. And similarly, I wanted to be able to operate tools where it can access my email or look at my notes or all of those kinds of things.[00:19:11] And I don't think you need a very powerful model for that. Like that's one of the things where I feel like, yeah, vicuna running on my, on my laptop is probably powerful enough to drive a sort of personal research assistant, which can look things up for me and it can summarize things for my notes and it can do all of that and I don't care.[00:19:26] But it doesn't know about the Ukraine war because the Ukraine war training cutoff, that doesn't matter. If it's got those additional capabilities, which are quite easy to build the reason everyone's going crazy building agents and tools right now is that it's a few lines of Python code, and a sort of couple of paragraphs to get it to.[00:19:44] The Need for Special Purpose Optimized Models[00:19:44] Simon Willison: Well, let's, let's,[00:19:45] Travis Fischer: let's maybe dig in on that a little bit. And this, this also is, is very related to mojo. Cuz I, I do think there are use cases and domains where having the, the hyper optimized, like a version of these models running on device is, is very relevant where you can't necessarily make API calls out on the fly.[00:20:03] and Aug do context, augmented generation. And I was, I was talking with, with a a researcher. At Lockheed Martin yesterday, literally about like, like the, the version of this that's running of, of language models running on, on fighter jets. Right? And you, you talk about like the, the, the amount of engineering, precision and optimization that has to go into, to those type of models.[00:20:25] And the fact that, that you spend so much money, like, like training a super distilled ver version where milliseconds matter it's a life or death situation there. You know, and you couldn't even, even remotely ha ha have a use case there where you could like call out and, and have, have API calls or something.[00:20:40] So I, I do think there's like keeping in mind the, the use cases where, where. There, there'll be use cases that I'm more excited about at, at the application level where, where, yeah, I want to to just have it be super flexible and be able to call out to APIs and have this agentic type type thing.[00:20:56] And then there's also industries and, and use cases where, where you really need everything baked into the model.[00:21:01] swyx: Yep. Agreed. My, my favorite piece take on this is I think DPC four as a reasoning engine, which I think came from the from Nathan at every two. Which I think, yeah, I see the hundred score over there.[00:21:12] Modular - Mojo from Chris Lattner[00:21:12] swyx: Simon, do you do you have a, a few seconds on[00:21:14] Simon Willison: mojo. Sure. So Mojo is a brand new program language you just announced a few days ago. It's not actually available yet. I think there's an online demo, but to zooming it becomes an open source language we can use. It's got really some very interesting characteristics.[00:21:29] It's a super set of Python, so anything written in Python, Python will just work, but it adds additional features on top that let you basically do very highly optimized code with written. In Python syntax, it compiles down the the main thing that's exciting about it is the pedigree that it comes from.[00:21:47] It's a team led by Chris Latner, built L L V M and Clang, and then he designed Swift at Apple. So he's got like three, three for three on, on extraordinarily impactful high performance computing products. And he put together this team and they've basically, they're trying to go after the problem of how do you build.[00:22:06] A language which you can do really high performance optimized work in, but where you don't have to do everything again from scratch. And that's where building on top of Python is so clever. So I wasn't like, if this thing came along, I, I didn't really pay attention to it until j Jeremy Howard, who built Fast ai put up a very detailed blog post about why he was excited about Mojo, which included a, there's a video demo in there, which everyone should watch because in that video he takes Matrix multiplication implemented in Python.[00:22:34] And then he uses the mojo extras to 2000 x. The performance of that matrix multiplication, like he adds a few static types functions sort of struck instead of the class. And he gets 2000 times the performance out of it, which is phenomenal. Like absolutely extraordinary. So yeah, that, that got me really excited.[00:22:52] Like the idea that we can still use Python and all of this stuff we've got in Python, but we can. Just very slightly tweak some things and get literally like thousands times upwards performance out of the things that matter. That's really exciting.[00:23:07] swyx: Yeah, I, I, I'm curious, like, how come this wasn't thought of before?[00:23:11] It's not like the, the, the concept of a language super set hasn't hasn't, has, has isn't, is completely new. But all, as far as I know, all the previous Python interpreter approaches, like the alternate runtime approaches are like they, they, they're more, they're more sort of, Fit conforming to standard Python, but never really tried this additional approach of augmenting the language.[00:23:33] The Promise of Language Supersets[00:23:33] swyx: I, I'm wondering if you have many insights there on, like, why, like why is this a, a, a breakthrough?[00:23:38] Simon Willison: Yeah, that's a really interesting question. So, Jeremy Howard's piece talks about this thing called M L I R, which I hadn't heard of before, but this was another Chris Latner project. You know, he built L L VM as a low level virtual machine.[00:23:53] That you could build compilers on top of. And then M L I R was this one that he initially kicked off at Google, and I think it's part of TensorFlow and things like that. But it was very much optimized for multiple cores and GPU access and all of that kind of thing. And so my reading of Jeremy Howard's article is that they've basically built Mojo on top of M L I R.[00:24:13] So they had a huge, huge like a starting point where they'd, they, they knew this technology better than anyone else. And because they had this very, very robust high performance basis that they could build things on. I think maybe they're just the first people to try and build a high, try and combine a high level language with M L A R, with some extra things.[00:24:34] So it feels like they're basically taking a whole bunch of ideas people have been sort of experimenting with over the last decade and bundled them all together with exactly the right team, the right level of expertise. And it looks like they've got the thing to work. But yeah, I mean, I've, I've, I'm. Very intrigued to see, especially once this is actually available and we can start using it.[00:24:52] It, Jeremy Howard is someone I respect very deeply and he's, he's hyping this thing like crazy, right? His headline, his, and he's not the kind of person who hypes things if they're not worth hyping. He said Mojo may be the biggest programming language advanced in decades. And from anyone else, I'd kind of ignore that headline.[00:25:09] But from him it really means something.[00:25:11] swyx: Yes, because he doesn't hype things up randomly. Yeah, and, and, and he's a noted skeptic of Julia which is, which is also another data science hot topic. But from the TypeScript and web, web development worlds there has been a dialect of TypeScript that was specifically optimized to compile, to web assembly which I thought was like promising and then, and, and eventually never really took off.[00:25:33] But I, I like this approach because I think more. Frameworks should, should essentially be languages and recognize that they're language superset and maybe working compilers that that work on them. And then that is the, by the way, that's the direction that React is going right now. So fun times[00:25:50] Simon Willison: type scripts An interesting comparison actually, cuz type script is effectively a superset of Java script, right?[00:25:54] swyx: It's, but there's no, it's purely[00:25:57] Simon Willison: types, right? Gotcha. Right. So, so I guess mojo is the soup set python, but the emphasis is absolutely on tapping into the performance stuff. Right.[00:26:05] swyx: Well, the just things people actually care about.[00:26:08] Travis Fischer: Yeah. The, the one thing I've found is, is very similar to the early days of type script.[00:26:12] There was the, the, the, the most important thing was that it's incrementally adoptable. You know, cuz people had a script code basis and, and they wanted to incrementally like add. The, the, the main value prop for TypeScript was reliability and the, the, the, the static typing. And with Mojo, Lucia being basically anyone who's a target a large enterprise user of, of Mojo or even researchers, like they're all going to be coming from a, a hardcore.[00:26:36] Background in, in Python and, and have large existing libraries. And the the question will be for what use cases will mojo be like a, a, a really good fit for that incremental adoption where you can still tap into your, your, your massive, like python exi existing infrastructure workflows, data tooling, et cetera.[00:26:55] And, and what does, what does that path to adoption look like?[00:26:59] swyx: Yeah, we, we, we don't know cuz it's a wait listed language which people were complaining about. They, they, the, the mojo creators were like saying something about they had to scale up their servers. And I'm like, what language requires essential server?[00:27:10] So it's a little bit suss, a little bit, like there's a, there's a cloud product already in place and they're waiting for it. But we'll see. We'll see. I mean, emojis should be promising in it. I, I actually want more. Programming language innovation this way. You know, I was complaining years ago that programming language innovation is all about stronger types, all fun, all about like more functional, more strong types everywhere.[00:27:29] And, and this is, the first one is actually much more practical which I, which I really enjoy. This is why I wrote about self provisioning run types.[00:27:36] Simon Willison: And[00:27:37] Alessio Fanelli: I mean, this is kind of related to the post, right? Like if you stop all of a sudden we're like, the models are all the same and we can improve them.[00:27:45] Like, where can we get the improvements? You know, it's like, Better run times, better languages, better tooling, better data collection. Yeah. So if I were a founder today, I wouldn't worry as much about the model, maybe, but I would say, okay, what can I build into my product and like, or what can I do at the engineering level that maybe it's not model optimization because everybody's working on it, but like you said, it's like, why haven't people thought of this before?[00:28:09] It's like, it's, it's definitely super hard, but I'm sure that if you're like Google or you're like open AI or you're like, Databricks, we got smart enough people that can think about these problems, so hopefully we see more of this.[00:28:21] swyx: You need, Alan? Okay. I promise to keep this relatively tight. I know Simon on a beautiful day.[00:28:27] It is a very nice day in California. I wanted to go through a few more points that you have pulled out Simon and, and just give you the opportunity to, to rant and riff and, and what have you. I, I, are there any other points from going back to the sort of Google OpenAI mode documents that, that you felt like we, we should dive in on?[00:28:44] Google AI Strategy[00:28:44] Simon Willison: I mean, the really interesting stuff there is the strategy component, right? The this idea that that Facebook accidentally stumbled into leading this because they put out this model that everyone else is innovating on top of. And there's a very open question for me as to would Facebook relic Lama to allow for commercial usage?[00:29:03] swyx: Is there some rumor? Is that, is that today?[00:29:06] Simon Willison: Is there a rumor about that?[00:29:07] swyx: That would be interesting? Yeah, I saw, I saw something about Zuck saying that he would release the, the Lama weights officially.[00:29:13] Simon Willison: Oh my goodness. No, that I missed. That is, that's huge.[00:29:17] swyx: Let me confirm the tweet. Let me find the tweet and then, yeah.[00:29:19] Okay.[00:29:20] Simon Willison: Because actually I met somebody from Facebook machine learning research a couple of weeks ago, and I, I pressed 'em on this and they said, basically they don't think it'll ever happen because if it happens, and then somebody does horrible fascist stuff with this model, all of the headlines will be Meg releases a monster into the world.[00:29:36] So, so hi. His, the, the, the, a couple of weeks ago, his feeling was that it's just too risky for them to, to allow it to be used like that. But a couple of weeks is, is, is a couple of months in AI world. So yeah, it wouldn't be, it feels to me like strategically Facebook should be jumping right on this because this puts them at the very.[00:29:54] The very lead of, of open source innovation around this stuff.[00:29:58] Zuck Releasing LLaMA[00:29:58] swyx: So I've pinned the tweet talking about Zuck and Zuck saying that meta will open up Lama. It's from the founder of Obsidian, which gives it a slight bit more credibility, but it is the only. Tweet that I can find about it. So completely unsourced,[00:30:13] we shall see. I, I, I mean I have friends within meta, I should just go ask them. But yeah, I, I mean one interesting angle on, on the memo actually is is that and, and they were linking to this in, in, in a doc, which is apparently like. Facebook got a bunch of people to do because they, they never released it for commercial use, but a lot of people went ahead anyway and, and optimized and, and built extensions and stuff.[00:30:34] They, they got a bunch of free work out of opensource, which is an interesting strategy.[00:30:39] There's okay. I don't know if I.[00:30:42] Google Origin Confirmed[00:30:42] Simon Willison: I've got exciting piece of news. I've just heard from somebody with contacts at Google that they've heard people in Google confirm the leak. That that document wasn't even legit Google document, which I don't find surprising at all, but I'm now up to 10, outta 10 on, on whether that's, that's, that's real.[00:30:57] Google's existential threat[00:30:57] swyx: Excellent. Excellent. Yeah, it is fascinating. Yeah, I mean the, the strategy is, is, is really interesting. I think Google has been. Definitely sleeping on monetizing. You know, I, I, I heard someone call when Google Brain and Devrel I merged that they would, it was like goodbye to the Xerox Park of our era and it definitely feels like Google X and Google Brain would definitely Xerox parks of our, of our era, and I guess we all benefit from that.[00:31:21] Simon Willison: So, one thing I'll say about the, the Google side of things, like the there was a question earlier, why are Google so worried about this stuff? And I think it's, it's just all about the money. You know, the, the, the engine of money at Google is Google searching Google search ads, and who uses Chachi PT on a daily basis, like me, will have noticed that their usage of Google has dropped like a stone.[00:31:41] Because there are many, many questions that, that chat, e p t, which shows you no ads at all. Is, is, is a better source of information for than Google now. And so, yeah, I'm not, it doesn't surprise me that Google would see this as an existential threat because whether or not they can be Bard, it's actually, it's not great, but it, it exists, but it hasn't it yet either.[00:32:00] And if I've got a Chatbook chatbot that's not showing me ads and chatbot that is showing me ads, I'm gonna pick the one that's not showing[00:32:06] swyx: me ads. Yeah. Yeah. I, I agree. I did see a prototype of Bing with ads. Bing chat with ads. I haven't[00:32:13] Simon Willison: seen the prototype yet. No.[00:32:15] swyx: Yeah, yeah. Anyway, I I, it, it will come obviously, and then we will choose, we'll, we'll go out of our ways to avoid ads just like we always do.[00:32:22] We'll need ad blockers and chat.[00:32:23] Excellent.[00:32:24] Non-Fiction AI Safety ("y-risk")[00:32:24] Simon Willison: So I feel like on the safety side, the, the safety side, there are basically two areas of safety that I, I, I sort of split it into. There's the science fiction scenarios, the AI breaking out and killing all humans and creating viruses and all of that kind of thing. The sort of the terminated stuff. And then there's the the.[00:32:40] People doing bad things with ai and that's latter one is the one that I think is much more interesting and that cuz you could u like things like romance scams, right? Romance scams already take billions of dollars from, from vulner people every year. Those are very easy to automate using existing tools.[00:32:56] I'm pretty sure for QNA 13 b running on my laptop could spin up a pretty decent romance scam if I was evil and wanted to use it for them. So that's the kind of thing where, I get really nervous about it, like the fact that these models are out there and bad people can use these bad, do bad things.[00:33:13] Most importantly at scale, like romance scamming, you don't need a language model to pull off one romance scam, but if you wanna pull off a thousand at once, the language model might be the, the thing that that helps you scale to that point. And yeah, in terms of the science fiction stuff and also like a model on my laptop that can.[00:33:28] Guess what comes next in a sentence. I'm not worried that that's going to break out of my laptop and destroy the world. There. There's, I'm get slightly nervous about the huge number of people who are trying to build agis on top of this models, the baby AGI stuff and so forth, but I don't think they're gonna get anywhere.[00:33:43] I feel like if you actually wanted a model that was, was a threat to human, a language model would be a tiny corner of what that thing. Was actually built on top of, you'd need goal setting and all sorts of other bits and pieces. So yeah, for the moment, the science fiction stuff doesn't really interest me, although it is a little bit alarming seeing more and more of the very senior figures in this industry sort of tip the hat, say we're getting a little bit nervous about this stuff now.[00:34:08] Yeah.[00:34:09] swyx: So that would be Jeff Iton and and I, I saw this me this morning that Jan Lacoon was like happily saying, this is fine. Being the third cheer award winner.[00:34:20] Simon Willison: But you'll see a lot of the AI safe, the people who've been talking about AI safety for the longest are getting really angry about science fiction scenarios cuz they're like, no, the, the thing that we need to be talking about is the harm that you can cause with these models right now today, which is actually happening and the science fiction stuff kind of ends up distracting from that.[00:34:36] swyx: I love it. You, you. Okay. So, so Uher, I don't know how to pronounce his name. Elier has a list of ways that AI will kill us post, and I think, Simon, you could write a list of ways that AI will harm us, but not kill us, right? Like the, the, the non-science fiction actual harm ways, I think, right? I haven't seen a, a actual list of like, hey, romance scams spam.[00:34:57] I, I don't, I don't know what else, but. That could be very interesting as a Hmm. Okay. Practical. Practical like, here are the situations we need to guard against because they are more real today than that we need to. Think about Warren, about obviously you've been a big advocate of prompt injection awareness even though you can't really solve them, and I, I worked through a scenario with you, but Yeah,[00:35:17] Prompt Injection[00:35:17] Simon Willison: yeah.[00:35:17] Prompt injection is a whole other side of this, which is, I mean, that if you want a risk from ai, the risk right now is everyone who's building puts a building systems that attackers can trivially subvert into stealing all of their private data, unlocking their house, all of that kind of thing. So that's another very real risk that we have today.[00:35:35] swyx: I think in all our personal bios we should edit in prompt injections already, like in on my website, I wanna edit in a personal prompt injections so that if I get scraped, like I all know if someone's like reading from a script, right? That that is generated by any iBot. I've[00:35:49] Simon Willison: seen people do that on LinkedIn already and they get, they get recruiter emails saying, Hey, I didn't read your bio properly and I'm just an AI script, but would you like a job?[00:35:57] Yeah. It's fascinating.[00:36:00] Google vs OpenAI[00:36:00] swyx: Okay. Alright, so topic. I, I, I think, I think this this, this mote is is a peak under the curtain of the, the internal panic within Google. I think it is very val, very validated. I'm not so sure they should care so much about small models or, or like on device models.[00:36:17] But the other stuff is interesting. There is a comment at the end that you had by about as for opening open is themselves, open air, doesn't matter. So this is a Google document talking about Google's position in the market and what Google should be doing. But they had a comment here about open eye.[00:36:31] They also say open eye had no mode, which is a interesting and brave comment given that open eye is the leader in, in a lot of these[00:36:38] Simon Willison: innovations. Well, one thing I will say is that I think we might have identified who within Google wrote this document. Now there's a version of it floating around with a name.[00:36:48] And I look them up on LinkedIn. They're heavily involved in the AI corner of Google. So my guess is that at Google done this one, I've worked for companies. I'll put out a memo, I'll write up a Google doc and I'll email, email it around, and it's nowhere near the official position of the company or of the executive team.[00:37:04] It's somebody's opinion. And so I think it's more likely that this particular document is somebody who works for Google and has an opinion and distributed it internally and then it, and then it got leaked. I dunno if it's necessarily. Represents Google's sort of institutional thinking about this? I think it probably should.[00:37:19] Again, this is such a well-written document. It's so well argued that if I was an executive at Google and I read that, I would, I would be thinking pretty hard about it. But yeah, I don't think we should see it as, as sort of the official secret internal position of the company. Yeah. First[00:37:34] swyx: of all, I might promote that person.[00:37:35] Cuz he's clearly more,[00:37:36] Simon Willison: oh, definitely. He's, he's, he's really, this is a, it's, I, I would hire this person about the strength of that document.[00:37:42] swyx: But second of all, this is more about open eye. Like I'm not interested in Google's official statements about open, but I was interested like his assertion, open eye.[00:37:50] Doesn't have a mote. That's a bold statement. I don't know. It's got the best people.[00:37:55] Travis Fischer: Well, I, I would, I would say two things here. One, it's really interesting just at a meta, meta point that, that they even approached it this way of having this public leak. It, it, it kind of, Talks a little bit to the fact that they, they, they felt that that doing do internally, like wasn't going to get anywhere or, or maybe this speaks to, to some of the like, middle management type stuff or, or within Google.[00:38:18] And then to the, the, the, the point about like opening and not having a moat. I think for, for large language models, it, it, it will be over, over time kind of a race to the bottom just because the switching costs are, are, are so low compared with traditional cloud and sas. And yeah, there will be differences in, in, in quality, but, but like over time, if you, you look at the limit of these things like the, I I think Sam Altman has been quoted a few times saying that the, the, the price of marginal price of intelligence will go to zero.[00:38:47] Time and the marginal price of energy powering that intelligence will, will also hit over time. And in that world, if you're, you're providing large language models, they become commoditized. Like, yeah. What, what is, what is your mode at that point? I don't know. I think they're e extremely well positioned as a team and as a company for leading this space.[00:39:03] I'm not that, that worried about that, but it is something from a strategic point of view to keep in mind about large language models becoming a commodity. So[00:39:11] Simon Willison: it's quite short, so I think it's worth just reading the, in fact, that entire section, it says epilogue. What about open ai? All of this talk of open source can feel unfair given open AI's current closed policy.[00:39:21] Why do we have to share if they won't? That's talking about Google sharing, but the fact of the matter is we are already sharing everything with them. In the form of the steady flow of poached senior researchers until we spent that tide. Secrecy is a moot point. I love that. That's so salty. And, and in the end, open eye doesn't matter.[00:39:38] They are making the same mistakes that we are in their posture relative to open source. And their ability to maintain an edge is necessarily in question. Open source alternatives. Canned will eventually eclipse them. Unless they change their stance in this respect, at least we can make the first move. So the argument this, this paper is making is that Google should go, go like meta and, and just lean right into open sourcing it and engaging with the wider open source community much more deeply, which OpenAI have very much signaled they are not willing to do.[00:40:06] But yeah, it's it's, it's read the whole thing. The whole thing is full of little snippets like that. It's just super fun. Yes,[00:40:12] swyx: yes. Read the whole thing. I, I, I also appreciate that the timeline, because it set a lot of really great context for people who are out of the loop. So Yeah.[00:40:20] Alessio Fanelli: Yeah. And the final conspiracy theory is that right before Sundar and Satya and Sam went to the White House this morning, so.[00:40:29] swyx: Yeah. Did it happen? I haven't caught up the White House statements.[00:40:34] Alessio Fanelli: No. That I, I just saw, I just saw the photos of them going into the, the White House. I've been, I haven't seen any post-meeting updates.[00:40:41] swyx: I think it's a big win for philanthropic to be at that table.[00:40:44] Alessio Fanelli: Oh yeah, for sure. And co here it's not there.[00:40:46] I was like, hmm. Interesting. Well, anyway,[00:40:50] swyx: yeah. They need, they need some help. Okay. Well, I, I promise to keep this relatively tight. Spaces do tend to have a, have a tendency of dragging on. But before we go, anything that you all want to plug, anything that you're working on currently maybe go around Simon are you still working on dataset?[00:41:04] Personal plugs: Simon and Travis[00:41:04] Simon Willison: I am, I am, I'm having a bit of a, so datasets my open source project that I've been working on. It's about helping people analyze and publish data. I'm having an existential crisis of it at the moment because I've got access to the chat g p T code, interpreter mode, and you can upload the sequel light database to that and it will do all of the things that I, on my roadmap for the next 12 months.[00:41:24] Oh my God. So that's frustrating. So I'm basically, I'm leaning data. My interest in data and AI are, are rapidly crossing over a lot harder about the AI features that I need to build on top of dataset. Make sure it stays relevant in a chat. G p t can do most of the stuff that it does already. But yeah the thing, I'll plug my blog simon willis.net.[00:41:43] I'm now updating it daily with stuff because AI move moved so quickly and I have a sub newsletter, which is effectively my blog, but in email form sent out a couple of times a week, which Please subscribe to that or RSS feed on my blog or, or whatever because I'm, I'm trying to keep track of all sorts of things and I'm publishing a lot at the moment.[00:42:02] swyx: Yes. You, you are, and we love you very much for it because you, you are a very good reporter and technical deep diver into things, into all the things. Thank you, Simon. Travis are you ready to announce the, I guess you've announced it some somewhat. Yeah. Yeah.[00:42:14] Travis Fischer: So I'm I, I just founded a company.[00:42:16] I'm working on a framework for building reliable agents that aren't toys and focused on more constrained use cases. And you know, I I, I look at kind of agi. And these, these audigy type type projects as like jumping all the way to str to, to self-driving. And, and we, we, we kind of wanna, wanna start with some more enter and really focus on, on reliable primitives to, to start that.[00:42:38] And that'll be an open source type script project. I'll be releasing the first version of that soon. And that's, that's it. Follow me you know, on here for, for this type of stuff, I, I, I, everything, AI[00:42:48] swyx: and, and spa, his chat PT bot,[00:42:50] Travis Fischer: while you still can. Oh yeah, the chat VT Twitter bot is about 125,000 followers now.[00:42:55] It's still running. I, I'm not sure if it's your credit. Yeah. Can you say how much you spent actually, No, no. Well, I think probably totally like, like a thousand bucks or something, but I, it's, it's sponsored by OpenAI, so I haven't, I haven't actually spent any real money.[00:43:08] swyx: What? That's[00:43:09] awesome.[00:43:10] Travis Fischer: Yeah. Yeah.[00:43:11] Well, once, once I changed, originally the logo was the Chachi VUI logo and it was the green one, and then they, they hit me up and asked me to change it. So it's now it's a purple logo. And they're, they're, they're cool with that. Yeah.[00:43:21] swyx: Yeah. Sending take down notices to people with G B T stuff apparently now.[00:43:26] So it's, yeah, it's a little bit of a gray area. I wanna write more on, on mos. I've been actually collecting and meaning to write a piece of mos and today I saw the memo, I was like, oh, okay. Like I guess today's the day we talk about mos. So thank you all. Thanks. Thanks, Simon. Thanks Travis for, for jumping on and thanks to all the audience for engaging on this with us.[00:43:42] We'll continue to engage on Twitter, but thanks to everyone. Cool. Thanks everyone. Bye. Alright, thanks everyone. Bye. Get full access to Latent Space at www.latent.space/subscribe
Dan Sullivan and Jeffrey Madoff explore the importance of collaboration in both entertainment and business, using examples from stellar performers. Discover the dedication and skills required and how these can enhance collaboration and improvisation in the business world. In This Episode: Explore captivating 1930s and 1940s black and white film performances, featuring Eleanor Powell and Fred Astaire, and the importance of collaboration in entertainment and business.Discover the dedication and training required to master tap dancing, as demonstrated by Powell and Astaire's magical performance.Learn about the simplicity and effectiveness of old-style, one-take film-making and its relevance to modern-day collaboration in film and business.Delve into the significance of teamwork in achieving success, with examples from artists Bernini and Chihuly, and the role of skill development in collaboration.Understand the power of shared vision and talent pooling, using examples from the art world and the lesser-known but talented Nicholas Brothers duo.Discuss the physicality and precision of the Nicholas Brothers' performance in "Stormy Weather," and how performing arts skills can benefit entrepreneurs.Gain insights into the value of applying skills learned outside traditional business settings through a conversation on the parallels between performing arts and the business world.Explore the importance of teamwork, listening, and mastering every aspect of a production in both musicals and business.Recognize the significance of great listening, timing, and respect for the process in both music and business, while avoiding the temptation of prioritizing quick success over skill development.Reflect on the importance of continual growth in one's career, and enjoy remastered performances of famous artists from the past on YouTube, as recommended by Dan and Jeffrey. Resources: Eleanor Powell and Fred Astaire dance to “Begin the Beguine”The Nicholas Brothers with Cab CallowayJudy Garland and Mel Tormé singing “Clang, Clang, Clang Went the Trolley”
Comedians and dearest pals Tom Allen and Suzi Ruffell chat friendship, love, life and culture....sometimes.... Get in touch with all your problems or if you want to give your Like Minded Friend a shout out: hello@likemindedfriendspod.com We'll be out and in your ears wherever you get your podcasts every Wednesday morning, and if you like what you hear why not leave us a review on Apple Podcasts or wherever it is you listen... Thanks - Tom & Suzi xx A 'Keep It Light Media' Production Sales, advertising, and general enquiries: HELLO@KEEPITLIGHTMEDIA.COM Learn more about your ad choices. Visit podcastchoices.com/adchoices
Often referred to as "the AWS of crypto" Alchemy provides the infrastructure that helps developers build cutting edge web3 applications. Product manager, Rob Boyle shares the latest tools Alchemy is building, the web2 companies that are starting to build in web3, and what it's like building for Multichain in episode 24 of The Zeitgeist.Show Notes: 00:38 - Origin Story / Starting at Alchemy03:15 - What is Alchemy? 06:45 - What would Alchemy be analogous to in web2 07:36 - What kind of devs is Alchemy building for?10:21 - Patterns of Web2 companies coming to Web3 16:42 - Teaching new incoming devs 19:07 - Keeping up with a Multichain World23:12 - A builder he admires in the Web3 ecosystem Full Transcript:Brian Friel (00:06):Hey everyone and welcome to The Zeitgeist, the show where we highlight the founders, developers, and designers who are pushing the web 3.0 space forward. I'm Brian Friel, Developer of Relations at Phantom, and I'm super excited to introduce our guest, Rob Boyle, Product Lead at Alchemy. Rob, welcome to the show.Rob Boyle (00:22):Hey, Brian, great to be here. Thanks for having me on.Brian Friel (00:24):It's awesome to talk to you today. Phantom and Alchemy are doing a lot of great work together as it relates to our new multichain project. We got a lot to talk about today, but before we dive into all that I want to learn a little bit about you. Who are you and how did you start working at Alchemy?Rob Boyle (00:39):I've pretty much been in tech my whole life. I grew up in the Bay Area, went to school in the Bay Area, and then I graduated about 2006, degree in computer science, was fired up to work in Silicon Valley. A bunch of my friends went and worked for this startup called thefacebook.com, you might've heard of it, it's a bit of a thing now. I did not think that was going to be the most fun startup so I went and worked for another startup in Palo Alto. Turned out in retrospect to be perhaps not the best decision, a lot of those friends who went to Facebook are retired now quite comfortably, but I had a lot of fun, worked for a startup. I met my co-founder with whom I started two startups back to back both of which were a blast.(01:10):As with most startups, didn't really have rocketship success but we both did find our way over to Facebook a little bit later in life. But went over there and then had a blast at Facebook. Facebook is cool and that it's big enough that you can work on a ton of different problems and problem spaces within the company so I worked on integrity, finding and fighting badness, I worked on consumer products, and I also worked on Facebook developer platform. About two years ago at this point, I was looking to move on. I really liked working on developer platforms, I was looking for developer platforms in new and exciting spaces that were really innovative.(01:43):And I got a cold reach out from Alchemy and they had the fact that Jay-Z was an investor in the subject line. I was like "That's cool enough to respond." Started talking with them. It's so funny I'm like this is maybe bad for branding for myself but I'm the opposite of a lot of crypto folks where I loved Alchemy. I thought Joe and Nikil were brilliant, I thought the team was amazing, I loved the stage that they were at company-wise. And I was like "Guys, convince me that crypto is not a scam." That was the thing I wasn't on board with. I was new to it. And it took me a few months where I just went down the rabbit hole. I started talking to all my advisors and mentors, realized that a lot of them were already in crypto and web 3.0 and I was late to the game. And then over the course, I got completely convinced that this was the most interesting space to work in. And then two years ago I joined Alchemy and the rest is history.Brian Friel (02:29):That's awesome. You might be the first guest on this podcast that came into crypto because they thought the company was cool-Rob Boyle (02:34):I know.Brian Friel (02:35):And not that they got obsessed with the space and then found a company based on that interest. Hey, congratulations, you're the first so that's an award right there.Rob Boyle (02:43):I feel very lucky.Brian Friel (02:44):That's awesome. So you mentioned that you were working on developer platforms at Facebook, and then that transitions really well to what Alchemy has positioned themselves to be in the crypto space. I'd say for most developers who have been developing crypto some while, the term developer platform might mean something different or might not make total sense in the crypto space. Usually, people think, "I'm building my own DAP and I have this RPC provider, and all I really do is put the URL into my DAP and I go from there." What is Alchemy doing? What makes you guys say that you're a developer platform in web 3.0?Rob Boyle (03:16):I think there's three pieces to that. I think we have a very technical audience so they know what an RPC provider is. I'll give the overview just for anyone else who happens to be in the audience. The quick 30 seconds is, modern crypto apps have two components. They have the on-chain component, the smart contracts that run on-chain, which is the really new exciting thing about crypto, and then they have what is more traditional like a user interface like opensea.com where someone goes and interacts with the dapp. What you really need is a bridge between those. The opensea.com needs a way to read data off the chain, what NFTs are for sale, what NFTs do people own, interact with their smart contacts, and right to the chain, actually push transactions through the chain. Same thing is true for Phantom, to use you as an example. The wallet, it needs to understand people's activity, it needs to read and write from the chain. And we are the conduit that allows that.(04:01):What we do is provide that platform in which developers can read and write to the blockchain, we're your conduit there. For the more technical folks, that's traditionally you run your own node or you have access to a node, which is a JSON on our RPC provider which allows you to access that interface that you would to read and write from the chain. What I think makes us a developer platform and not just like a node is a couple things. The first one is scale. The way that these things evolved is you run a node and you're like "Oh, that's fine," and then you get more traffic and you're like "Oh, this node is hard it's not handling all traffic let me just spin up a second node and a third and a fourth, and then you're like "Now I need a load balancer, and now the nodes don't totally agree." As soon as you get past one node with a capacity a bunch of really hard problems emerge in managing that scale.(04:45):And so the first thing that we do is eliminate all those problems. We've spent a bunch of time on a bunch of proprietary infrastructure. We call this product super node where it gives you the interface that looks like a node interface but there's a bunch of complexity underneath the hood to account for inconsistency across nodes, and reorgs, and node upgrades, and all these different factors. We actually have an architecture diagram and the nodes are this tiny little box at the bottom, and the majority of it is all this infra that we've done to give you the feeling of a node. We've had customers who 10 or 100X their traffic overnight when they get super popular or explode or go viral, and our system just magically scales to handle that. So that's the first one is the scalability, reliability function.(05:23):The next one that I'd say is you can do a lot more if you're not just using the node interface. An analogy we like to use is building on the node interface is if all you have is a shovel, and a hand saw, and nails, you don't have a lot of leverage there versus we have a whole suite of enhanced APIs that are using power tools, and bulldozers, and modern construction equipment like our NFT API, or balanced API, transaction API. And all those things provide functionality that you would have to spend a lot of time, which we have spent, getting that functionality out of the node interface but we do it for you by building these APIs. So it really gives you a much more effective set of tools.(06:00):And then the third piece is our dashboard and our monitoring tools. In the new web 3.0 world, a lot of apps don't have the tools. The Web 2.0 world has 20, 30 years of services to help you understand the performance of your application, and how it's running, and which users are interacting with you. There's whole companies built around that. There's some startups doing this in the web 3.0 world but we provide a lot of value here where you can understand all the transactions flowing through your application, wh ich users are using it, where they are in the world if errors crop up what's causing those blah, blah. So that's really why we call ourselves a full-stack developer platform.Brian Friel (06:33):If we're translating this to the Web 2.0 world, what would you say are some similar analogies? Some company names that come to mind are maybe AWS, Datadog, Event Monitoring. How would you guys pitch that to a Web 2.0 company that's looking at web 3.0?Rob Boyle (06:46):AWS is the analogy that we draw the most to. AWS is, obviously, a massive company. You're like "Oh wow, your AWS is crypto." I almost think that undersells it a little bit because we do have aspects of Snowflake, we have aspects of Stripe even and how they help with transaction processing., We have a transaction product that helps you better to the chain. We have aspects of Datadog as you said, and Google Analytics to do some of that. The nice thing about web 3.0 is since it's a brand new industry we can take all those concepts that are really powerful in Web 2.0 and just go build them all for the web 3.0 world. AWS is the closest analogy but we wrap in parts of the other ones as well.Brian Friel (07:20):A one stop shop for everything you need too as well.Rob Boyle (07:23):Thank you, that's our pitch.Brian Friel (07:24):That's awesome. I want to talk a little bit about the devs that you guys are building for, right? So you painted this picture of you guys are everywhere across the developer stack. Who are these developers that you're seeing who are entering the crypto space?Rob Boyle (07:38):Every developer's our customer. And our ultimate goal is to have 1000X developers in the space. We think of this flywheel where when you have a new technology like blockchain, as soon as developers start building something in the space that attracts users. The more users are in the space the more developers turn their attention to it and they build more stuff which gets more users, which gets more developers, and you get this flywheel going. Where crypto sands is, development in crypto is really hard compared to Web 2.0. So the flywheel is running in sand or glue right now where it can't get spinning. And we identify it because the development is so hard it's hard to get this thing going, and some other difficulties as well. So we really just want to make development easier for literally anybody.(08:21):I'll talk about historic, our current customers. We have big enterprises so OpenSea runs on us, Phantom, to use you guys as an example as a luminary in the space, run on our platform. So we pride ourselves on having the scale and reliability to support the biggest names in the space. A lot of these companies were two people working on it on their own not that long ago so we really want to support new developers of the space. We see a lot of developers coming in who are new to crypto. They're coming from Web 2.0, they're coming from other spaces, and now they get excited about crypto. You can develop a really powerful app on our platform with a bunch of usage and not pay a cent because we have very generous free tier. And we also have education resources. We have Alchemy Ventures, a venture arm for some reason. I'll stop sharing all the stuff we do. I see a bunch of new people, I see traditional enterprises.(09:02):What I think is really interesting is increasingly we're seeing Web 2.0 enterprise companies enter the space. This is what excites me. With the most recent bear market, there's a lot of news around oh, a bunch of companies shutting down their web 3.0 efforts, and you saw potentially a pullback, but we're not really seeing that from our perspective as Alchemy. I see a lot of big traditional Web 2.0 enterprises who are still super bullish on experimenting in web 3.0. And have web 3.0 units they're building them out. We work with a ton of them. And they're way more crypto-native than you would think. I won't name the company but I talked to a very big old-school Web 2.0 company and they were complete Degen's who were very excited about a bunch of cool new blockchain technology and were very fluent and very excited about building it. So I think that's really excites me is seeing the Web 2.0 giants really continue to turn attention to the space.Brian Friel (09:51):That's super cool. And that's a bit of a narrative violation like you said. I noticed you guys just put out through Developer Report, you guys do this every year. Flipping through some of that, you guys have some pretty awesome numbers in there to show. I'm curious what you guys are seeing, especially as it relates to these Web 2.0 companies coming in? Are there any patterns that are emerging about why they're interested or what they potentially want to be building in the space? Or maybe even what chains they're going to first? Any patterns that you guys are seeing that maybe other people aren't aware of?Rob Boyle (10:21):That is a great question. The biggest pattern is that a lot of them are finding ways in which their existing business models and the value that they add to people can be augmented by web 3.0. I think that's the exciting part. If you look at Shopify, has rolled out some stuff around token-gated commerce. And that's an example where Shopify does, they build stores, and they realize oh, loyalty programs, some of these other solutions like exclusive access to key fans is a really exciting prospect, and this gets really empowered through web 3.0. So they're taking existing stuff they want to do, which is exclusive experiences for top customers or the ability for brands to give interesting experiences to people, and then using web 3.0 to go power that. So that's the pattern I see.(11:07):It's not like I see all these Web 2.0 companies being fired up about NFTs or DeFi or something, it's more them saying, "Oh, what's the core thing, the value I deliver as a business to people?" And then "Oh, there's all these ways that I can use web 3.0 to deliver more value or make that more exciting." And I think that's what actually makes crypto win. I don't know, some of these guys are like "Oh, I'll do yet another DeFi protocol." There's a lot of those. But it's when they start using web 3.0 to power the existing value chains that they already have.Brian Friel (11:35):It reminds me of what Reddit did with their ... They even called it digital collectibles instead of NFTs, but that was one of the most successful by the numbers successful version of that.Rob Boyle (11:44):They're a perfect example of this where they take something that they already do and they're like "Oh, this gets way better with crypto."Brian Friel (11:49):That's pretty cool. I think that's a world that I get excited about as well where it's a bit more getting mainstream people into it almost by accident or by chance. They see the value and they're like "Why wouldn't it be this way?" And it's a lot easier. Like you did, they can then go down the rabbit hole and learn everything else that's behind the curtain about crypto as well which is pretty cool.Rob Boyle (12:10):Our CEO Nikil has a great quote he likes to use that I love which is, "No one ever talks about using an internet application, that would be super awkward. They just talk about apps that add value." It's a little cliche at this point but the same is true of crypto. My mom is never going to talk about this cool blockchain app she used, she's going to talk about this app that allowed her to send money internationally with no fees or something. And then I can be like "Oh, that's built on crypto" and she'd be like "Oh, no way." I'm like that's how we're going to get a billion people under crypto. So the more that those types of apps start to develop the more exciting it is.Brian Friel (12:38):That's really cool. We also were chatting just before we hit record on this about another trend that you're seeing which is this migration off of the RPC interface. Back in the day, you're having to deal with these RPC providers directly and there's a lot of problems with scale that comes with that. I know you guys are doing a lot with abstracting stuff away. I also saw very recently you know guys put out Create Web3 dApp and just constantly making it easier and easier for them to build with. But can you talk a little bit about that trend that you guys are seeing as well?Rob Boyle (13:07):Completely. To another slightly labored analogy. I think working straight off JSON-RPC is like coding and assembly, or coding a web app without reactor any frameworks at all. It's pretty painful. The original people who came to crypto were super deep into crypto and that felt fine to them. They're like "Oh yeah I code an assembly all day, that's awesome. Just give me a JSON-RPC endpoint I'll go build something." But increasingly as new developers come to the space, especially Web 2.0 ones, they're like "Oh, show me the abstraction layers." Awesome. I understand that this exists, but in the Web 2.0 world I have all these frameworks and services that let me not have to think about a lot of the nuts and bolts and I can work at a higher level of abstraction. If you hit someone who's like "I have a great idea about how to use NFTs to do gated fan experiences." And we're like "That's great, but first spend two days learning how Eth Call works. It's really hard.(13:54):A bunch of what we try to do is meet developers where they are. Where they come in and they have cool ideas around NFTs and how to use them, and we're like "Awesome." We have a minting process, we have an allow list product called Spearmint, we have NFT APIs for getting data. And you can just think at the level of NFTs and not at the level of chain primitives, and that really accelerates how fast people can both learn and get up to speed and start building apps that deliver value. And I see a bunch of startups in this space too. There are a ton of startups. I'm not sure an advantage, but something that's really powerful that Alchemy has is our ventures arm Alchemy Ventures that's been super successful. People would be shocked at the number of people who run it, it's very small but they have investments in some of the most exciting companies in the space.(14:33):And through Alchemy Ventures we see a bunch of these new startups. And we've seen an explosion of startups building abstractions both in frameworks as well as services that allow people to think at these higher levels. And they're doing that on the read side, how people pull data off-chain, there's a bunch of really interesting data platform startups. As well as the right side. There's a bunch of really interesting startups helping people get data under the chain in a more abstract manner. Right now the majority of development is against the JSON-RPC interface which is more historical. In the future, I see the vast majority of development actually pointing at these higher level of abstractions. I think that's where most developers will be used to working in the future.Brian Friel (15:07):That's pretty cool. That's something that I think all devs here would be plenty happy to switch to. There's this meme on solana of chewing glass where the developers are all having to worry about not just finding product market fit with their own app but serializing and deserializing accounts. That context-switching is pretty brutal but it's a rite of passage. And then those devs who go through that process of chewing glass all feel very connected to one another and trial by fire experience.Rob Boyle (15:33):I think it's great for a sense of community for sure.Brian Friel (15:37):It really is. It's also helpful that you guys are turning glass into bubblegum here a little bit it makes it a lot easier.Rob Boyle (15:42):That's a great analogy. I'm going to give that to our solana marketing team they should use that. Glass into bubblegum.Brian Friel (15:49):There you go. TM, just add the credit to Zeitgeist podcast for that one.Rob Boyle (15:54):No worries. Hat tip all the way.Brian Friel (15:56):You mentioned that you guys have a couple different insights here. One, you're working with Web 2.0 companies directly. You have this ventures arm that you're making these investments in on these upcoming developer tools. You guys also have this Alchemy learn and Alchemy University platform as well though I think which captures the more long tail end of the developer. Do you think that the way we're going to be teaching developers, like these long tail devs, coming into web 3.0, do you think that's changing already? Are you guys incorporating that in your curriculum? Or is this still something that this abstraction layer is going to take some time and people still really need to learn Eth Call now or solana account to serialization? How do you guys think about that, trying to teach new incoming devs but also not abstracting too much of what makes crypto crypto?Rob Boyle (16:43):That is a great question. I think right now, as much as I get excited about these abstraction layers, none of them are seamless and you still need the fundamentals. If you take a modern CS class these days it's not like they start you with react to these frameworks, they start you with bare code, you're probably learning C or something. I think the same is true in web 3.0. I think our developer courses reflect this. We still teach people the basics. You really have to understand the basics. And I think the best combination is you get really fluent with the core fundamentals, the building blocks, the most basic layers. You really understand how the protocol works, you really understand how the accounts and onboarding works and signing, and then you get access to the more powerful tools. And then you're like "Oh cool, this saves me a bunch of time but I know how to use them effectively because I understand what they're doing under the hood." Our education classes reflect that. We still teach a bunch of basics, we introduce some of our abstractions.(17:37):I do think it would be a mistake if we had our education portal just telling you how to use Alchemy NFT API to build something or something. We launched a new project called Create Web3 DApp that gives you a bunch of template code. It basically helps you generate all the code depending on the app you're trying to create. The purpose of that is not necessarily so people can ignore that, it's so you can take all that code, digest it, understand what it's doing but not have to write it yourself and be like "Okay, cool." So you can get up and running fast but you still should definitely understand what it's doing. That's still important. I think it will be for the foreseeable future.Brian Friel (18:07):I tend to agree with that as well. That was my experience just learning the code as well. You're always looking for the equivalent of a crate react app but that's just because you want to get your ideas out in the world, but then pretty quickly you're going to need to actually learn what's going on here. I think that's the right approach for sure. So I want to switch gears a little bit and talk about how you guys are operating in a multichain world. Phantom and Alchemy came together because Phantom's expanding. In addition to solana are adding support for VM chains starting with Polygon and with Ethereum layer one. You guys, obviously, have your hands in a number of layer ones. Also, the world's moving to layer twos.(18:43):There's all this complexity that's happening in crypto and it feels like there's 1000 flowers blooming and everyone's trying to find what the right way to scale this is. I want to hear from you guys. How do you guys think about all these different L1 ecosystems? Do you believe in this multichain world? Do you focus most of your efforts in certain areas? How do you guys keep up with all these protocol changes? What's that from your guys' perspective?Rob Boyle (19:07):Totally. It's complex now. It's going to get worse or better depending on your perspective but I think it's going to get more complex from here. The most precious resource or scarcest resource in the world right now is block space. That is what it all comes down to. And especially if you look at onboarding billions of people onto blockchain, you're going to need so much more block space than we have right now. We don't see a clear path to that that doesn't involve a massively multichain world right now. There's some interesting scaling solutions, there's some L2s that are doing this, but if you just do the math on block space none of them are going to really meet the demand that we foresee for crypto without multichain.(19:46):And similarly, I think there's just a lot of different use cases where you can build a chain that's made for that. There are chains that are really good for gaming, there's ones that are really good for DeFi because of how the primitives are put together. So I absolutely think we're going to be in a multichain world. I saw an interesting quote from Optimism that I love. They're leaning into this really heavily. And they said, "Chains are the new smart contracts" where they want spinning up a chain to be as easy as deploying a smart contract is today. And they really think app chains, the first of which is base which, obviously, has gotten some good traction is the first of those.(20:17):We think about multichain as the defacto future. The question then is which chains do we support and how do we understand where we put that? A lot of that on our side is bandwidth constrained honestly. There's a bunch of chains I think are super exciting and we just have to be really disciplined around what chains does it make the most sense to support based on what we can provide the best support to. So that's one reason L2s have been straightforward for us but we've had to be more picky around L1s. We really pride ourselves in providing a really top-notch, incredibly reliable, incredibly scalable solution. And it is hard to do that if we're juggling multiple chains at once.(20:54):As you talked about, there are protocol releases all the time for these chains, a lot of them are moving targets. One of our requirements to work with a chain is a very, very deep relationship. So we have all of the protocol teams for all of these chains on Slack we talk to them constantly. We are pushing hundreds of fixes. I think at last count to a lot of these client teams we work really hand in hand with them. And it's hard to do that if we had 30, 40, 50 chains. So we form really deep relationships. So we have to be picky from a just how many chains we can support at the level of quality we want. But I absolutely think that we're going to have a ... Be in a multichain world.(21:29):In terms of looking into the crystal ball a little bit, I'm really bullish on the Eth ecosystem. I think using Eth as an L1 and then having a bunch of L2s and potentially thousands of app chains is a clear future. I think that's really interesting who's going to be a big player. But there's a lot of super interesting L1s. Like you said we support solana. I think they're doing really interesting stuff and have a very vibrant community. They have a future ahead of them.(21:51):And then the last piece that we're trying to think about is interoperability. There's people doing really cool projects in this space. That's going to have to be a core technology is actually being able to interact across chains. Because it's cool to say there's a bunch of chains, but if it's incredibly hard to move liquidity between them it's going to be very hard to use. People shouldn't actually even be thinking about chain necessarily if you're an end user, we have a long way to go for that to be seamless.Brian Friel (22:15):I would agree. We're noticing that already with Phantom's multichain betas. We're getting Eth people saying, "Hey, there's Clang source or a big mint happening on salona and I need to swap in but how do I do that? They want that simple one-click button interface. Like you guys are doing, there's a lot of lot of complexity to abstract the way behind the scenes there but it'll be a fun challenge in the year ahead.Rob Boyle (22:35):You guys are doing great work on this. That's what people expect, right? If you weren't so desensitized by being in crypto that you're used to all of this hoops you have to jump through you'd be like "Oh, I'm looking at my Phantom wallet, I have Eth, SOL's right there, I need more SOL. I should be able to move it over done and dusted. And instead it's a whole rigamarole. That's just not going to work.Brian Friel (22:55):We'll definitely get there. And we've already come a long way from the days of running the coin core full node on your desktop computer. Rob, this has been an awesome conversation. One question we ask all of our guests, and I want to hear this from you as well is, who is a builder that you admire in the web 3.0 ecosystem?Rob Boyle (23:13):At the risk of this sounding paid, I actually really love Phantom. I'll bring up someone else to just not to be totally a complete shill. Phantom is just upgrading the level of user experience to Web 2.0 standard. Phantom is obsessive about how good the UI is, the onboarding experience. And when I first used the Phantom wallet I was like "Oh, this feels like a Web 2.0 app, it doesn't feel like a web 3.0 app."(23:39):Web 3.0 apps, in general, is rough. It felt like a breath of fresh air. So I think you're doing a fantastic, fantastic job. I also also see it on the back end. As you know we work together on the RPC side. And if our latencies slip by a percent I get a message from a Phantom engineer right away being like "What's going on guys?" You guys are obsessive on the UI side and on the performance side. Anyway, so I really admire that. As much as it sometimes keeps me up at night because I know as soon as we have an issue I'm going to get a message the next day from you, I think you're really good.(24:08):I'll name a couple other ones. I think OpenSea is doing a really good job about also doing this multichain thing. I think they do a decent job about abstracting away chains. When you're browsing on there you can sometimes forget what chain something is on and they're trying to make that less of an issue at the front of people's minds. I think that's really the right direction to go. I really admire that. And then on the core tech. I know that's been a little controversial recently but LayerZero I think. We're going to need tech like that to be able to move across chains seamlessly so someone's got to be doing that type of stuff.Brian Friel (24:36):Those are all great suggestions. Appreciate the Phantom shout-out. That's definitely not required to be on this show.Rob Boyle (24:41):Of course. I'm an avid user so I can't not.Brian Friel (24:44):The feedback goes both ways so we obsess about our user's feedback too. If you have anything just let us know as well, we'll constantly try to improve on that. OpenSea and LayerZero too are great ones as well, I'll have to have those guys on the show. Rob, it's been awesome, thanks so much for coming on. Where can people go to learn more about Alchemy?Rob Boyle (25:01):Alchemy.com is pretty much your gateway to everything. It's your gateway to our university, learn more about education. That's your gateway to our products to get up and running and building. If you're sitting in a Web 2.0 company getting excited about building something, if your brand new to web 3.0, no matter what alchemy.com you can be directed. You'll find the resources you are looking for to learn more.Brian Friel (25:21):Love it. Rob Boyle, Product Lead at Alchemy, thanks so much for coming on.Rob Boyle (25:24):Brian, thank you so much this was super fun.
Cling and Clang go to the city above, leaving Athos and Figgs to meet a new partner and find the den of vampires near Undertide. Kenneth - GM JJ - Athos Lightfoot Tom - Figgs Cortplumo Hannah - Nymeria
Take Off That Mask, Clang Clang Clunk Went the Trolley and Where Were the Teachers?
Nick and Kyle recap the week in Heathcliff with Connor Clang, aka @RealHeathcliffs on Twitter! And what a week it was! It's Meat Robot Week! Plus, an exciting new HeathQuiz! Check out the images from the HeathQuiz here: https://imgur.com/gallery/Jcu0xkC Send us feedback on twitter @HeathcliffRecap or send us an email at HeathcliffRecap@gmail.com! Our theme song is Heathcliff's Meat Song by Louie Zong! Check him out at louiezong.com. Comics featured in the episode: February 9, 2023: https://www.gocomics.com/heathcliff/2023/02/09 February 10, 2023: https://www.gocomics.com/heathcliff/2023/02/10 February 11, 2023: https://www.gocomics.com/heathcliff/2023/02/11 February 13, 2023: https://www.gocomics.com/heathcliff/2023/02/13 February 14, 2023: https://www.gocomics.com/heathcliff/2023/02/14 February 15, 2023: https://www.gocomics.com/heathcliff/2023/02/15 February 16, 2023: https://www.gocomics.com/heathcliff/2023/02/16 July 6, 2022: https://www.gocomics.com/heathcliff/2022/07/06
Fletcher and Blaney's Notes, Quotes, and G.O.A.T.S – Magnesium Fire -Exec. Prods., nodebit, voidzero, Boolysteed, lavish – Carolyn and Fletcher go live after No Agenda with another musical show. We listen to music, read from the Regan-Era Encyclopedic Dictionary, listen to your voicemails and much more! Clang hoe ore wa baka NOTES Stand-Up Comedy By […]
Ayesha, you are under arrest for violating the curfew,” the words echo inside. “Tomorrow you are sentenced to the Ocean of Mistakes.” "God, please save me," I pray. I've been pleading all night in my prison cell. "I know I've failed. Please forgive me.".Sunrise peeps through the window. CLANG. The door opens. My jailers clamp chains around my hands. The rule book is clipped to my clothes. Silence..They drag me, throwing me into an open top truck as an example for all to see, a warning for others to avoid making the same mistake. Screaming and crying rings in my ears, and with horror I realize they are my own tortured screams. “I can't take any more,” I sob as the truck bounces along the uneven track to the sandy beach. “I tried to follow your rules and regulations. I can't be perfect.” Silence..The truck stops with a screech. I'm carried onto the speed boat. Mouth dry, bile burns inside of me until we reach the ebony part of the ocean. Hands grip mine. I taste salt and fear. Struggling, fighting to stay on the boat until weakness takes over. My body is thrown overboard. Icy water prickles my skin. Gasping, flailing, tears pour down my face, the chains and the rule book drag me down. The boat speeds off in the distance, a tiny speck. Here, then gone..“Ayesha, don't try to swim,” calls a voice. "Am I hallucinating?" I wonder, choking on the water. Suddenly the waves stop; a shining figure on the water glides toward me. “Stay calm.”.“Jesus?” I whisper. “Jesus?”.“Ayesha, I love you more than your mistakes,” He says, love in His eyes, His words breaking my chains. “I heard your prayers. I forgive you. This is a fresh start. Leave your old life behind. Follow me.” His words of love shred the pages of the rule book until it disappears. Reaching out to me, He grasps my hand. He leads me, and I follow Him to a fresh start and a new life. • Cindy Lee.• Have you ever felt like you were drowning in guilt—pulled down by the weight of your sin? It's never too late. God loves us deeply and sent His Son Jesus to die on the cross for sin and brokenness—and rise from the dead to renew His creation and His people. Through Jesus, God heals our brokenness. Consider taking a moment to bring any sin to Him and receive His forgiveness.....he freed us from the penalty for our sins. Romans 3:24 (NLT)
It's No More Whoppers, the holiday gourd of podcasts! This time around: a rousing start with all the bad things that happened; our pivot to video; mini Geneses and phantastic stars; revisiting Village; and basically a lot more video game talk because it's comforting and we need it right now! And the Scatman. This podcast needs to be extremely hardcore. ==MUSIC== Tomoya Otani / Seirou Okamoto - Heaven's Labyrinth (Magic Knight Rayearth) AIRCRAFT - Nekomusume
This week's episode looks at “All You Need is Love”, the Our World TV special, and the career of the Beatles from April 1966 through August 1967. Click the full post to read liner notes, links to more information, and a transcript of the episode. Patreon backers also have a thirteen-minute bonus episode available, on "Rain" by the Beatles. Tilt Araiza has assisted invaluably by doing a first-pass edit, and will hopefully be doing so from now on. Check out Tilt's irregular podcasts at http://www.podnose.com/jaffa-cakes-for-proust and http://sitcomclub.com/ NB for the first few hours this was up, there was a slight editing glitch. If you downloaded the old version and don't want to redownload the whole thing, just look in the transcript for "Other than fixing John's two flubbed" for the text of the two missing paragraphs. Errata I say "Come Together" was a B-side, but the single was actually a double A-side. Also, I say the Lennon interview by Maureen Cleave appeared in Detroit magazine. That's what my source (Steve Turner's book) says, but someone on Twitter says that rather than Detroit magazine it was the Detroit Free Press. Also at one point I say "the videos for 'Paperback Writer' and 'Penny Lane'". I meant to say "Rain" rather than "Penny Lane" there. Resources No Mixcloud this week due to the number of songs by the Beatles. I have read literally dozens of books on the Beatles, and used bits of information from many of them. All my Beatles episodes refer to: The Complete Beatles Chronicle by Mark Lewisohn, All The Songs: The Stories Behind Every Beatles Release by Jean-Michel Guesdon, And The Band Begins To Play: The Definitive Guide To The Songs of The Beatles by Steve Lambley, The Beatles By Ear by Kevin Moore, Revolution in the Head by Ian MacDonald, and The Beatles Anthology. For this episode, I also referred to Last Interview by David Sheff, a longform interview with John Lennon and Yoko Ono from shortly before Lennon's death; Many Years From Now by Barry Miles, an authorised biography of Paul McCartney; and Here, There, and Everywhere: My Life Recording the Music of the Beatles by Geoff Emerick and Howard Massey. Particularly useful this time was Steve Turner's book Beatles '66. I also used Turner's The Beatles: The Stories Behind the Songs 1967-1970. Johnny Rogan's Starmakers and Svengalis had some information on Epstein I hadn't seen anywhere else. Some information about the "Bigger than Jesus" scandal comes from Ward, B. (2012). “The ‘C' is for Christ”: Arthur Unger, Datebook Magazine and the Beatles. Popular Music and Society, 35(4), 541-560. https://doi.org/10.1080/03007766.2011.608978 Information on Robert Stigwood comes from Mr Showbiz by Stephen Dando-Collins. And the quote at the end from Simon Napier-Bell is from You Don't Have to Say You Love Me, which is more entertaining than it is accurate, but is very entertaining. Sadly the only way to get the single mix of "All You Need is Love" is on this ludicrously-expensive out-of-print box set, but the stereo mix is easily available on Magical Mystery Tour. Patreon This podcast is brought to you by the generosity of my backers on Patreon. Why not join them? Transcript A quick note before I start the episode -- this episode deals, in part, with the deaths of three gay men -- one by murder, one by suicide, and one by an accidental overdose, all linked at least in part to societal homophobia. I will try to deal with this as tactfully as I can, but anyone who's upset by those things might want to read the transcript instead of listening to the episode. This is also a very, very, *very* long episode -- this is likely to be the longest episode I *ever* do of this podcast, so settle in. We're going to be here a while. I obviously don't know how long it's going to be while I'm still recording, but based on the word count of my script, probably in the region of three hours. You have been warned. In 1967 the actor Patrick McGoohan was tired. He had been working on the hit series Danger Man for many years -- Danger Man had originally run from 1960 through 1962, then had taken a break, and had come back, retooled, with longer episodes in 1964. That longer series was a big hit, both in the UK and in the US, where it was retitled Secret Agent and had a new theme tune written by PF Sloan and Steve Barri and recorded by Johnny Rivers: [Excerpt: Johnny Rivers, "Secret Agent Man"] But McGoohan was tired of playing John Drake, the agent, and announced he was going to quit the series. Instead, with the help of George Markstein, Danger Man's script editor, he created a totally new series, in which McGoohan would star, and which McGoohan would also write and direct key episodes of. This new series, The Prisoner, featured a spy who is only ever given the name Number Six, and who many fans -- though not McGoohan himself -- took to be the same character as John Drake. Number Six resigns from his job as a secret agent, and is kidnapped and taken to a place known only as The Village -- the series was filmed in Portmeirion, an unusual-looking town in Gwynnedd, in North Wales -- which is full of other ex-agents. There he is interrogated to try to find out why he has quit his job. It's never made clear whether the interrogators are his old employers or their enemies, and there's a certain suggestion that maybe there is no real distinction between the two sides, that they're both running the Village together. He spends the entire series trying to escape, but refuses to explain himself -- and there's some debate among viewers as to whether it's implied or not that part of the reason he doesn't explain himself is that he knows his interrogators wouldn't understand why he quit: [Excerpt: The Prisoner intro, from episode Once Upon a Time, ] Certainly that explanation would fit in with McGoohan's own personality. According to McGoohan, the final episode of The Prisoner was, at the time, the most watched TV show ever broadcast in the UK, as people tuned in to find out the identity of Number One, the person behind the Village, and to see if Number Six would break free. I don't think that's actually the case, but it's what McGoohan always claimed, and it was certainly a very popular series. I won't spoil the ending for those of you who haven't watched it -- it's a remarkable series -- but ultimately the series seems to decide that such questions don't matter and that even asking them is missing the point. It's a work that's open to multiple interpretations, and is left deliberately ambiguous, but one of the messages many people have taken away from it is that not only are we trapped by a society that oppresses us, we're also trapped by our own identities. You can run from the trap that society has placed you in, from other people's interpretations of your life, your work, and your motives, but you ultimately can't run from yourself, and any time you try to break out of a prison, you'll find yourself trapped in another prison of your own making. The most horrifying implication of the episode is that possibly even death itself won't be a release, and you will spend all eternity trying to escape from an identity you're trapped in. Viewers became so outraged, according to McGoohan, that he had to go into hiding for an extended period, and while his later claims that he never worked in Britain again are an exaggeration, it is true that for the remainder of his life he concentrated on doing work in the US instead, where he hadn't created such anger. That final episode of The Prisoner was also the only one to use a piece of contemporary pop music, in two crucial scenes: [Excerpt: The Prisoner, "Fall Out", "All You Need is Love"] Back in October 2020, we started what I thought would be a year-long look at the period from late 1962 through early 1967, but which has turned out for reasons beyond my control to take more like twenty months, with a song which was one of the last of the big pre-Beatles pop hits, though we looked at it after their first single, "Telstar" by the Tornadoes: [Excerpt: The Tornadoes, "Telstar"] There were many reasons for choosing that as one of the bookends for this fifty-episode chunk of the podcast -- you'll see many connections between that episode and this one if you listen to them back-to-back -- but among them was that it's a song inspired by the launch of the first ever communications satellite, and a sign of how the world was going to become smaller as the sixties went on. Of course, to start with communications satellites didn't do much in that regard -- they were expensive to use, and had limited bandwidth, and were only available during limited time windows, but symbolically they meant that for the first time ever, people could see and hear events thousands of miles away as they were happening. It's not a coincidence that Britain and France signed the agreement to develop Concorde, the first supersonic airliner, a month after the first Beatles single and four months after the Telstar satellite was launched. The world was becoming ever more interconnected -- people were travelling faster and further, getting news from other countries quicker, and there was more cultural conversation – and misunderstanding – between countries thousands of miles apart. The Canadian media theorist Marshall McLuhan, the man who also coined the phrase “the medium is the message”, thought that this ever-faster connection would fundamentally change basic modes of thought in the Western world. McLuhan thought that technology made possible whole new modes of thought, and that just as the printing press had, in his view, caused Western liberalism and individualism, so these new electronic media would cause the rise of a new collective mode of thought. In 1962, the year of Concorde, Telstar, and “Love Me Do”, McLuhan wrote a book called The Gutenberg Galaxy, in which he said: “Instead of tending towards a vast Alexandrian library the world has become a computer, an electronic brain, exactly as an infantile piece of science fiction. And as our senses have gone outside us, Big Brother goes inside. So, unless aware of this dynamic, we shall at once move into a phase of panic terrors, exactly befitting a small world of tribal drums, total interdependence, and superimposed co-existence.… Terror is the normal state of any oral society, for in it everything affects everything all the time.…” He coined the term “the Global Village” to describe this new collectivism. The story we've seen over the last fifty episodes is one of a sort of cultural ping-pong between the USA and the UK, with innovations in American music inspiring British musicians, who in turn inspired American ones, whether that being the Beatles covering the Isley Brothers or the Rolling Stones doing a Bobby Womack song, or Paul Simon and Bob Dylan coming over to the UK and learning folk songs and guitar techniques from Martin Carthy. And increasingly we're going to see those influences spread to other countries, and influences coming *from* other countries. We've already seen one Jamaican artist, and the influence of Indian music has become very apparent. While the focus of this series is going to remain principally in the British Isles and North America, rock music was and is a worldwide phenomenon, and that's going to become increasingly a part of the story. And so in this episode we're going to look at a live performance -- well, mostly live -- that was seen by hundreds of millions of people all over the world as it happened, thanks to the magic of satellites: [Excerpt: The Beatles, "All You Need is Love"] When we left the Beatles, they had just finished recording "Tomorrow Never Knows", the most experimental track they had recorded up to that date, and if not the most experimental thing they *ever* recorded certainly in the top handful. But "Tomorrow Never Knows" was only the first track they recorded in the sessions for what would become arguably their greatest album, and certainly the one that currently has the most respect from critics. It's interesting to note that that album could have been very, very, different. When we think of Revolver now, we think of the innovative production of George Martin, and of Geoff Emerick and Ken Townshend's inventive ideas for pushing the sound of the equipment in Abbey Road studios, but until very late in the day the album was going to be recorded in the Stax studios in Memphis, with Steve Cropper producing -- whether George Martin would have been involved or not is something we don't even know. In 1965, the Rolling Stones had, as we've seen, started making records in the US, recording in LA and at the Chess studios in Chicago, and the Yardbirds had also been doing the same thing. Mick Jagger had become a convert to the idea of using American studios and working with American musicians, and he had constantly been telling Paul McCartney that the Beatles should do the same. Indeed, they'd put some feelers out in 1965 about the possibility of the group making an album with Holland, Dozier, and Holland in Detroit. Quite how this would have worked is hard to figure out -- Holland, Dozier, and Holland's skills were as songwriters, and in their work with a particular set of musicians -- so it's unsurprising that came to nothing. But recording at Stax was a different matter. While Steve Cropper was a great songwriter in his own right, he was also adept at getting great sounds on covers of other people's material -- like on Otis Blue, the album he produced for Otis Redding in late 1965, which doesn't include a single Cropper original: [Excerpt: Otis Redding, "Satisfaction"] And the Beatles were very influenced by the records Stax were putting out, often namechecking Wilson Pickett in particular, and during the Rubber Soul sessions they had recorded a "Green Onions" soundalike track, imaginatively titled "12-Bar Original": [Excerpt: The Beatles, "12-Bar Original"] The idea of the group recording at Stax got far enough that they were actually booked in for two weeks starting the ninth of April, and there was even an offer from Elvis to let them stay at Graceland while they recorded, but then a couple of weeks earlier, the news leaked to the press, and Brian Epstein cancelled the booking. According to Cropper, Epstein talked about recording at the Atlantic studios in New York with him instead, but nothing went any further. It's hard to imagine what a Stax-based Beatles album would have been like, but even though it might have been a great album, it certainly wouldn't have been the Revolver we've come to know. Revolver is an unusual album in many ways, and one of the ways it's most distinct from the earlier Beatles albums is the dominance of keyboards. Both Lennon and McCartney had often written at the piano as well as the guitar -- McCartney more so than Lennon, but both had done so regularly -- but up to this point it had been normal for them to arrange the songs for guitars rather than keyboards, no matter how they'd started out. There had been the odd track where one of them, usually Lennon, would play a simple keyboard part, songs like "I'm Down" or "We Can Work it Out", but even those had been guitar records first and foremost. But on Revolver, that changed dramatically. There seems to have been a complex web of cause and effect here. Paul was becoming increasingly interested in moving his basslines away from simple walking basslines and root notes and the other staples of rock and roll basslines up to this point. As the sixties progressed, rock basslines were becoming ever more complex, and Tyler Mahan Coe has made a good case that this is largely down to innovations in production pioneered by Owen Bradley, and McCartney was certainly aware of Bradley's work -- he was a fan of Brenda Lee, who Bradley produced, for example. But the two influences that McCartney has mentioned most often in this regard are the busy, jazz-influenced, basslines that James Jamerson was playing at Motown: [Excerpt: The Four Tops, "It's the Same Old Song"] And the basslines that Brian Wilson was writing for various Wrecking Crew bassists to play for the Beach Boys: [Excerpt: The Beach Boys, "Don't Talk (Put Your Head on My Shoulder)"] Just to be clear, McCartney didn't hear that particular track until partway through the recording of Revolver, when Bruce Johnston visited the UK and brought with him an advance copy of Pet Sounds, but Pet Sounds influenced the later part of Revolver's recording, and Wilson had already started his experiments in that direction with the group's 1965 work. It's much easier to write a song with this kind of bassline, one that's integral to the composition, on the piano than it is to write it on a guitar, as you can work out the bassline with your left hand while working out the chords and melody with your right, so the habit that McCartney had already developed of writing on the piano made this easier. But also, starting with the recording of "Paperback Writer", McCartney switched his style of working in the studio. Where up to this point it had been normal for him to play bass as part of the recording of the basic track, playing with the other Beatles, he now started to take advantage of multitracking to overdub his bass later, so he could spend extra time getting the bassline exactly right. McCartney lived closer to Abbey Road than the other three Beatles, and so could more easily get there early or stay late and tweak his parts. But if McCartney wasn't playing bass while the guitars and drums were being recorded, that meant he could play something else, and so increasingly he would play piano during the recording of the basic track. And that in turn would mean that there wouldn't always *be* a need for guitars on the track, because the harmonic support they would provide would be provided by the piano instead. This, as much as anything else, is the reason that Revolver sounds so radically different to any other Beatles album. Up to this point, with *very* rare exceptions like "Yesterday", every Beatles record, more or less, featured all four of the Beatles playing instruments. Now John and George weren't playing on "Good Day Sunshine" or "For No One", John wasn't playing on "Here, There, and Everywhere", "Eleanor Rigby" features no guitars or drums at all, and George's "Love You To" only features himself, plus a little tambourine from Ringo (Paul recorded a part for that one, but it doesn't seem to appear on the finished track). Of the three songwriting Beatles, the only one who at this point was consistently requiring the instrumental contributions of all the other band members was John, and even he did without Paul on "She Said, She Said", which by all accounts features either John or George on bass, after Paul had a rare bout of unprofessionalism and left the studio. Revolver is still an album made by a group -- and most of those tracks that don't feature John or George instrumentally still feature them vocally -- it's still a collaborative work in all the best ways. But it's no longer an album made by four people playing together in the same room at the same time. After starting work on "Tomorrow Never Knows", the next track they started work on was Paul's "Got to Get You Into My Life", but as it would turn out they would work on that song throughout most of the sessions for the album -- in a sign of how the group would increasingly work from this point on, Paul's song was subject to multiple re-recordings and tweakings in the studio, as he tinkered to try to make it perfect. The first recording to be completed for the album, though, was almost as much of a departure in its own way as "Tomorrow Never Knows" had been. George's song "Love You To" shows just how inspired he was by the music of Ravi Shankar, and how devoted he was to Indian music. While a few months earlier he had just about managed to pick out a simple melody on the sitar for "Norwegian Wood", by this point he was comfortable enough with Indian classical music that I've seen many, many sources claim that an outside session player is playing sitar on the track, though Anil Bhagwat, the tabla player on the track, always insisted that it was entirely Harrison's playing: [Excerpt: The Beatles, "Love You To"] There is a *lot* of debate as to whether it's George playing on the track, and I feel a little uncomfortable making a definitive statement in either direction. On the one hand I find it hard to believe that Harrison got that good that quickly on an unfamiliar instrument, when we know he wasn't a naturally facile musician. All the stories we have about his work in the studio suggest that he had to work very hard on his guitar solos, and that he would frequently fluff them. As a technical guitarist, Harrison was only mediocre -- his value lay in his inventiveness, not in technical ability -- and he had been playing guitar for over a decade, but sitar only a few months. There's also some session documentation suggesting that an unknown sitar player was hired. On the other hand there's the testimony of Anil Bhagwat that Harrison played the part himself, and he has been very firm on the subject, saying "If you go on the Internet there are a lot of questions asked about "Love You To". They say 'It's not George playing the sitar'. I can tell you here and now -- 100 percent it was George on sitar throughout. There were no other musicians involved. It was just me and him." And several people who are more knowledgeable than myself about the instrument have suggested that the sitar part on the track is played the way that a rock guitarist would play rather than the way someone with more knowledge of Indian classical music would play -- there's a blues feeling to some of the bends that apparently no genuine Indian classical musician would naturally do. I would suggest that the best explanation is that there's a professional sitar player trying to replicate a part that Harrison had previously demonstrated, while Harrison was in turn trying his best to replicate the sound of Ravi Shankar's work. Certainly the instrumental section sounds far more fluent, and far more stylistically correct, than one would expect: [Excerpt: The Beatles, "Love You To"] Where previous attempts at what got called "raga-rock" had taken a couple of surface features of Indian music -- some form of a drone, perhaps a modal scale -- and had generally used a guitar made to sound a little bit like a sitar, or had a sitar playing normal rock riffs, Harrison's song seems to be a genuine attempt to hybridise Indian ragas and rock music, combining the instrumentation, modes, and rhythmic complexity of someone like Ravi Shankar with lyrics that are seemingly inspired by Bob Dylan and a fairly conventional pop song structure (and a tiny bit of fuzz guitar). It's a record that could only be made by someone who properly understood both the Indian music he's emulating and the conventions of the Western pop song, and understood how those conventions could work together. Indeed, one thing I've rarely seen pointed out is how cleverly the album is sequenced, so that "Love You To" is followed by possibly the most conventional song on Revolver, "Here, There, and Everywhere", which was recorded towards the end of the sessions. Both songs share a distinctive feature not shared by the rest of the album, so the two songs can sound more of a pair than they otherwise would, retrospectively making "Love You To" seem more conventional than it is and "Here, There, and Everywhere" more unconventional -- both have as an introduction a separate piece of music that states some of the melodic themes of the rest of the song but isn't repeated later. In the case of "Love You To" it's the free-tempo bit at the beginning, characteristic of a lot of Indian music: [Excerpt: The Beatles, "Love You To"] While in the case of "Here, There, and Everywhere" it's the part that mimics an older style of songwriting, a separate intro of the type that would have been called a verse when written by the Gershwins or Cole Porter, but of course in the intervening decades "verse" had come to mean something else, so we now no longer have a specific term for this kind of intro -- but as you can hear, it's doing very much the same thing as that "Love You To" intro: [Excerpt: The Beatles, "Here, There, and Everywhere"] In the same day as the group completed "Love You To", overdubbing George's vocal and Ringo's tambourine, they also started work on a song that would show off a lot of the new techniques they had been working on in very different ways. Paul's "Paperback Writer" could indeed be seen as part of a loose trilogy with "Love You To" and "Tomorrow Never Knows", one song by each of the group's three songwriters exploring the idea of a song that's almost all on one chord. Both "Tomorrow Never Knows" and "Love You To" are based on a drone with occasional hints towards moving to one other chord. In the case of "Paperback Writer", the entire song stays on a single chord until the title -- it's on a G7 throughout until the first use of the word "writer", when it quickly goes to a C for two bars. I'm afraid I'm going to have to sing to show you how little the chords actually change, because the riff disguises this lack of movement somewhat, but the melody is also far more horizontal than most of McCartney's, so this shouldn't sound too painful, I hope: [demonstrates] This is essentially the exact same thing that both "Love You To" and "Tomorrow Never Knows" do, and all three have very similarly structured rising and falling modal melodies. There's also a bit of "Paperback Writer" that seems to tie directly into "Love You To", but also points to a possible very non-Indian inspiration for part of "Love You To". The Beach Boys' single "Sloop John B" was released in the UK a couple of days after the sessions for "Paperback Writer" and "Love You To", but it had been released in the US a month before, and the Beatles all got copies of every record in the American top thirty shipped to them. McCartney and Harrison have specifically pointed to it as an influence on "Paperback Writer". "Sloop John B" has a section where all the instruments drop out and we're left with just the group's vocal harmonies: [Excerpt: The Beach Boys, "Sloop John B"] And that seems to have been the inspiration behind the similar moment at a similar point in "Paperback Writer", which is used in place of a middle eight and also used for the song's intro: [Excerpt: The Beatles, "Paperback Writer"] Which is very close to what Harrison does at the end of each verse of "Love You To", where the instruments drop out for him to sing a long melismatic syllable before coming back in: [Excerpt: The Beatles, "Love You To"] Essentially, other than "Got to Get You Into My Life", which is an outlier and should not be counted, the first three songs attempted during the Revolver sessions are variations on a common theme, and it's a sign that no matter how different the results might sound, the Beatles really were very much a group at this point, and were sharing ideas among themselves and developing those ideas in similar ways. "Paperback Writer" disguises what it's doing somewhat by having such a strong riff. Lennon referred to "Paperback Writer" as "son of 'Day Tripper'", and in terms of the Beatles' singles it's actually their third iteration of this riff idea, which they originally got from Bobby Parker's "Watch Your Step": [Excerpt: Bobby Parker, "Watch Your Step"] Which became the inspiration for "I Feel Fine": [Excerpt: The Beatles, "I Feel Fine"] Which they varied for "Day Tripper": [Excerpt: The Beatles, "Day Tripper"] And which then in turn got varied for "Paperback Writer": [Excerpt: The Beatles, "Paperback Writer"] As well as compositional ideas, there are sonic ideas shared between "Paperback Writer", "Tomorrow Never Knows", and "Love You To", and which would be shared by the rest of the tracks the Beatles recorded in the first half of 1966. Since Geoff Emerick had become the group's principal engineer, they'd started paying more attention to how to get a fuller sound, and so Emerick had miced the tabla on "Love You To" much more closely than anyone would normally mic an instrument from classical music, creating a deep, thudding sound, and similarly he had changed the way they recorded the drums on "Tomorrow Never Knows", again giving a much fuller sound. But the group also wanted the kind of big bass sounds they'd loved on records coming out of America -- sounds that no British studio was getting, largely because it was believed that if you cut too loud a bass sound into a record it would make the needle jump out of the groove. The new engineering team of Geoff Emerick and Ken Scott, though, thought that it was likely you could keep the needle in the groove if you had a smoother frequency response. You could do that if you used a microphone with a larger diaphragm to record the bass, but how could you do that? Inspiration finally struck -- loudspeakers are actually the same thing as microphones wired the other way round, so if you wired up a loudspeaker as if it were a microphone you could get a *really big* speaker, place it in front of the bass amp, and get a much stronger bass sound. The experiment wasn't a total success -- the sound they got had to be processed quite extensively to get rid of room noise, and then compressed in order to further prevent the needle-jumping issue, and so it's a muddier, less defined, tone than they would have liked, but one thing that can't be denied is that "Paperback Writer"'s bass sound is much, much, louder than on any previous Beatles record: [Excerpt: The Beatles, "Paperback Writer"] Almost every track the group recorded during the Revolver sessions involved all sorts of studio innovations, though rarely anything as truly revolutionary as the artificial double-tracking they'd used on "Tomorrow Never Knows", and which also appeared on "Paperback Writer" -- indeed, as "Paperback Writer" was released several months before Revolver, it became the first record released to use the technique. I could easily devote a good ten minutes to every track on Revolver, and to "Paperback Writer"s B-side, "Rain", but this is already shaping up to be an extraordinarily long episode and there's a lot of material to get through, so I'll break my usual pattern of devoting a Patreon bonus episode to something relatively obscure, and this week's bonus will be on "Rain" itself. "Paperback Writer", though, deserved the attention here even though it was not one of the group's more successful singles -- it did go to number one, but it didn't hit number one in the UK charts straight away, being kept off the top by "Strangers in the Night" by Frank Sinatra for the first week: [Excerpt: Frank Sinatra, "Strangers in the Night"] Coincidentally, "Strangers in the Night" was co-written by Bert Kaempfert, the German musician who had produced the group's very first recording sessions with Tony Sheridan back in 1961. On the group's German tour in 1966 they met up with Kaempfert again, and John greeted him by singing the first couple of lines of the Sinatra record. The single was the lowest-selling Beatles single in the UK since "Love Me Do". In the US it only made number one for two non-consecutive weeks, with "Strangers in the Night" knocking it off for a week in between. Now, by literally any other band's standards, that's still a massive hit, and it was the Beatles' tenth UK number one in a row (or ninth, depending on which chart you use for "Please Please Me"), but it's a sign that the group were moving out of the first phase of total unequivocal dominance of the charts. It was a turning point in a lot of other ways as well. Up to this point, while the group had been experimenting with different lyrical subjects on album tracks, every single had lyrics about romantic relationships -- with the possible exception of "Help!", which was about Lennon's emotional state but written in such a way that it could be heard as a plea to a lover. But in the case of "Paperback Writer", McCartney was inspired by his Aunt Mill asking him "Why do you write songs about love all the time? Can you ever write about a horse or the summit conference or something interesting?" His response was to think "All right, Aunt Mill, I'll show you", and to come up with a lyric that was very much in the style of the social satires that bands like the Kinks were releasing at the time. People often miss the humour in the lyric for "Paperback Writer", but there's a huge amount of comedy in lyrics about someone writing to a publisher saying they'd written a book based on someone else's book, and one can only imagine the feeling of weary recognition in slush-pile readers throughout the world as they heard the enthusiastic "It's a thousand pages, give or take a few, I'll be writing more in a week or two. I can make it longer..." From this point on, the group wouldn't release a single that was unambiguously about a romantic relationship until "The Ballad of John and Yoko", the last single released while the band were still together. "Paperback Writer" also saw the Beatles for the first time making a promotional film -- what we would now call a rock video -- rather than make personal appearances on TV shows. The film was directed by Michael Lindsay-Hogg, who the group would work with again in 1969, and shows Paul with a chipped front tooth -- he'd been in an accident while riding mopeds with his friend Tara Browne a few months earlier, and hadn't yet got round to having the tooth capped. When he did, the change in his teeth was one of the many bits of evidence used by conspiracy theorists to prove that the real Paul McCartney was dead and replaced by a lookalike. It also marks a change in who the most prominent Beatle on the group's A-sides was. Up to this point, Paul had had one solo lead on an A-side -- "Can't Buy Me Love" -- and everything else had been either a song with multiple vocalists like "Day Tripper" or "Love Me Do", or a song with a clear John lead like "Ticket to Ride" or "I Feel Fine". In the rest of their career, counting "Paperback Writer", the group would release nine new singles that hadn't already been included on an album. Of those nine singles, one was a double A-side with one John song and one Paul song, two had John songs on the A-side, and the other six were Paul. Where up to this point John had been "lead Beatle", for the rest of the sixties, Paul would be the group's driving force. Oddly, Paul got rather defensive about the record when asked about it in interviews after it failed to go straight to the top, saying "It's not our best single by any means, but we're very satisfied with it". But especially in its original mono mix it actually packs a powerful punch: [Excerpt: The Beatles, "Paperback Writer"] When the "Paperback Writer" single was released, an unusual image was used in the advertising -- a photo of the Beatles dressed in butchers' smocks, covered in blood, with chunks of meat and the dismembered body parts of baby dolls lying around on them. The image was meant as part of a triptych parodying religious art -- the photo on the left was to be an image showing the four Beatles connected to a woman by an umbilical cord made of sausages, the middle panel was meant to be this image, but with halos added over the Beatles' heads, and the panel on the right was George hammering a nail into John's head, symbolising both crucifixion and that the group were real, physical, people, not just images to be worshipped -- these weren't imaginary nails, and they weren't imaginary people. The photographer Robert Whittaker later said: “I did a photograph of the Beatles covered in raw meat, dolls and false teeth. Putting meat, dolls and false teeth with The Beatles is essentially part of the same thing, the breakdown of what is regarded as normal. The actual conception for what I still call “Somnambulant Adventure” was Moses coming down from Mount Sinai with the Ten Commandments. He comes across people worshipping a golden calf. All over the world I'd watched people worshiping like idols, like gods, four Beatles. To me they were just stock standard normal people. But this emotion that fans poured on them made me wonder where Christianity was heading.” The image wasn't that controversial in the UK, when it was used to advertise "Paperback Writer", but in the US it was initially used for the cover of an album, Yesterday... And Today, which was made up of a few tracks that had been left off the US versions of the Rubber Soul and Help! albums, plus both sides of the "We Can Work It Out"/"Day Tripper" single, and three rough mixes of songs that had been recorded for Revolver -- "Doctor Robert", "And Your Bird Can Sing", and "I'm Only Sleeping", which was the song that sounded most different from the mixes that were finally released: [Excerpt: The Beatles, "I'm Only Sleeping (Yesterday... and Today mix)"] Those three songs were all Lennon songs, which had the unfortunate effect that when the US version of Revolver was brought out later in the year, only two of the songs on the album were by Lennon, with six by McCartney and three by Harrison. Some have suggested that this was the motivation for the use of the butcher image on the cover of Yesterday... And Today -- saying it was the Beatles' protest against Capitol "butchering" their albums -- but in truth it was just that Capitol's art director chose the cover because he liked the image. Alan Livingston, the president of Capitol was not so sure, and called Brian Epstein to ask if the group would be OK with them using a different image. Epstein checked with John Lennon, but Lennon liked the image and so Epstein told Livingston the group insisted on them using that cover. Even though for the album cover the bloodstains on the butchers' smocks were airbrushed out, after Capitol had pressed up a million copies of the mono version of the album and two hundred thousand copies of the stereo version, and they'd sent out sixty thousand promo copies, they discovered that no record shops would stock the album with that cover. It cost Capitol more than two hundred thousand dollars to recall the album and replace the cover with a new one -- though while many of the covers were destroyed, others had the new cover, with a more acceptable photo of the group, pasted over them, and people have later carefully steamed off the sticker to reveal the original. This would not be the last time in 1966 that something that was intended as a statement on religion and the way people viewed the Beatles would cause the group trouble in America. In the middle of the recording sessions for Revolver, the group also made what turned out to be their last ever UK live performance in front of a paying audience. The group had played the NME Poll-Winners' Party every year since 1963, and they were always shows that featured all the biggest acts in the country at the time -- the 1966 show featured, as well as the Beatles and a bunch of smaller acts, the Rolling Stones, the Who, the Yardbirds, Roy Orbison, Cliff Richard and the Shadows, the Seekers, the Small Faces, the Walker Brothers, and Dusty Springfield. Unfortunately, while these events were always filmed for TV broadcast, the Beatles' performance on the first of May wasn't filmed. There are various stories about what happened, but the crux appears to be a disagreement between Andrew Oldham and Brian Epstein, sparked by John Lennon. When the Beatles got to the show, they were upset to discover that they had to wait around before going on stage -- normally, the awards would all be presented at the end, after all the performances, but the Rolling Stones had asked that the Beatles not follow them directly, so after the Stones finished their set, there would be a break for the awards to be given out, and then the Beatles would play their set, in front of an audience that had been bored by twenty-five minutes of awards ceremony, rather than one that had been excited by all the bands that came before them. John Lennon was annoyed, and insisted that the Beatles were going to go on straight after the Rolling Stones -- he seems to have taken this as some sort of power play by the Stones and to have got his hackles up about it. He told Epstein to deal with the people from the NME. But the NME people said that they had a contract with Andrew Oldham, and they weren't going to break it. Oldham refused to change the terms of the contract. Lennon said that he wasn't going to go on stage if they didn't directly follow the Stones. Maurice Kinn, the publisher of the NME, told Epstein that he wasn't going to break the contract with Oldham, and that if the Beatles didn't appear on stage, he would get Jimmy Savile, who was compering the show, to go out on stage and tell the ten thousand fans in the audience that the Beatles were backstage refusing to appear. He would then sue NEMS for breach of contract *and* NEMS would be liable for any damage caused by the rioting that was sure to happen. Lennon screamed a lot of abuse at Kinn, and told him the group would never play one of their events again, but the group did go on stage -- but because they hadn't yet signed the agreement to allow their performance to be filmed, they refused to allow it to be recorded. Apparently Andrew Oldham took all this as a sign that Epstein was starting to lose control of the group. Also during May 1966 there were visits from musicians from other countries, continuing the cultural exchange that was increasingly influencing the Beatles' art. Bruce Johnston of the Beach Boys came over to promote the group's new LP, Pet Sounds, which had been largely the work of Brian Wilson, who had retired from touring to concentrate on working in the studio. Johnston played the record for John and Paul, who listened to it twice, all the way through, in silence, in Johnston's hotel room: [Excerpt: The Beach Boys, "God Only Knows"] According to Johnston, after they'd listened through the album twice, they went over to a piano and started whispering to each other, picking out chords. Certainly the influence of Pet Sounds is very noticeable on songs like "Here, There, and Everywhere", written and recorded a few weeks after this meeting: [Excerpt: The Beatles, "Here, There, and Everywhere"] That track, and the last track recorded for the album, "She Said She Said" were unusual in one very important respect -- they were recorded while the Beatles were no longer under contract to EMI Records. Their contract expired on the fifth of June, 1966, and they finished Revolver without it having been renewed -- it would be several months before their new contract was signed, and it's rather lucky for music lovers that Brian Epstein was the kind of manager who considered personal relationships and basic honour and decency more important than the legal niceties, unlike any other managers of the era, otherwise we would not have Revolver in the form we know it today. After the meeting with Johnston, but before the recording of those last couple of Revolver tracks, the Beatles also met up again with Bob Dylan, who was on a UK tour with a new, loud, band he was working with called The Hawks. While the Beatles and Dylan all admired each other, there was by this point a lot of wariness on both sides, especially between Lennon and Dylan, both of them very similar personality types and neither wanting to let their guard down around the other or appear unhip. There's a famous half-hour-long film sequence of Lennon and Dylan sharing a taxi, which is a fascinating, excruciating, example of two insecure but arrogant men both trying desperately to impress the other but also equally desperate not to let the other know that they want to impress them: [Excerpt: Dylan and Lennon taxi ride] The day that was filmed, Lennon and Harrison also went to see Dylan play at the Royal Albert Hall. This tour had been controversial, because Dylan's band were loud and raucous, and Dylan's fans in the UK still thought of him as a folk musician. At one gig, earlier on the tour, an audience member had famously yelled out "Judas!" -- (just on the tiny chance that any of my listeners don't know that, Judas was the disciple who betrayed Jesus to the authorities, leading to his crucifixion) -- and that show was for many years bootlegged as the "Royal Albert Hall" show, though in fact it was recorded at the Free Trade Hall in Manchester. One of the *actual* Royal Albert Hall shows was released a few years ago -- the one the night before Lennon and Harrison saw Dylan: [Excerpt: Bob Dylan, "Like a Rolling Stone", Royal Albert Hall 1966] The show Lennon and Harrison saw would be Dylan's last for many years. Shortly after returning to the US, Dylan was in a motorbike accident, the details of which are still mysterious, and which some fans claim was faked altogether. The accident caused him to cancel all the concert dates he had booked, and devote himself to working in the studio for several years just like Brian Wilson. And from even further afield than America, Ravi Shankar came over to Britain, to work with his friend the violinist Yehudi Menuhin, on a duet album, West Meets East, that was an example in the classical world of the same kind of international cross-fertilisation that was happening in the pop world: [Excerpt: Yehudi Menuhin and Ravi Shankar, "Prabhati (based on Raga Gunkali)"] While he was in the UK, Shankar also performed at the Royal Festival Hall, and George Harrison went to the show. He'd seen Shankar live the year before, but this time he met up with him afterwards, and later said "He was the first person that impressed me in a way that was beyond just being a famous celebrity. Ravi was my link to the Vedic world. Ravi plugged me into the whole of reality. Elvis impressed me when I was a kid, and impressed me when I met him, but you couldn't later on go round to him and say 'Elvis, what's happening with the universe?'" After completing recording and mixing the as-yet-unnamed album, which had been by far the longest recording process of their career, and which still nearly sixty years later regularly tops polls of the best album of all time, the Beatles took a well-earned break. For a whole two days, at which point they flew off to Germany to do a three-day tour, on their way to Japan, where they were booked to play five shows at the Budokan. Unfortunately for the group, while they had no idea of this when they were booked to do the shows, many in Japan saw the Budokan as sacred ground, and they were the first ever Western group to play there. This led to numerous death threats and loud protests from far-right activists offended at the Beatles defiling their religious and nationalistic sensibilities. As a result, the police were on high alert -- so high that there were three thousand police in the audience for the shows, in a venue which only held ten thousand audience members. That's according to Mark Lewisohn's Complete Beatles Chronicle, though I have to say that the rather blurry footage of the audience in the video of those shows doesn't seem to show anything like those numbers. But frankly I'll take Lewisohn's word over that footage, as he's not someone to put out incorrect information. The threats to the group also meant that they had to be kept in their hotel rooms at all times except when actually performing, though they did make attempts to get out. At the press conference for the Tokyo shows, the group were also asked publicly for the first time their views on the war in Vietnam, and John replied "Well, we think about it every day, and we don't agree with it and we think that it's wrong. That's how much interest we take. That's all we can do about it... and say that we don't like it". I say they were asked publicly for the first time, because George had been asked about it for a series of interviews Maureen Cleave had done with the group a couple of months earlier, as we'll see in a bit, but nobody was paying attention to those interviews. Brian Epstein was upset that the question had gone to John. He had hoped that the inevitable Vietnam question would go to Paul, who he thought might be a bit more tactful. The last thing he needed was John Lennon saying something that would upset the Americans before their tour there a few weeks later. Luckily, people in America seemed to have better things to do than pay attention to John Lennon's opinions. The support acts for the Japanese shows included several of the biggest names in Japanese rock music -- or "group sounds" as the genre was called there, Japanese people having realised that trying to say the phrase "rock and roll" would open them up to ridicule given that it had both "r" and "l" sounds in the phrase. The man who had coined the term "group sounds", Jackey Yoshikawa, was there with his group the Blue Comets, as was Isao Bito, who did a rather good cover version of Cliff Richard's "Dynamite": [Excerpt: Isao Bito, "Dynamite"] Bito, the Blue Comets, and the other two support acts, Yuya Uchida and the Blue Jeans, all got together to perform a specially written song, "Welcome Beatles": [Excerpt: "Welcome Beatles" ] But while the Japanese audience were enthusiastic, they were much less vocal about their enthusiasm than the audiences the Beatles were used to playing for. The group were used, of course, to playing in front of hordes of screaming teenagers who could not hear a single note, but because of the fear that a far-right terrorist would assassinate one of the group members, the police had imposed very, very, strict rules on the audience. Nobody in the audience was allowed to get out of their seat for any reason, and the police would clamp down very firmly on anyone who was too demonstrative. Because of that, the group could actually hear themselves, and they sounded sloppy as hell, especially on the newer material. Not that there was much of that. The only song they did from the Revolver sessions was "Paperback Writer", the new single, and while they did do a couple of tracks from Rubber Soul, those were under-rehearsed. As John said at the start of this tour, "I can't play any of Rubber Soul, it's so unrehearsed. The only time I played any of the numbers on it was when I recorded it. I forget about songs. They're only valid for a certain time." That's certainly borne out by the sound of their performances of Rubber Soul material at the Budokan: [Excerpt: The Beatles, "If I Needed Someone (live at the Budokan)"] It was while they were in Japan as well that they finally came up with the title for their new album. They'd been thinking of all sorts of ideas, like Abracadabra and Magic Circle, and tossing names around with increasing desperation for several days -- at one point they seem to have just started riffing on other groups' albums, and seem to have apparently seriously thought about naming the record in parodic tribute to their favourite artists -- suggestions included The Beatles On Safari, after the Beach Boys' Surfin' Safari (and possibly with a nod to their recent Pet Sounds album cover with animals, too), The Freewheelin' Beatles, after Dylan's second album, and my favourite, Ringo's suggestion After Geography, for the Rolling Stones' Aftermath. But eventually Paul came up with Revolver -- like Rubber Soul, a pun, in this case because the record itself revolves when on a turntable. Then it was off to the Philippines, and if the group thought Japan had been stressful, they had no idea what was coming. The trouble started in the Philippines from the moment they stepped off the plane, when they were bundled into a car without Neil Aspinall or Brian Epstein, and without their luggage, which was sent to customs. This was a problem in itself -- the group had got used to essentially being treated like diplomats, and to having their baggage let through customs without being searched, and so they'd started freely carrying various illicit substances with them. This would obviously be a problem -- but as it turned out, this was just to get a "customs charge" paid by Brian Epstein. But during their initial press conference the group were worried, given the hostility they'd faced from officialdom, that they were going to be arrested during the conference itself. They were asked what they would tell the Rolling Stones, who were going to be visiting the Philippines shortly after, and Lennon just said "We'll warn them". They also asked "is there a war on in the Philippines? Why is everybody armed?" At this time, the Philippines had a new leader, Ferdinand Marcos -- who is not to be confused with his son, Ferdinand Marcos Jr, also known as Bongbong Marcos, who just became President-Elect there last month. Marcos Sr was a dictatorial kleptocrat, one of the worst leaders of the latter half of the twentieth century, but that wasn't evident yet. He'd been elected only a few months earlier, and had presented himself as a Kennedy-like figure -- a young man who was also a war hero. He'd recently switched parties from the Liberal party to the right-wing Nacionalista Party, but wasn't yet being thought of as the monstrous dictator he later became. The person organising the Philippines shows had been ordered to get the Beatles to visit Ferdinand and Imelda Marcos at 11AM on the day of the show, but for some reason had instead put on their itinerary just the *suggestion* that the group should meet the Marcoses, and had put the time down as 3PM, and the Beatles chose to ignore that suggestion -- they'd refused to do that kind of government-official meet-and-greet ever since an incident in 1964 at the British Embassy in Washington where someone had cut off a bit of Ringo's hair. A military escort turned up at the group's hotel in the morning, to take them for their meeting. The group were all still in their rooms, and Brian Epstein was still eating breakfast and refused to disturb them, saying "Go back and tell the generals we're not coming." The group gave their performances as scheduled, but meanwhile there was outrage at the way the Beatles had refused to meet the Marcos family, who had brought hundreds of children -- friends of their own children, and relatives of top officials -- to a party to meet the group. Brian Epstein went on TV and tried to smooth things over, but the broadcast was interrupted by static and his message didn't get through to anyone. The next day, the group's security was taken away, as were the cars to take them to the airport. When they got to the airport, the escalators were turned off and the group were beaten up at the arrangement of the airport manager, who said in 1984 "I beat up the Beatles. I really thumped them. First I socked Epstein and he went down... then I socked Lennon and Ringo in the face. I was kicking them. They were pleading like frightened chickens. That's what happens when you insult the First Lady." Even on the plane there were further problems -- Brian Epstein and the group's road manager Mal Evans were both made to get off the plane to sort out supposed financial discrepancies, which led to them worrying that they were going to be arrested or worse -- Evans told the group to tell his wife he loved her as he left the plane. But eventually, they were able to leave, and after a brief layover in India -- which Ringo later said was the first time he felt he'd been somewhere truly foreign, as opposed to places like Germany or the USA which felt basically like home -- they got back to England: [Excerpt: "Ordinary passenger!"] When asked what they were going to do next, George replied “We're going to have a couple of weeks to recuperate before we go and get beaten up by the Americans,” The story of the "we're bigger than Jesus" controversy is one of the most widely misreported events in the lives of the Beatles, which is saying a great deal. One book that I've encountered, and one book only, Steve Turner's Beatles '66, tells the story of what actually happened, and even that book seems to miss some emphases. I've pieced what follows together from Turner's book and from an academic journal article I found which has some more detail. As far as I can tell, every single other book on the Beatles released up to this point bases their account of the story on an inaccurate press statement put out by Brian Epstein, not on the truth. Here's the story as it's generally told. John Lennon gave an interview to his friend, Maureen Cleave of the Evening Standard, during which he made some comments about how it was depressing that Christianity was losing relevance in the eyes of the public, and that the Beatles are more popular than Jesus, speaking casually because he was talking to a friend. That story was run in the Evening Standard more-or-less unnoticed, but then an American teen magazine picked up on the line about the Beatles being bigger than Jesus, reprinted chunks of the interview out of context and without the Beatles' knowledge or permission, as a way to stir up controversy, and there was an outcry, with people burning Beatles records and death threats from the Ku Klux Klan. That's... not exactly what happened. The first thing that you need to understand to know what happened is that Datebook wasn't a typical teen magazine. It *looked* just like a typical teen magazine, certainly, and much of its content was the kind of thing that you would get in Tiger Beat or any of the other magazines aimed at teenage girls -- the September 1966 issue was full of articles like "Life with the Walker Brothers... by their Road Manager", and interviews with the Dave Clark Five -- but it also had a long history of publishing material that was intended to make its readers think about social issues of the time, particularly Civil Rights. Arthur Unger, the magazine's editor and publisher, was a gay man in an interracial relationship, and while the subject of homosexuality was too taboo in the late fifties and sixties for him to have his magazine cover that, he did regularly include articles decrying segregation and calling for the girls reading the magazine to do their part on a personal level to stamp out racism. Datebook had regularly contained articles like one from 1963 talking about how segregation wasn't just a problem in the South, saying "If we are so ‘integrated' why must men in my own city of Philadelphia, the city of Brotherly Love, picket city hall because they are discriminated against when it comes to getting a job? And how come I am still unable to take my dark- complexioned friends to the same roller skating rink or swimming pool that I attend?” One of the writers for the magazine later said “We were much more than an entertainment magazine . . . . We tried to get kids involved in social issues . . . . It was a well-received magazine, recommended by libraries and schools, but during the Civil Rights period we did get pulled off a lot of stands in the South because of our views on integration” Art Unger, the editor and publisher, wasn't the only one pushing this liberal, integrationist, agenda. The managing editor at the time, Danny Fields, was another gay man who wanted to push the magazine even further than Unger, and who would later go on to manage the Stooges and the Ramones, being credited by some as being the single most important figure in punk rock's development, and being immortalised by the Ramones in their song "Danny Says": [Excerpt: The Ramones, "Danny Says"] So this was not a normal teen magazine, and that's certainly shown by the cover of the September 1966 issue, which as well as talking about the interviews with John Lennon and Paul McCartney inside, also advertised articles on Timothy Leary advising people to turn on, tune in, and drop out; an editorial about how interracial dating must be the next step after desegregation of schools, and a piece on "the ten adults you dig/hate the most" -- apparently the adult most teens dug in 1966 was Jackie Kennedy, the most hated was Barry Goldwater, and President Johnson, Billy Graham, and Martin Luther King appeared in the top ten on both lists. Now, in the early part of the year Maureen Cleave had done a whole series of articles on the Beatles -- double-page spreads on each band member, plus Brian Epstein, visiting them in their own homes (apart from Paul, who she met at a restaurant) and discussing their daily lives, their thoughts, and portraying them as rounded individuals. These articles are actually fascinating, because of something that everyone who met the Beatles in this period pointed out. When interviewed separately, all of them came across as thoughtful individuals, with their own opinions about all sorts of subjects, and their own tastes and senses of humour. But when two or more of them were together -- especially when John and Paul were interviewed together, but even in social situations, they would immediately revert to flip in-jokes and riffing on each other's statements, never revealing anything about themselves as individuals, but just going into Beatle mode -- simultaneously preserving the band's image, closing off outsiders, *and* making sure they didn't do or say anything that would get them mocked by the others. Cleave, as someone who actually took them all seriously, managed to get some very revealing information about all of them. In the article on Ringo, which is the most superficial -- one gets the impression that Cleave found him rather difficult to talk to when compared to the other, more verbally facile, band members -- she talked about how he had a lot of Wild West and military memorabilia, how he was a devoted family man and also devoted to his friends -- he had moved to the suburbs to be close to John and George, who already lived there. The most revealing quote about Ringo's personality was him saying "Of course that's the great thing about being married -- you have a house to sit in and company all the time. And you can still go to clubs, a bonus for being married. I love being a family man." While she looked at the other Beatles' tastes in literature in detail, she'd noted that the only books Ringo owned that weren't just for show were a few science fiction paperbacks, but that as he said "I'm not thick, it's just that I'm not educated. People can use words and I won't know what they mean. I say 'me' instead of 'my'." Ringo also didn't have a drum kit at home, saying he only played when he was on stage or in the studio, and that you couldn't practice on your own, you needed to play with other people. In the article on George, she talked about how he was learning the sitar, and how he was thinking that it might be a good idea to go to India to study the sitar with Ravi Shankar for six months. She also talks about how during the interview, he played the guitar pretty much constantly, playing everything from songs from "Hello Dolly" to pieces by Bach to "the Trumpet Voluntary", by which she presumably means Clarke's "Prince of Denmark's March": [Excerpt: Jeremiah Clarke, "Prince of Denmark's March"] George was also the most outspoken on the subjects of politics, religion, and society, linking the ongoing war in Vietnam with the UK's reverence for the Second World War, saying "I think about it every day and it's wrong. Anything to do with war is wrong. They're all wrapped up in their Nelsons and their Churchills and their Montys -- always talking about war heroes. Look at All Our Yesterdays [a show on ITV that showed twenty-five-year-old newsreels] -- how we killed a few more Huns here and there. Makes me sick. They're the sort who are leaning on their walking sticks and telling us a few years in the army would do us good." He also had very strong words to say about religion, saying "I think religion falls flat on its face. All this 'love thy neighbour' but none of them are doing it. How can anybody get into the position of being Pope and accept all the glory and the money and the Mercedes-Benz and that? I could never be Pope until I'd sold my rich gates and my posh hat. I couldn't sit there with all that money on me and believe I was religious. Why can't we bring all this out in the open? Why is there all this stuff about blasphemy? If Christianity's as good as they say it is, it should stand up to a bit of discussion." Harrison also comes across as a very private person, saying "People keep saying, ‘We made you what you are,' well, I made Mr. Hovis what he is and I don't go round crawling over his gates and smashing up the wall round his house." (Hovis is a British company that makes bread and wholegrain flour). But more than anything else he comes across as an instinctive anti-authoritarian, being angry at bullying teachers, Popes, and Prime Ministers. McCartney's profile has him as the most self-consciously arty -- he talks about the plays of Alfred Jarry and the music of Karlheinz Stockhausen and Luciano Berio: [Excerpt: Luciano Berio, "Momenti (for magnetic tape)"] Though he was very worried that he might be sounding a little too pretentious, saying “I don't want to sound like Jonathan Miller going on" --
In Episode 247, Jeff Belanger and Ray Auger try out stilts in Rye, New Hampshire, in search of a 19th-century eccentric legendary peddler known as Cling-Clang. Known to travel on stilts along the coast between New Hampshire, Maine, and even into Nova Scotia, this odd vagabond has some strange habits, he only slept outside, and left a mark behind that we can still see today. But who was this regional folk legend? Murderer on the run? Heartbroken noble? Something else?