Podcast appearances and mentions of "steve can"

  • 24 PODCASTS
  • 38 EPISODES
  • 54m AVG DURATION
  • 1 NEW EPISODE MONTHLY
  • Mar 18, 2025 LATEST

POPULARITY (2017–2024)


Best podcasts about steve can

Latest podcast episodes about steve can

Father Simon Says
Mortal Sin - Father Simon Says - March 18, 2025

Father Simon Says

Play Episode Listen Later Mar 18, 2025 49:10


(2:36) Bible Study: Isaiah 1:10, 16-20 – This passage is about God's power to transform even our sinfulness. God can take our sinfulness and use it to reform us. Matthew 23:1-12 – Jesus goes up the mountain, sits down, and speaks. Rabbis sat to teach their disciples. Peter, according to tradition, had a teaching chair. The chair of Moses reflects the idea of Moses as our teacher. Fr. Simon also explains what a phylactery is and what it means to be a father.
(20:06) Break 1
(22:33) Letters: What does it mean to be in a state of mortal sin, and how do you know if you are in it? Father answers this question and others. Send him a letter at simon@relevantradio.com
(32:41) Break 2
(33:23) Word of the Day: Finger
(35:47) Phones:
Ken - I'm 82 years old and not in the best of health. My pastor doesn't purify his hands before distributing communion, and last time I got sick for three weeks. He doesn't take criticism well. What should I do?
Steve - Can you explain the biblical tradition of tying a red thread around a baby to mark them as the firstborn, as with Jacob and Esau?
George - What is the definition of covenant? Does covenant allow divorce?
Patrick - How is cremation justified in the Bible when the body is a temple of the Holy Spirit?

The Patrick Madrid Show
The Patrick Madrid Show: February 13, 2025 - Hour 1

The Patrick Madrid Show

Play Episode Listen Later Feb 13, 2025 51:09


Patrick gives advice to a caller with infertility issues, especially regarding a procedure that the Church hasn't yet ruled moral or immoral. Patrick also answers a question about the Primacy of Peter and its significance in the Church.
(0:59) Gabriel - Regarding the baptisms at the end of John chapter 3: would they confer sanctifying grace, since they were done under the Old Covenant?
(5:02) John - What do you think about assisted reproductive technology like GIFT (Gamete Intrafallopian Transfer)?
(15:23) Steve - Can you talk about the primacy of the pope? I see a contradiction in Scripture: Paul confronts Saint Peter about eating with the uncircumcised, and the early canons seem to refute the Pope.
(22:44) Patrick explains more about the Primacy of Peter
(33:55) Steve - Yesterday you said that we don't worship the Saints and the Virgin Mary. Do you think we should explain better what we mean when we say we don't worship Mary?
(45:54) Jennifer - My mother received a copy of the "True Letter of Jesus Christ" given to Saint Bridget. Is it legitimate?
Resources:
National Catholic Bioethics Center: https://www.ncbcenter.org/
Book Recommendation: "Jesus, Peter & the Keys" https://www.amazon.com/Jesus-Peter-Keys-Scriptural-Handbook/dp/1882972546
Book Recommendation: "The Divine Primacy" https://stpaulcenter.com/product/the-divine-primacy-of-the-bishop-of-rome-and-modern-eastern-orthodoxy/
Book Recommendation: "Pope Fiction" https://www.amazon.com/Pope-Fiction-Answers-Misconceptions-Papacy/dp/0964261006

Seeing Red Show
Episode 239: Highup - Seeing Red Podcast Yearly Mix 2024

Seeing Red Show

Play Episode Listen Later Jan 4, 2025 155:26


01 - Öwnboss x Dino Warriors - Can You Dig It
02 - Steve Aoki & 22 Bullets - Play The Track
03 - David Guetta & Mason vs Princess Superstar - Perfect (Exceeder)
04 - Anyma - Pictures Of You
05 - Cat Dealers - MEOW
06 - Mau P - On Again
07 - Sikdope & RayRay - Feel It
08 - Laidback Luke, Vion Konger - Rocking With The Best
09 - Gabry Ponte x Giuseppe Ottaviani feat. Malou - In My Mind
10 - David Guetta & MORTEN & Prophecy - Kill The Vibe
11 - Tigerlily & Madism - Sway
12 - A7S - Monster (ALOK)
13 - KAAZE - Destination Calabria
14 - Highup, The Stafford Brothers feat. Veronica Bravo - Elevate
15 - KAAZE - Papi
16 - Bombs Away - Monster
17 - Bonka & Bombs Away - Airborn
18 - Dj Fluke, Highup, Julia Temos - Save Me
19 - Tiesto x FAST BOY - All My Life
20 - Bassjackers - All Around The World (La La La La La)
21 - DJ Yasmin & H_NN_H X - Clap Anthem
22 - Will Sparks x Mairee - Insurgence
23 - Uberjakd & Ella Young -

ScaleUpRadio's podcast
Episode #441 - Scaling with Values and Vision – Creating a High-Performance Culture - with Steve Rawlingson from Samuel Knight Group

ScaleUpRadio's podcast

Play Episode Listen Later Dec 23, 2024 26:09


In this special episode of ScaleUp Radio, we're joined by Steve Rawlingson, a returning guest and the founder of Samuel Knight Group. With a remarkable scale-up journey under his belt, Steve shares insights on building a scalable team, maintaining a high-performance culture, and staying focused on strategic goals. Samuel Knight Group, a global recruitment company specialising in the renewable energy sector, has achieved impressive milestones, including £35 million in sales in 2023 and the potential to reach £1 billion+ within five years. From hiring strategies to leadership approaches, Steve's experience offers invaluable lessons for scale-up leaders navigating rapid growth and complex challenges.

Key Discussion Points

Scaling with a Focus on Values and Mindset:
  • The importance of hiring for values and a growth mindset over skills.
  • Lessons learned from early hiring mistakes, with only 2 of the first 20 hires remaining long-term.
  • Psychometric testing as a tool to align candidates with the mission of achieving zero carbon emissions by 2050.

Creating and Sustaining a High-Performance Culture:
  • Transitioning from a management-driven to a leadership-driven organisation, leading by example.
  • The move towards a 100% employee ownership model by 2026 to align mission and accountability.
  • Building a self-managing culture where every team member understands their role and contributions.

Strategic Focus and Execution:
  • Avoiding distractions by sticking to core market sectors and focusing on environmental impact metrics, such as megawatts of power created.
  • Utilising regular planning rhythms like 90-day plans and clear KPIs to maintain alignment and drive growth.
  • The significance of peer support networks like Smart Boards and the Smart90 program for effective planning and execution.

Overcoming Challenges with a Growth Mindset:
  • Viewing obstacles as opportunities and embracing change.
  • Steve compares the scale-up journey to climbing a mountain: persistence and belief in your mission are critical to reaching the next level.

Make sure you don't miss any future episodes by subscribing to ScaleUp Radio wherever you like to listen to your podcasts. For now, continue listening for the full story from Steve.

Scaling up your business isn't easy, and can be a little daunting. Let ScaleUp Radio make it a little easier for you, with guests who have been where you are now and can offer their thoughts and advice on several aspects of business. ScaleUp Radio is the business podcast you've been waiting for.

If you would like to be a guest on ScaleUp Radio, please click here: https://bizsmarts.co.uk/scaleupradio/kevin
You can get in touch with Kevin & Granger here: kevin@biz-smart.co.uk, grangerf@biz-smart.co.uk

Kevin's latest book is available! Drawing on BizSmart's own research and experience of working with hundreds of owner-managers, Kevin Brent explores the key reasons why most organisations do not scale and how the challenges change as they reach different milestones on the ScaleUp Journey. He then details a practical step-by-step guide to successfully navigate between the milestones in the form of ESUS - a proven system for entrepreneurs to scale up. More on the book here: https://www.esusgroup.co.uk/

Steve can be found here: linkedin.com/in/stevenrawlingson, https://www.samuel-knight.com/, info@samuel-knight.com

Dj Murphy Podcast
November 2024 (Podcast 129)

Dj Murphy Podcast

Play Episode Listen Later Nov 1, 2024 92:01


My mix of the newest dance anthems, genre hopping and building up from 135BPM to 150BPM. Enjoy!

Dj Murphy - November 2024 (Podcast 129) Tracklist:
Galantis Feat. Rosa Linn - One Cry (Ty Sunderland Remix)
Jazzy, Sonny Fodera, D.O.D - Somedays (Extended Mix)
Skrillex, Boys Noize - Fine Day Anthem (Extended Mix)
Benny Benassi, Chris Nasty, Chris Nasty & Benny Benassi - Everybody Loves Love (Extended Mix)
Caz - Sweet Tea (Extended)
C100 - Grip N Shoot
Eli Brown - Diamonds On My Mind (Extended Mix)
Mauro Picotto - Esperanza (Extended Mix)
Symmetrik - Caroline (Extended)
Chrystal - The Days (Calvin Logue Remix)
Symmetrik & Rae Morris - Dance With Me (Extended Mix)
Zorza - Everything U Need (Extended)
Bl3Ss, Camrinwatsin, Bbyclose - Kisses (Feat. Bbyclose) (Extended Mix)
Matty Ralph - My Loving
Kimmic - No Come Down Feat Aya Anne (Extended Mix)
Matty Ralph - Dreaming (Extended Mix)
Dbf X David Rust - Time Of Our Life
Subsonik - Out Of Time Now
C100 - Ring Ring Pussy
Lee Pollitt & Yes-Ii - No One
Bounce Projectz & Mddltn - Give It To Me
Crazyhutz - I Wish
Willie G Vs Poomstyles - Broomstick
Warren H - Top Of The Drops Edition 2
Jordan Irwin - Narcissist
Dj's Factory - Taking Me Higher
Tindle X Sonic Sound - Slide Away
Atc - Around The World (La La La La) (Sammy Extended Remix)
Dj Rome & Groove Control - Groove Thang
Phatboi - What You Had
The Bounce Brothers - Scream
Rik Shaw - Doopdonk
C100 - Say My Name
Terminal Ii & Katrina - Stop Crying Your Heart Out
Dancecore N3Rd & Aaron Delaron - Fast Love (Extended Mix)
S.J.J - Miracle
Lum!X X Lucas & Steve - Can't Forget You (Extended Club Mix)
Steven Jay - Blowout
Dj Remo - There For You

Hosted on Acast. See acast.com/privacy for more information.

ENERGY Club Files Podcast - Flip Capella
Flip Capella 848 Energy Club Files Podcast - 26. 07. 2024 | Emdey takeover Part 03

ENERGY Club Files Podcast - Flip Capella

Play Episode Listen Later Jul 28, 2024 94:46


Best of House, Dance, Hypertechno, Techno, Hardstyle, Hard Dance, Raw, Drum & Bass, D&B, EDM, Psy, Dance Pop, Techhouse, Bass, Mash Up, Flip Capella Music, ...
Tracklist on https://www.1001tracklists.com/
Instagram: https://www.instagram.com/djflipcapella/

Tracklist Show 848:
Beauz - Umbrella 2024
Cassö, RAYE, D-Block Europe - Prada (Bassjackers & Dimitri Vegas Bootleg)
David Guetta & OneRepublic - I Don't Wanna Wait (Hardwell & Olly James Extended Remix)
Bonka & Kaitlyn - Sing Hallelujah! (Extended Mix)
Short Mod (20)
Eminem - Houdini (Dunisco House Remix)
David Guetta & Cedric Gervais - Switch
Dimitri Vegas x Vin Diesel x Zion - Don't Stop The Music
Disturbed - Sound Of Silence (CYRIL Remix) (Sterbinszky x MYNEA Edit)
DMNDS feat. Nito-Onna - Gimme More
DNCE - Cake By The Ocean (Gin and Sonic Remix)
LUM!X x Lucas & Steve - Can't Forget You (Club Mix)
Jaxomy x Agatino Romero x W&W ft. Raffaella Carrà - Pedro (W&W Remix)
Don Omar & Lucenzo - Danza Kuduro (Tiësto Remix)
Dubdogz & MOJJO - Could You Be Loved
Jack Wins x Jonasu x Robbie Jay - I Wanna Back (Summer)
Jax Jones & Zoe Wees - Never Be Lonely
Gabry Ponte & SMACK - Rock Da House
Justin Bieber, Nicki Minaj - Beauty and a Beat (Gin and Sonic Remix)
Marlon Hoffstadt AKA DJ Daddy Trance - It's That Time (FISHER Remix)
Missy Elliott - Get Ur Freak On (CIBUS TECHNO REMIX)
Dimitri Vegas & Like Mike, Bassjackers - Axel F
NERVO & 22Bullets & Naeleck - Voices
LUSSO Mashup - David Guetta vs. John Summit & Hayla vs. Duke Dumont vs. Bassjackers - AROUND WHERE YOU ARE
Öwnboss, NXNJAS, Chamillionaire - Ridin' Dirty
Pickle & TOYZZ - So Different
Robbe, Danny Ores & KUOKKA - Boy Oh Boy
Skytech, Vion Konger - Rhythm of the Night (R3HAB Edit)
Sean Paul & Odd Mob - Get Busy
Short Mod (20)
Robert Falcon - Slim Shady
SP3CTRUM, Milan Gavris - Pon De Replay
Tungevaag & Kraiz - Seven Nation Army
Switch Disco, R3HAB, Sam Feldt - Sleep Tonight (This Is The Life)
Taylor Swift - Cruel Summer (Sterbinszky x MYNEA Remix)
Tate McRae - Greedy (Ape Rave Club Bootleg)
Hidden Podcast Track Special by DJane Victoria
Showtek - Dream (Adrenalize Remix)
Sickmode, Mish, Bloodlust - YEAHBOIII
Timmy Trumpet x Da Tweekaz - Boom Boom Boom
Travis Scott - FE!N (Rudeejay & Da Brozz Bootleg)
Tate McRae - Greedy (Restricted Edit)
Basshunter - Boten Anna (Jonas Wood & BassWar & CaoX Bootleg)
Keanu Silva - Have You Ever Been Mellowed (Ninkid Remix)
Zedd vs. Technoboy vs. Isaac - Digital Clarity (Progressive Brothers Hardstyle Edit)
Hardwell ft. Amba Shepherd - Apollo (Dr Phunk Remix)
Hardwell & Dannic ft. Haris - Survivors (RMCM Remix)
Tatsunoshin x Giin - I'm Good (Blue) (Hardstyle Mix)
Avi8 - Better Now
Jay Sean ft. Lil Wayne - Down (BassWar & CaoX Edit)

Radio Record
Record Club Guest Mix Lucas & Steve #089 (25-07-2024)

Radio Record

Play Episode Listen Later Jul 24, 2024


Guest Mix by Lucas & Steve
01. LUM!X x Lucas & Steve - Can't Forget You
02. Pickle - Get Silly
03. NIGHT / MOVES - Now and Forever
04. Lucas & Steve, Yves V - After Midnight (feat. Xoro) [Tribute Mix]
05. Sonny Fodera - Mind Still (feat. blythe)
06. Disclosure - She's Gone, Dance On
07. David Guetta, MORTEN, Prophecy - Kill The Vibe
08. Lucas & Steve x Skinny Days - When I Wake Up (Club Mix)
09. Swedish House Mafia feat. Niki and the Dove - Lioness
10. Ferreck Dawn - Heartache (feat. Alex Mills)
11. Lucas & Steve - LFG
12. DJ Kuba & Neitan x EMDI ft. Nicci - Drake
13. Nora Van Elken - Supersonic
14. Lucas & Steve x Lawrent - End Of Time (feat. Jordan Shaw)
15. Laidback Luke - Left Or Right (feat. Matthew Nolan)
16. Floris van Oranje - Feel What You Want
17. Lucas & Steve - What We Know (feat. Conor Byrne) [Club Mix]
18. Bingo Players x Firebeatz - Droppin' Hot (feat. Sonny Wilson)

JOURNEYS
XABI ONLY - JOURNEYS #283

JOURNEYS

Play Episode Listen Later Jul 18, 2024 119:02


Follow me:
Facebook: fb.me/xabionly
Twitter: twitter.com/xabionly
Youtube: youtube.com/xabionly
Mixcloud: mixcloud.com/xabionly
Instagram: instagram.com/xabionly
TRACKLIST: https://1001.tl/2lg1f1t1
Spotify playlist: https://open.spotify.com/playlist/4STV7DPVgwI4ntvi1sQvjh?si=CU6lCNZcRkKiZytdXaI5TQ

TRACKLIST:
01. Kryder & Sarah de Warren - Lights Out (Dominiq Vanquish Remix) [ARMADA]
02. EDX - Osculate [SIRUP]
03. Korolova & Ross Quinn - Say [CAPTIVE SOUL]
04. Che Jose - Escape [VANDIT ALTERNATIVE]
05. Che Jose - Untold [VANDIT ALTERNATIVE]
06. Simea & Cris Taylor - Acelerá [NATURAL RHYTHM]
07. Miss Monique ft. braev - Every Breath [TOMORROWLAND MUSIC]
08. Reezer - Stay [BRASLIVE]
09. Mind Against & Sideral - Colossal [AFTERLIFE] [RELEASE OF THE WEEK]
10. Bruno Martini & Soldera & Laau - Smalltalk [BEESIDE]
11. Charles B & AGAP & Ben Haydie & Reege - Everglow [PROTOCOL]
12. EDX - ID [SIRUP]
13. KREAM - Wicked Game [TOMORROWLAND MUSIC] [TRACK OF THE WEEK]
14. Push - We'll Find Peace [TOMORROWLAND MUSIC]
15. The Aston Shuffle ft. Koko LaRoo - Hold On To Me [ONLY 100S]
16. ANU & Linney - Crosslands [HELIX]
17. Mike Cervello - Bang That [CONFESSION]
18. Push - Mystica [TOMORROWLAND MUSIC]
19. Dimitri Vegas & Like Mike & Bassjackers - Axel F [SMASH THE HOUSE]
20. Madison Mars - Favorite Sound [GEMSTONE]
21. Karasso - Water [LITHUANIA HQ]
22. Gabry Ponte & SMACK - Rock Da House [SPINNIN]
23. Visco - Olympia [GENERATION SMASH]
24. Red Showtell - Aerial [LEGION]
25. Ghostnaps - grow apart [NCS]
26. 2ACES - Get Down [REVEALED RADAR]
27. Vini Vici & Maddix & Shibui - Spiritum (Giorgia Angiuli Remix) [WE NEXT]
28. Nanoviola - Romeo and Juliet [NNVL RCRDS]
29. LUM!X & Lucas & Steve - Can't Forget You (Club Mix) [SPINNIN]
30. Patrick Moreno & YAMATOMAYA - Falling In Love [CRASH & SMILE]
31. Lockdown & TANK - Overdrive [RAVE ROOM] [PROMO OF THE WEEK]
32. LMFAO - Sexy And I Know It (KEVU Remix)
33. Vini Vici & Maddix & Shibui - Spiritum (Bass Modulators Remix) [WE NEXT]
34. Jake Ryan & R&P-X - Game Of The Ages [DRAGON]
35. MATTN x Mairee - Girlz Wanna Have Fun 2024 [SMASH THE HOUSE]

Maxximize On Air
Blasterjaxx present - Maxximize On Air 524

Maxximize On Air

Play Episode Listen Later Jul 1, 2024 58:16


Yes, yes, what's up? We are Blasterjaxx, and this is your weekly dose of Maxximize On Air. We will be back again next week. Until then, keep it maxximized.
01 AmyElle - Silver Lining
02 Splonie - Bring It Back
03 Mairee, Salkin, Jules - Resonate
04 Green Velvet & Marco Lys feat. Walter Phillips - Kiss From A Rose
05 Domeno feat. Chloé Doyon - Before It's Over
06 EMDI & NEXBOY feat. Ceres - Don't Stop
07 JOXION feat. Kxne - Hype Unload
08 Will Clarke - Set Me Free
09 Blasterjaxx X Naeleck X 3rd Wall - Crush Me Down (You Spin Me Around)
10 W&W x AXMO - If I Die Young
11 Beauz - You Like It Rough
12 HΛNNΛH X - Gandrung Awakening
13 Galoski & YAKSA - Hypnotized
14 Breathe Carolina - New Sound
15 Antillas feat. Fiora - Damaged
16 LUM!X x Lucas & Steve - Can't Forget You (Club Mix)
17 Wiwek - Remedy
18 MaRLo, V3NOM - Open Your Eyes (MaRLo Presents V3NOM)


103 Klubb
103 Klubb - Lucas Steve - 06 Juin 2024

103 Klubb

Play Episode Listen Later Jun 11, 2024 56:02


Lucas & Steve's mix on 103 Klubb, June 6, 2024, from 8 PM to 9 PM.
Tracklist:
Vintage Culture & Goodboys - Chemicals
Salento Guys, Paki, Nicola Fasano - Gimme More Higher
Lucas & Steve x Lawrent feat. Jordan Shaw - End Of Time
Disclosure - She's Gone, Dance On
NUZB & KEYTON - Biohazard
R3WIRE x Sebastian Wibe - Main Attraction
NIGHT / MOVES - Now and Forever
LUM!X x Lucas & Steve - Can't Forget You
Crunkz & LEØN - Don't Want You To Go
Jack Wins & ILY - Dunno (What To Do)
Lucas & Steve - Emergency
Marten Hørger x BIJOU - The Power
MorganJ - From The Beat
Goodboys - Chain Reaction
SVRT - Push Me Away
Alle Farben - Lotus (PAROOKAVILLE Anthem 2024)
The Bloody Beetroots feat. Steve Aoki - Warp 1.9 (Westend & Cherry Tooth Remix)
Lucas & Steve x Skinny Days - When I Wake Up (Club Mix)

Maxximize On Air
Blasterjaxx present - Maxximize On Air 520

Maxximize On Air

Play Episode Listen Later Jun 3, 2024 58:14


Yes yes, you're in the mix on a special edition of Maxximize On Air with Blasterjaxx! Have a great weekend, and keep it maxximized!
01 Alle Farben - Lotus (PAROOKAVILLE Anthem 2024)
02 Armin van Buuren & Chef'Special - Larger Than Life (Extended Mix)
03 Plastik Funk x Sagan x Esox - Everybody Up
04 Dada Life - That Song With The Kick Drum
05 Renato S - Higher
06 Mark Roma - Ultimatum
07 Up To No Good (Firebeatz Remix)
08 Grimix - Carnival
09 Matty Ralph - Te Adoro (Extended Mix)
10 Blasterjaxx X Naeleck X 3rd Wall - Crush Me Down (You Spin Me Around)
11 VIVID & Qwerty - The Neon City
12 Luca-Dante Spadafora, MOTi, VIZE - Let Me Go
13 LUM!X x Lucas & Steve - Can't Forget You
14 Will Sparks & Mairee - Insurgence
15 Blasterjaxx X Lockdown X Vion Konger - Feel The Bass
16 Tiësto x Hedex x Basslayerz - Click Click Click
17 STVW feat. Story Untold - Boulevard Of Broken Dreams
18 Dimitri Vegas x Darren Styles - Summer Dream Of Love


JOURNEYS
XABI ONLY - JOURNEYS #270

JOURNEYS

Play Episode Listen Later Jun 3, 2024 117:19


Follow me:
Facebook: fb.me/xabionly
Twitter: twitter.com/xabionly
Youtube: youtube.com/xabionly
Mixcloud: mixcloud.com/xabionly
Instagram: instagram.com/xabionly
TRACKLIST: https://1001.tl/wf8nss1
Spotify playlist: https://open.spotify.com/playlist/4STV7DPVgwI4ntvi1sQvjh?si=CU6lCNZcRkKiZytdXaI5TQ

TRACKLIST:
01. Eynka & Layla Benitez - No Place To Go [INTERSTELLAR]
02. Son Of Son - Lost Control [SIAMESE]
03. Lauren Mia - Shadow (Alfa Romero Remix) [EAR PORN]
04. Son Of Son - Du Ska Inte [SIAMESE]
05. Ginchy - Calabria [GINCHIEST]
06. HOVR - Holy [THIS NEVER HAPPENED]
07. Son Of Son - Elucidum [SIAMESE]
08. Vintage Culture ft. Yellowitz - Just Like Home [VINTAGE CULTURE]
09. Vintage Culture & Braev - Time [VINTAGE CULTURE]
10. EDX - Setema [PINKSTAR]
11. Enlight - Memoria [HIGH CONTRAST]
12. Vanillaz - Azmara [SOAVE]
13. DJ Kuba & Neitan x EMDI ft. Nicci - Drake [HEXAGON]
14. Raven & Kreyn x KDH x Pure Cold - Tech Phonk [HYSTERIA]
15. Miss Melera - Sage [DAYS LIKE NIGHTS]
16. Ludwig Goransson - Can You Hear The Music (Skytech Remix)
17. MATTN & Rush Avenue, Chiptune - Wannabe [HOUSE OF HOUSE]
18. Breathe Carolina - SOMEBODY [HEXAGON]
19. The Blizzard - Rabagast [A STATE OF TRANCE]
20. Thomas Newson & Marco V - CIDOLEM [EPIC247]
21. Rag - Make It Fiyah [GENERATION HEX]
22. Calussa x Sllash & Doppe - Tu Camino [REALM]
23. BÔN, Zack Torrez & Julian Revs ft. Joe Bills - Feel Alive [MADOX NETWORK]
24. Arvenius & Kamish - Let You Down [FEELQ]
25. Lewis Laite, Jake Ryan - Saving Grace [FUTURE RAVE MUSIC]
26. TRIBBS & RSCL ft. Zana - Breathe [GEMSTONE]
27. C-Systems - Voyager [A STATE OF TRANCE] [RELEASE OF THE WEEK]
28. Kill The Buzz - Dark Atmosphere [REVEALED]
29. Albzzy - Know That [MINUTE TO MIDNITE]
30. Matt Dybal, R4KIDOR - Fantasy [REVEALED]
31. WhiteLight x Spectorsonic x Alex BELIEVE - Shooting Stars [INTERPLAY]
32. BEAUZ & Kevu - Stupid [RAVE CULTURE] [PROMO OF THE WEEK]
33. SP3CTRUM - Dance Therapy [BOUNCE & BASS]
34. Argy & Omnya - Aria (Nick Havsen Edit)
35. VIVID & Qwerty - The Neon City [REVEALED]
36. Tate McRae - greedy (Nick Havsen Bootleg)
37. LUM!X x Lucas & Steve - Can't Forget You [SPINNIN] [TRACK OF THE WEEK]
38. Bart Claessen - Playmo (1st Play) [YAKUZA] [CLASSIC]
39. Dimitri Vegas & Like Mike vs. Will Sparks - Rave Generator [SMASH THE HOUSE]

Latent Space: The AI Engineer Podcast — CodeGen, Agents, Computer Vision, Data Science, AI UX and all things Software 3.0
The "Normsky" architecture for AI coding agents — with Beyang Liu + Steve Yegge of SourceGraph

Latent Space: The AI Engineer Podcast — CodeGen, Agents, Computer Vision, Data Science, AI UX and all things Software 3.0

Play Episode Listen Later Dec 14, 2023 79:37


We are running an end of year survey for our listeners. Let us know any feedback you have for us, what episodes resonated with you the most, and guest requests for 2024!

RAG has emerged as one of the key pieces of the AI Engineer stack. Jerry from LlamaIndex called it a "hack", Bryan from Hex compared it to "a recommendation system from LLMs", and even LangChain started with it. RAG is crucial in any AI coding workflow. We talked about context quality for code in our Phind episode. Today's guests, Beyang Liu and Steve Yegge from Sourcegraph, have been focused on code indexing and retrieval for over 15 years. We locked them in our new studio to record a 1.5-hour masterclass on the history of code search, retrieval interfaces for code, and how they get a SOTA 30% completion acceptance rate in their Cody product by being better at the "bin packing problem" of LLM context generation.

Google Grok → Sourcegraph → Cody

While at Google in 2008, Steve built Grok, which lives on today as Google Kythe. It allowed engineers to do code parsing and searching across different codebases and programming languages. (You might remember this blog post from Steve's time at Google.) Beyang was an intern at Google at the same time, and Grok became the inspiration to start Sourcegraph in 2013. The two didn't know each other personally until Beyang brought Steve out of retirement 9 years later to join him as VP Engineering. Fast forward 10 years: Sourcegraph has become the best code search tool out there and raised $223M along the way. Nine months ago, they open sourced Sourcegraph Cody, their AI coding assistant.

All their code indexing and search infrastructure allows them to get SOTA results by having better RAG than competitors:
* Code completions as you type that achieve an industry-best Completion Acceptance Rate (CAR) as high as 30%, using a context-enhanced open-source LLM (StarCoder)
* Context-aware chat that provides the option of using GPT-4 Turbo, Claude 2, GPT-3.5 Turbo, Mixtral 8x7B, or Claude Instant, with more model integrations planned
* Doc and unit test generation, along with AI quick fixes for common coding errors
* AI-enhanced natural language code search, powered by a hybrid dense/sparse vector search engine

There are a few pieces of infrastructure that helped Cody achieve these results:

Dense-sparse vector retrieval system

For many people, RAG = vector similarity search, but there's a lot more you can do to get the best possible results. From their release: "Sparse vector search" is a fancy name for keyword search that potentially incorporates LLMs for things like ranking and term expansion (e.g., "k8s" expands to "Kubernetes container orchestration", possibly weighted as in SPLADE):
* Dense vector retrieval makes use of embeddings, the internal representation that LLMs use to represent text. Dense vector retrieval provides recall over a broader set of results that may have no exact keyword matches but are still semantically similar.
* Sparse vector retrieval is very fast, human-understandable, and yields high recall of results that closely match the user query.
* We've found the approaches to be complementary.

There's a very good blog post by Pinecone on SPLADE for sparse vector search if you're interested in diving in. If you're building RAG applications in areas that have a lot of industry-specific nomenclature, acronyms, etc., this is a good approach to getting better results.

SCIP

In 2016, Microsoft announced the Language Server Protocol (LSP) and the Language Server Index Format (LSIF). This protocol makes it easy for IDEs to get all the context they need from a codebase for things like file search, references, "go to definition", etc. Sourcegraph developed SCIP, "a better code indexing format than LSIF":
* Simpler and more efficient format: SCIP uses Protobuf instead of JSON, which is used by LSIF. Protobuf is more space-efficient, simpler, and more suitable for systems programming.
* Better performance and smaller index sizes: SCIP indexers, such as scip-clang, show enhanced performance and reduced index file sizes compared to LSIF indexers (10%-20% smaller).
* Easier to develop and debug: SCIP's design, centered around human-readable string IDs for symbols, makes it faster and more straightforward to develop new language indexers.
Having more efficient indexing is key to more performant RAG on code.

Show Notes
* Sourcegraph
* Cody
* Copilot vs Cody
* Steve's Stanford seminar on Grok
* Steve's blog
* Grab
* Fireworks
* Peter Norvig
* Noam Chomsky
* Code search
* Kelly Norton
* Zoekt
* v0.dev

See also our past episodes on Cursor, Phind, Codeium and Codium, as well as the GitHub Copilot keynote at AI Engineer Summit.

Timestamps
* [00:00:00] Intros & Backgrounds
* [00:05:20] How Steve's work on Grok inspired Sourcegraph for Beyang
* [00:08:10] What's Cody?
* [00:11:22] Comparison of coding assistants and the capabilities of Cody
* [00:16:00] The importance of context (RAG) in AI coding tools
* [00:21:33] The debate between Chomsky and Norvig approaches in AI
* [00:30:06] Normsky: the Norvig + Chomsky models collision
* [00:36:00] The death of the DSL?
* [00:40:00] LSP, Skip, Kythe, BFG, and all that fun stuff
* [00:53:00] The Sourcegraph internal stack
* [00:58:46] Building on open source models
* [01:02:00] Sourcegraph for engineering managers?
* [01:12:00] Lightning Round

Transcript

Alessio: Hey everyone, welcome to the Latent Space podcast. This is Alessio, partner and CTO-in-Residence at Decibel Partners, and I'm joined by my co-host Swyx, founder of Smol AI.
[00:00:16]Swyx: Hey, and today we're christening our new podcast studio in the Newton, and we have Beyang and Steve from Sourcegraph. Welcome. [00:00:25]Beyang: Hey, thanks for having us. [00:00:26]Swyx: So this has been a long time coming. I'm very excited to have you. We also are just celebrating the one year anniversary of ChatGPT yesterday, but also we'll be talking about the GA of Cody later on today. We'll just do a quick intros of both of you. Obviously, people can research you and check the show notes for more. Beyang, you worked in computer vision at Stanford and then you worked at Palantir. I did, yeah. You also interned at Google. [00:00:48]Beyang: I did back in the day where I get to use Steve's system, DevTool. [00:00:53]Swyx: Right. What was it called? [00:00:55]Beyang: It was called Grok. Well, the end user thing was Google Code Search. That's what everyone called it, or just like CS. But the brains of it were really the kind of like Trigram index and then Grok, which provided the reference graph. [00:01:07]Steve: Today it's called Kythe, the open source Google one. It's sort of like Grok v3. [00:01:11]Swyx: On your podcast, which you've had me on, you've interviewed a bunch of other code search developers, including the current developer of Kythe, right? [00:01:19]Beyang: No, we didn't have any Kythe people on, although we would love to if they're up for it. We had Kelly Norton, who built a similar system at Etsy, it's an open source project called Hound. We also had Han-Wen Nienhuys, who created Zoekt, which is, I think, heavily inspired by the Trigram index that powered Google's original code search and that we also now use at Sourcegraph. Yeah. [00:01:45]Swyx: So you teamed up with Quinn over 10 years ago to start Sourcegraph and you were indexing all code on the internet. And now you're in a perfect spot to create a code intelligence startup. Yeah, yeah. 
[00:01:56]Beyang: I guess the backstory was, I used Google Code Search while I was an intern. And then after I left that internship and worked elsewhere, it was the single dev tool that I missed the most. I felt like my job was just a lot more tedious and much more of a hassle without it. And so when Quinn and I started working together at Palantir, he had also used various code search engines in open source over the years. And it was just a pain point that we both felt, both working on code at Palantir and also working within Palantir's clients, which were a lot of Fortune 500 companies, large financial institutions, folks like that. And if anything, the pains they felt in dealing with large complex code bases made our pain points feel small by comparison. So that was really the impetus for starting Sourcegraph. [00:02:42]Swyx: Yeah, excellent. Steve, you famously worked at Amazon. And you've told many, many stories. I want every single listener of Latent Space to check out Steve's YouTube because he effectively had a podcast that you didn't tell anyone about or something. You just hit record and just went on a few rants. I'm always here for your Stevie rants. And then you moved to Google, where you also had some interesting thoughts on just the overall Google culture versus Amazon. You joined Grab as head of eng for a couple of years. I'm from Singapore, so I have actually personally used a lot of Grab's features. And it was very interesting to see you talk so highly of Grab's engineering and sort of overall prospects. [00:03:21]Steve: Because as a customer, it sucked? [00:03:22]Swyx: Yeah, no, it's just like, being from a smaller country, you never see anyone from our home country being on a global stage or talked about as a startup that people admire or look up to, like on the league that you, with all your legendary experience, would consider equivalent. Yeah. [00:03:41]Steve: Yeah, no, absolutely. 
They actually, they didn't even know that they were as good as they were, in a sense. They started hiring a bunch of people from Silicon Valley to come in and sort of like fix it. And we came in and we were like, Oh, we could have a little better operational excellence and stuff. But by and large, they're really sharp. The only thing about Grab is that they get criticized a lot for being too westernized. Oh, by who? By Singaporeans who don't want to work there. [00:04:06]Swyx: Okay. I guess I'm biased because I'm here, but I don't see that as a problem. If anything, they've had their success because they were more westernized than the standard Singaporean tech company. [00:04:15]Steve: I mean, they had their success because they are laser focused. They're comparable to Amazon. I mean, they're executing really, really, really well for a giant. I was on a Slack with 2,500 engineers. It was like this giant waterfall that you could dip your toe into. You'd never catch up. Actually, the AI summarizers would have been really helpful there. But yeah, no, I think Grab is successful because they're just out there with their sleeves rolled up, just making it happen. [00:04:43]Swyx: And for those who don't know, it's not just like Uber of Southeast Asia, it's also a super app. PayPal Plus. [00:04:48]Steve: Yeah. [00:04:49]Swyx: In the way that super apps don't exist in the West. It's one of the enduring mysteries of B2C that super apps work in the East and don't work in the West. We just don't understand it. [00:04:57]Beyang: Yeah. [00:04:58]Steve: It's just kind of curious. They didn't work in India either. And it was primarily because of bandwidth reasons and smaller phones. [00:05:03]Swyx: That should change now. It should. [00:05:05]Steve: And maybe we'll see a super app here. [00:05:08]Swyx: You retired-ish? I did. You retired-ish on your own video game? Mm-hmm. Any fun stories about that? And that's also where you discovered some need for code search, right? Mm-hmm.
[00:05:16]Steve: Sure. A need for a lot of stuff. Better programming languages, better databases. Better everything. I mean, I started in like 95, right? Where there was kind of nothing. Yeah. Yeah. [00:05:24]Beyang: I just want to say, I remember when you first went to Grab because you wrote that blog post talking about why you were excited about it, about like the expanding Asian market. And our reaction was like, oh, man, how did we miss out on... [00:05:36]Swyx: Hiring you. [00:05:37]Beyang: Yeah. [00:05:38]Steve: I was like, miss that. [00:05:39]Swyx: Tell that story. So how did this happen? Right? So you were inspired by Grok. [00:05:44]Beyang: I guess the backstory from my point of view is I had used code search and Grok while at Google, but I didn't actually know that it was connected to you, Steve. I knew you from your blog posts, which were always excellent, kind of like inside, very thoughtful takes from an engineer's perspective on some of the challenges facing tech companies and tech culture and that sort of thing. But my first introduction to you within the context of code intelligence, code understanding was I watched a talk that you gave, I think at Stanford, about Grok when you were first building it. And that was very eye opening. I was like, oh, like that guy, like the guy who, you know, writes the extremely thoughtful ranty like blog posts also built that system. And so that's how I knew, you know, you were involved in that. And then, you know, we always wanted to hire you, but never knew quite how to approach you or, you know, get that conversation started. [00:06:34]Steve: Well, we got introduced by Max, right? Yeah. It was Temporal. Yeah. Yeah. I mean, it was a no brainer. They called me up and I had noticed when Sourcegraph had come out. Of course, when they first came out, I had this dagger of jealousy stabbed through me piercingly, which I remember because I am not a jealous person by any means, ever.
But boy, I was like, but I was kind of busy, right? And just one thing led to another. I got sucked back into the ads vortex and whatever. So thank God Sourcegraph actually kind of rescued me. [00:07:05]Swyx: Here's a chance to build DevTools. Yeah. [00:07:08]Steve: That's the best. DevTools are the best. [00:07:10]Swyx: Cool. Well, so that's the overall intro. I guess we can get into Cody. Is there anything else that like people should know about you before we get started? [00:07:18]Steve: I mean, everybody knows I'm a musician. I can juggle five balls. [00:07:24]Swyx: Five is good. Five is good. I've only ever managed three. [00:07:27]Steve: Five is hard. Yeah. And six, a little bit. [00:07:30]Swyx: Wow. [00:07:31]Beyang: That's impressive. [00:07:32]Alessio: So yeah, to jump into Sourcegraph, this has been a company 10 years in the making. And as Sean said, now you're at the right place. Phase two. Now, exactly. You spent 10 years collecting all this code, indexing, making it easy to surface it. Yeah. [00:07:47]Swyx: And also learning how to work with enterprises and having them trust you with their code bases. Yeah. [00:07:52]Alessio: Because initially you were only doing on-prem, right? Like a lot of like VPC deployments. [00:07:55]Beyang: So in the very early days, we're cloud only. But the first major customers we landed were all on-prem, self-hosted. And that was, I think, related to the nature of the problem that we're solving, which becomes just like a critical, unignorable pain point once you're above like 100 devs or so. [00:08:11]Alessio: Yeah. And now Cody is going to be GA by the time this releases. So congrats to your future self for launching this in two weeks. Can you give a quick overview of just what Cody is? I think everybody understands that it's a AI coding agent, but a lot of companies say they have a AI coding agent. So yeah, what does Cody do? How do people interface with it? [00:08:32]Beyang: Yeah. 
So how is it different from the like several dozen other AI coding agents that exist in the market now? When we thought about building a coding assistant that would do things like code generation and question answering about your code base, I think we came at it from the perspective of, you know, we've spent the past decade building the world's best code understanding engine for human developers, right? So like it's kind of your guide as a human dev if you want to go and dive into a large complex code base. And so our intuition was that a lot of the context that we're providing to human developers would also be useful context for AI developers to consume. And so in terms of the feature set, Cody is very similar to a lot of other assistants. It does inline autocompletion. It does code base aware chat. It does specific commands that automate, you know, tasks that you might rather not want to do like generating unit tests or adding detailed documentation. But we think the core differentiator is really the quality of the context, which is hard to kind of describe succinctly. It's a bit like saying, you know, what's the difference between Google and Alta Vista? There's not like a quick checkbox list of features that you can rattle off, but it really just comes down to all the attention and detail that we've paid to making that context work well and be high quality and fast for human devs. We're now kind of plugging into the AI coding assistant as well. Yeah. [00:09:53]Steve: I mean, just to add my own perspective on to what Beyang just described, RAG is kind of like a consultant that the LLM has available, right, that knows about your code. RAG provides basically a bridge to a lookup system for the LLM, right? Whereas fine tuning would be more like on the job training for somebody. If the LLM is a person, you know, and you send them to a new job and you do on the job training, that's what fine tuning is like, right? So tuned to our specific task. 
You're always going to need that expert, even if you get the on the job training, because the expert knows your particular code base, your task, right? That expert has to know your code. And there's a chicken and egg problem because, right, you know, we're like, well, I'm going to ask the LLM about my code, but first I have to explain it, right? It's this chicken and egg problem. That's where RAG comes in. And we have the best consultants, right? The best assistant who knows your code. And so when you sit down with Cody, right, what Beyang said earlier about going to Google and using code search and then starting to feel like without it, his job was super tedious. Once you start using these, do you guys use coding assistants? [00:10:53]Swyx: Yeah, right. [00:10:54]Steve: I mean, like we're getting to the point very quickly, right? Where you feel like almost like you're programming without the internet, right? Or something, you know, it's like you're programming back in the nineties without the coding assistant. Yeah. Hopefully that helps for people who have like no idea about coding assistants, what they are. [00:11:09]Swyx: Yeah. [00:11:10]Alessio: I mean, going back to using them, we had a lot of them on the podcast already. We had Cursor, we had Codeium and Codium, very similar names. [00:11:18]Swyx: Yeah. Phind, and then of course there's Copilot. [00:11:22]Alessio: You had a Copilot versus Cody blog post, and I think it really shows the context improvement. So you had two examples that stuck with me. One was, what does this application do? And the Copilot answer was like, oh, it uses JavaScript and NPM and this. And it's like, but that's not what it does. You know, that's what it's built with. Versus Cody was like, oh, these are like the major functions. And like, these are the functionalities and things like that. And then the other one was, how do I start this up?
And Copilot just said npm start, even though there was like no start command in the package.json, but you know, most do, right? Most projects use npm start. So maybe this does too. How do you think about open source models? Because Copilot has their own private thing. And I think you guys use StarCoder, if I remember right. Yeah, that's correct. [00:12:09]Beyang: I think Copilot uses some variant of Codex. They're kind of cagey about it. I don't think they've like officially announced what model they use. [00:12:16]Swyx: And I think they use a range of models based on what you're doing. Yeah. [00:12:19]Beyang: So everyone uses a range of models. Like no one uses the same model for like inline completion versus like chat because the latency requirements for... Oh, okay. Well, there's fill in the middle. There's also like what the model's trained on. So like we actually had completions powered by Claude Instant for a while. But you had to kind of like prompt hack your way to get it to output just the code and not like, hey, you know, here's the code you asked for, like that sort of text. So like everyone uses a range of models. We've kind of designed Cody to be like especially model, not agnostic, but like pluggable. So one of our kind of design considerations was like as the ecosystem evolves, we want to be able to integrate the best in class models, whether they're proprietary or open source into Cody because the pace of innovation in the space is just so quick. And I think that's been to our advantage. Like today, Cody uses StarCoder for inline completions. And with the benefit of the context that we provide, we actually show comparable completion acceptance rate metrics. It's kind of like the standard metric that folks use to evaluate inline completion quality. It's like if I show you a completion, what's the chance that you actually accept the completion versus you reject it?
And so we're at par with Copilot, which is at the head of that industry right now. And we've been able to do that with the StarCoder model, which is open source and the benefit of the context fetching stuff that we provide. And of course, a lot of like prompt engineering and other stuff along the way. [00:13:40]Alessio: And Steve, you wrote a post called Cheating Is All You Need about what you're building. And one of the points you made is that everybody's fighting on the same axis, which is better UI and the IDE, maybe like a better chat response. But data moats are kind of the most important thing. And you guys have like a 10 year old moat with all the data you've been collecting. How do you kind of think about what other companies are doing wrong, right? Like, why is nobody doing this in terms of like really focusing on RAG? I feel like you see so many people go, oh, we just got a new model. It's like, it's a bit better on HumanEval. And it's like, well, but maybe like that's not what we should really be doing, you know? Like, do you think most people underestimate the importance of like the actual RAG in code? [00:14:21]Steve: I think that people weren't doing it much. It wasn't. It's kind of at the edges of AI. It's not in the center. I know that when ChatGPT launched, so within the last year, I've heard a lot of rumblings from inside of Google, right? Because they're undergoing a huge transformation to try to, you know, of course, get into the new world. And I heard that they told, you know, a bunch of teams to go and train their own models or fine tune their own models, right? [00:14:43]Swyx: Both. [00:14:43]Steve: And, you know, it was a s**t show. Nobody knew how to do it. They launched two coding assistants. One was called Codey, with an E-Y. And then there was, I don't know what happened in that one. And then there's Duet, right? Google loves to compete with themselves, right? They do this all the time. And they had a paper on Duet like from a year ago.
And they were doing exactly what Copilot was doing, which was just pulling in the local context, right? But fundamentally, I thought of this because we were talking about the splitting of the models. [00:15:10]Steve: In the early days, it was the LLM did everything. And then we realized that for certain use cases, like completions, that a different, smaller, faster model would be better. And that fragmentation of models, actually, we expected to continue and proliferate, right? Because we are fundamentally, we're a recommender engine right now. Yeah, we're recommending code to the LLM. We're saying, may I interest you in this code right here so that you can answer my question? [00:15:34]Swyx: Yeah? [00:15:34]Steve: And being a good recommender engine, I mean, who are the best recommenders, right? There's YouTube and Spotify and, you know, Amazon or whatever, right? Yeah. [00:15:41]Swyx: Yeah. [00:15:41]Steve: And they all have many, many, many, many, many models, right? For all fine-tuned for very specific, you know. And that's where we're heading in code, too. Absolutely. [00:15:50]Swyx: Yeah. [00:15:50]Alessio: We just did an episode we released on Wednesday, in which we said RAG is like RecSys for LLMs. You're basically just suggesting good content. [00:15:58]Swyx: It's like what? Recommendations. [00:15:59]Beyang: Recommendations. [00:16:00]Alessio: Oh, got it. [00:16:01]Steve: Yeah, yeah, yeah. [00:16:02]Swyx: So like the naive implementation of RAG is you embed everything, throw it in a vector database, you embed your query, and then you find the nearest neighbors, and that's your RAG. But actually, you need to rank it. And actually, you need to make sure there's sample diversity and that kind of stuff. And then you're like slowly gradient descending yourself towards rediscovering proper RecSys, which has been traditional ML for a long time. But like approaching it from an LLM perspective. Yeah.
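Swyx's description of the naive implementation can be sketched in a few lines. This is a toy: the bag-of-words `embed` function stands in for a real embedding model, and there is no vector database, just a linear scan over the corpus:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real system would call an embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def naive_rag_retrieve(query: str, corpus: dict[str, str], k: int = 2) -> list[str]:
    """Embed every chunk, embed the query, return the k nearest neighbors."""
    q = embed(query)
    ranked = sorted(corpus, key=lambda doc: cosine(q, embed(corpus[doc])), reverse=True)
    return ranked[:k]
```

As the conversation notes, nearest-neighbor lookup alone is only the starting point; ranking and diversity on top of these candidates is where retrieval quality actually comes from.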
[00:16:24]Beyang: I almost think of it as like a generalized search problem because it's a lot of the same things. Like you want your layer one to have high recall and get all the potential things that could be relevant. And then there's typically like a layer two re-ranking mechanism that bumps up the precision and tries to get the relevant stuff to the top of the results list. [00:16:43]Swyx: Have you discovered that ranking matters a lot? Oh, yeah. So the context is that I think a lot of research shows that like one, context utilization matters based on model. Like GPT uses the top of the context window, and then apparently Claude uses the bottom better. And it's lossy in the middle. Yeah. So ranking matters. No, it really does. [00:17:01]Beyang: The skill with which models are able to take advantage of context is always going to be dependent on how that factors into the impact on the training loss. [00:17:10]Swyx: Right? [00:17:10]Beyang: So like if you want long context window models to work well, then you have to have a ton of data where it's like, here's like a billion lines of text. And I'm going to ask a question about like something that's like, you know, embedded deeply into it and like, give me the right answer. And unless you have that training set, then of course, you're going to have variability in terms of like where it attends to. And in most kind of like naturally occurring data, the thing that you're talking about right now, the thing I'm asking you about is going to be something that we talked about recently. [00:17:36]Swyx: Yeah. [00:17:36]Steve: Did you really just say gradient descending yourself? Actually, I love that it's entered the casual lexicon. Yeah, yeah, yeah. [00:17:44]Swyx: My favorite version of that is, you know, how we have to p-hack papers. So, you know, when you throw humans at the problem, that's called graduate student descent. That's great. It's really awesome.
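The two-layer search Beyang describes above, a high-recall first pass followed by a precision-oriented re-rank, might look like this in miniature. Both scoring functions here are toy term-overlap stand-ins for real retrieval and re-ranking models:

```python
def layer_one_recall(query: str, corpus: dict[str, str]) -> list[str]:
    """High-recall first pass: keep any chunk sharing at least one query term."""
    terms = set(query.lower().split())
    return [d for d, text in corpus.items() if terms & set(text.lower().split())]

def layer_two_rerank(query: str, candidates: list[str], corpus: dict[str, str]) -> list[str]:
    """Precision pass: order candidates by how much of the query they cover."""
    terms = set(query.lower().split())

    def score(doc: str) -> float:
        words = set(corpus[doc].lower().split())
        return len(terms & words) / len(terms)

    return sorted(candidates, key=score, reverse=True)

def two_stage_search(query: str, corpus: dict[str, str]) -> list[str]:
    # Layer one casts a wide net; layer two decides what reaches the top.
    return layer_two_rerank(query, layer_one_recall(query, corpus), corpus)
```

In a production system the first layer would be a keyword or vector index and the second a learned re-ranker, but the recall-then-precision split is the same.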
[00:17:54]Alessio: I think the other interesting thing that you have is this inline assist UX that I wouldn't say async, but like it works while you can also do work. So you can ask Cody to make changes on a code block and you can still edit the same file at the same time. [00:18:07]Swyx: Yeah. [00:18:07]Alessio: How do you see that in the future? Like, do you see a lot of Codys running together at the same time? Like, how do you validate also that they're not messing each other up as they make changes in the code? And maybe what are the limitations today? And what do you think about where the tech is going? [00:18:21]Steve: I want to start with a little history and then I'm going to turn it over to Beyang, all right? So we actually had this feature in the very first launch back in June. Dominic wrote it. It was called nonstop Cody. And you could have multiple, basically, LLM requests in parallel modifying your source file. [00:18:37]Steve: And he wrote a bunch of code to handle all of the diffing logic. And you could see the regions of code that the LLM was going to change, right? And he was showing me demos of it. And it just felt like it was just a little before its time, you know? But a bunch of that stuff, that scaffolding was able to be reused for where inline assist is sitting today. [00:18:56]Steve: How would you characterize it today? [00:18:58]Beyang: Yeah, so that interface has really evolved from a, like, hey, general purpose, like, request anything inline in the code and have the code update to really, like, targeted features, like, you know, fix the bug that exists at this line or request a very specific change. [00:19:13]Beyang: And the reason for that is, I think, the challenge that we ran into with inline fixes, and we do want to get to the point where you could just fire and forget and have, you know, half a dozen of these running in parallel.
But I think we ran into the challenge early on that a lot of people are running into now when they're trying to construct agents, which is the reliability of, you know, working code generation is just not quite there yet in today's language models. And so that kind of constrains you to an interaction where the human is always, like, in the inner loop, like, checking the output of each response. And if you want that to work in a way where you can be asynchronous, you kind of have to constrain it to a domain where today's language models can generate reliable code well enough. So, you know, generating unit tests, that's, like, a well-constrained problem. Or fixing a bug that shows up as, like, a compiler error or a test error, that's a well-constrained problem. But the more general, like, hey, write me this class that does X, Y, and Z using the libraries that I have, that is not quite there yet, even with the benefit of really good context. Like, it definitely moves the needle a lot, but we're not quite there yet to the point where you can just fire and forget. And I actually think that this is something that people don't broadly appreciate yet, because I think that, like, everyone's chasing this dream of agentic execution. And if we're to really define that down, I think it implies a couple things. You have, like, a multi-step process where each step is fully automated. We don't have to have a human in the loop every time. And there's also kind of like an LM call at each stage or nearly every stage in that chain. [00:20:45]Beyang: Based on all the work that we've done, you know, with the inline interactions, with kind of like general Cody features for implementing longer chains of thought, we're actually a little bit more bearish than the average, you know, AI hypefluencer out there on the feasibility of agents with purely kind of like transformer-based models.
To your original question, like, the inline interactions with Cody, we actually constrained it to be more targeted, like, you know, fix the current error or make this quick fix. I think that that does differentiate us from a lot of the other tools on the market, because a lot of people are going after this, like, snazzy, like, inline edit interaction, whereas I think where we've moved, and this is based on the user feedback that we've gotten, it's like that sort of thing, it demos well, but when you're actually coding day to day, you don't want to have, like, a long chat conversation inline with the code base. That's a waste of time. You'd rather just have it write the right thing and then move on with your life or not have to think about it. And that's what we're trying to work towards. [00:21:37]Steve: I mean, yeah, we're not going in the agent direction, right? I mean, I'll believe in agents when somebody shows me one that works. Yeah. Instead, we're working on, you know, sort of solidifying our strength, which is bringing the right context in. So new context sources, ways for you to plug in your own context, ways for you to control or influence the context, you know, the mixing that happens before the request goes out, etc. And there's just so much low-hanging fruit left in that space that, you know, agents seem like a little bit of a boondoggle. [00:22:03]Beyang: Just to dive into that a little bit further, like, I think, you know, at a very high level, what do people mean when they say agents? They really mean, like, greater automation, fully automated, like, the dream is, like, here's an issue, go implement that. And I don't have to think about it as a human. And I think we are working towards that. Like, that is the eventual goal. I think it's specifically the approach of, like, hey, can we have a transformer-based LM alone be the kind of, like, backbone or the orchestrator of these agentic flows? Where we're a little bit more bearish today.
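Steve's point about pluggable context sources and the mixing that happens before the request goes out can be sketched as follows. The `ContextSource` signature and the per-source weights are invented for illustration, not Cody's actual extension API:

```python
from typing import Callable

# A context source is just a function: query -> list of (snippet, relevance score).
ContextSource = Callable[[str], list[tuple[str, float]]]

def mix_context(query: str, sources: dict[str, ContextSource],
                weights: dict[str, float], limit: int = 3) -> list[str]:
    """Gather snippets from every registered source, weight each snippet's score
    by its source, and keep the top `limit` overall. This is the "mixing" step
    that runs before anything is sent to the LLM."""
    scored = []
    for name, source in sources.items():
        w = weights.get(name, 1.0)
        for snippet, score in source(query):
            scored.append((w * score, f"[{name}] {snippet}"))
    scored.sort(reverse=True)
    return [snippet for _, snippet in scored[:limit]]
```

Registering a new source is just adding another function to the dictionary, which is roughly the shape a pluggable context protocol would take.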
[00:22:31]Swyx: You want the human in the loop. [00:22:32]Beyang: I mean, you kind of have to. It's just a reality of the behavior of language models that are purely, like, transformer-based. And I think that's just like a reflection of reality. And I don't think people realize that yet. Because if you look at the way that a lot of other AI tools have implemented context fetching, for instance, like, you see this in the Copilot approach, where if you use, like, the @workspace thing that supposedly provides, like, code-based level context, it has, like, an agentic approach where you kind of look at how it's behaving. And it feels like they're making multiple requests to the LM being like, what would you do in this case? Would you search for stuff? What sort of files would you gather? Go and read those files. And it's like a multi-hop step, so it takes a long while. It's also non-deterministic. Because any sort of, like, LM invocation, it's like a dice roll. And then at the end of the day, the context it fetches is not that good. Whereas our approach is just like, OK, let's do some code searches that make sense. And then maybe, like, crawl through the reference graph a little bit. That is fast. That doesn't require any sort of LM invocation at all. And we can pull in much better context, you know, very quickly. So it's faster. [00:23:37]Swyx: It's more reliable. [00:23:37]Beyang: It's deterministic. And it yields better context quality. And so that's what we think. We just don't think you should cargo cult or naively go like, you know, agents are the future, let's just try to, like, implement agents on top of the LM that exists today. I think there are a couple of other technologies or approaches that need to be refined first before we can get into these kind of, like, multi-stage, fully automated workflows. [00:24:00]Swyx: It makes sense. You know, we're very much focused on developer inner loop right now.
But you do see things eventually moving towards developer outer loop. Yeah. So would you basically say that they're tackling the agents problem that you don't want to tackle? [00:24:11]Beyang: No, I would say at a high level, we are after maybe, like, the same high level problem, which is like, hey, I want some code written, I want to develop some software, and can an automated system go build that software for me? I think the approaches might be different. So I think the analogy in my mind is, I think about, like, the AI chess players. Coding, in some senses, I mean, it's similar and dissimilar to chess. I think one question I ask is, like, do you think producing code is more difficult than playing chess or less difficult than playing chess? More. [00:24:41]Swyx: I think more. [00:24:41]Beyang: Right. And if you look at the best AI chess players, like, yes, you can use an LLM to play chess. Like, people have shown demos where it's like, oh, like, yeah, GPT-4 is actually a pretty decent, like, chess move suggester. Right. But you would never build, like, a best in class chess player off of GPT-4 alone. [00:24:57]Swyx: Right. [00:24:57]Beyang: Like, the way that people design chess players is that you have kind of like a search space and then you have a way to explore that search space efficiently. There's a bunch of search algorithms, essentially. They're doing tree search in various ways. And you can have heuristic functions, which might be powered by an LLM. [00:25:12]Swyx: Right. [00:25:12]Beyang: Like, you might use an LLM to generate proposals in that space that you can efficiently explore. But the backbone is still this kind of more formalized tree search based approach rather than the LLM itself.
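The chess-engine analogy Beyang draws, a formal search backbone with an LLM acting as proposal or heuristic function, can be made concrete with a generic best-first search. Here `propose` and `score` are plain Python callables standing in for where LLM calls might slot in; this is a sketch of the idea, not anything Sourcegraph has shipped:

```python
import heapq
import itertools
from typing import Callable

def best_first_search(start, is_goal: Callable, propose: Callable, score: Callable,
                      max_steps: int = 1000):
    """Generic best-first search. `propose` enumerates candidate next states
    (the role an LLM proposal function would play) and `score` estimates how
    promising each state is (the heuristic). The formal search loop stays the
    backbone; the model only advises it."""
    counter = itertools.count()  # tiebreaker so states are never compared directly
    frontier = [(-score(start), next(counter), start)]
    seen = {start}
    for _ in range(max_steps):
        if not frontier:
            return None  # search space exhausted
        _, _, state = heapq.heappop(frontier)
        if is_goal(state):
            return state
        for nxt in propose(state):
            if nxt not in seen:
                seen.add(nxt)
                heapq.heappush(frontier, (-score(nxt), next(counter), nxt))
    return None  # step budget exhausted
```

Swapping in an LLM for `propose` or `score` changes the quality of the suggestions, but the control flow, the part that makes the system reliable, stays deterministic.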
And so I think my high level intuition is that, like, the way that we get to more reliable multi-step workflows that do things beyond, you know, generating unit tests, it's really going to be like a search based approach where you use an LLM as kind of like an advisor or a proposal function, sort of your heuristic function, like the A* search algorithm. But it's probably not going to be the thing that is the backbone, because I guess it's not the right tool for that. Yeah. [00:25:50]Swyx: I can see you kind of thinking through this, but not saying the words, the sort of philosophical Peter Norvig type discussion. Maybe you want to sort of introduce that in software. Yeah, definitely. [00:25:59]Beyang: So your listeners are savvy. They're probably familiar with the classic like Chomsky versus Norvig debate. [00:26:04]Swyx: No, actually, I wanted, I was prompting you to introduce that. Oh, got it. [00:26:08]Beyang: So, I mean, if you look at the history of artificial intelligence, right, you know, it goes way back to, I don't know, it's probably as old as modern computers, like 50s, 60s, 70s. People are debating on like, what is the path to producing a sort of like general human level of intelligence? And kind of two schools of thought emerged. One is the Norvig school of thought, which roughly speaking includes large language models, you know, regression, SVMs, basically any model that you kind of like learn from data. And it's like data driven. Most of machine learning would fall under this umbrella. And that school of thought says like, you know, just learn from the data. That's the approach to reaching intelligence. And then the Chomsky approach is more things like compilers and parsers and formal systems. So basically like, let's think very carefully about how to construct a formal, precise system. And that will be the approach to how we build a truly intelligent system.
I think Lisp was invented so that you could create like rules-based systems that you would call AI. As a language. Yeah. And for a long time, there was like this debate, like there's certain like AI research labs that were more like, you know, in the Chomsky camp and others that were more in the Norvig camp. It's a debate that rages on today. And I feel like the consensus right now is that, you know, Norvig definitely has the upper hand right now with the advent of LMs and diffusion models and all the other recent progress in machine learning. But the Chomsky-based stuff is still really useful in my view. I mean, it's like parsers, compilers, basically a lot of the stuff that provides really good context. It provides kind of like the knowledge graph backbone that you want to explore with your AI dev tool. Like that will come from kind of like Chomsky-based tools like compilers and parsers. It's a lot of what we've invested in in the past decade at Sourcegraph and what you built with Grok. Basically like these formal systems that construct these very precise knowledge graphs that are great context providers and great kind of guard rails enforcers and kind of like safety checkers for the output of a more kind of like data-driven, fuzzier system that uses like the Norvig-based models. [00:28:03]Steve: Beyang was talking about this stuff like it happened in the Middle Ages. Like, okay, so when I was in college, I was in college learning Lisp and Prolog and planning and all the deterministic Chomsky approaches to AI. And I was there when Norvig basically declared it dead. I was there 3,000 years ago when Norvig and Chomsky fought on the volcano. When did he declare it dead? [00:28:26]Swyx: What do you mean he declared it dead? [00:28:27]Steve: It was like late 90s. [00:28:29]Swyx: Yeah. [00:28:29]Steve: When I went to Google, Peter Norvig was already there. He had basically like, I forget exactly where.
It was some, he's got so many famous short posts, you know, amazing. [00:28:38]Swyx: He had a famous talk, The Unreasonable Effectiveness of Data. Yeah. [00:28:41]Steve: Maybe that was it. But at some point, basically, he basically convinced everybody that deterministic approaches had failed and that heuristic-based, you know, data-driven statistical approaches, stochastic were better. [00:28:52]Swyx: Yeah. [00:28:52]Steve: The primary reason I can tell you this, because I was there, was that, was that, well, the steam-powered engine, no. The reason was that the deterministic stuff didn't scale. [00:29:06]Swyx: Yeah. Right. [00:29:06]Steve: They were using Prolog, man, constraint systems and stuff like that. Well, that was a long time ago, right? Today, actually, these Chomsky-style systems do scale. And that's, in fact, exactly what Sourcegraph has built. Yeah. And so we have a very unique, I love the framing that Beyang's made, that the marriage of the Chomsky and the Norvig, you know, sort of models, you know, conceptual models, because we, you know, we have both of them and they're both really important. And in fact, there, there's this really interesting, like, kind of overlap between them, right? Where like the AI or our graph or our search engine could potentially provide the right context for any given query, which is, of course, why ranking is important. But what we've really signed ourselves up for is an extraordinary amount of testing. [00:29:45]Swyx: Yeah. [00:29:45]Steve: Because, Swyx, you were saying that, you know, GPT-4 tends to the front of the context window and maybe other LLMs to the back and maybe, maybe lossy in the middle. [00:29:53]Swyx: Yeah.
[00:29:53]Steve: And so that means that, you know, if we're actually like, you know, verifying whether we, you know, some change we've made has improved things, we're going to have to test putting it at the beginning of the window and at the end of the window, you know, and maybe make the right decision based on the LLM that you've chosen. Which some of our competitors, that's a problem that they don't have, but we meet you, you know, where you are. Yeah. And we're, just to finish, we're writing tens of thousands. We're generating tests, you know, fill in the middle type tests and things. And then using our graph to basically sort of fine tune Cody's behavior there. [00:30:20]Swyx: Yeah. [00:30:21]Beyang: I also want to add, like, I have like an internal pet name for this, like kind of hybrid architecture that I'm trying to make catch on. Maybe I'll just say it here. Just saying it publicly kind of makes it more real. But like, I call the architecture that we've developed the Normsky architecture. [00:30:36]Swyx: Yeah. [00:30:36]Beyang: I mean, it's obviously a portmanteau of Norvig and Chomsky, but the acronym, it stands for non-agentic, rapid, multi-source code intelligence. So non-agentic because... Rolls right off the tongue. And Normsky. But it's non-agentic in the sense that like, we're not trying to like pitch you on kind of like agent hype, right? Like it's the things it does are really just developer tools developers have been using for decades now, like parsers and really good search indexes and things like that. Rapid because we place an emphasis on speed. We don't want to sit there waiting for kind of like multiple LLM requests to return to complete a simple user request. Multi-source because we're thinking broadly about what pieces of information and knowledge are useful context. 
So obviously starting with things that you can search in your code base, and then you add in the reference graph, which kind of like allows you to crawl outward from those initial results. But then even beyond that, you know, sources of information, like there's a lot of knowledge that's embedded in docs, in PRDs or product specs, in your production logging system, in your chat, in your Slack channel, right? Like there's so much context embedded there. And when you're a human developer, and you're trying to like be productive in your code base, you're going to go to all these different systems to collect the context that you need to figure out what code you need to write. And I don't think the AI developer will be any different. It will need to pull context from all these different sources. So we're thinking broadly about how to integrate these into Cody. We hope through kind of like an open protocol that like others can extend and implement. And this is something else that should be accessible by December 14th in kind of like a preview stage. But that's really about like broadening this notion of the code graph beyond your Git repository to all the other sources where technical knowledge and valuable context can live. [00:32:21]Steve: Yeah, it becomes an artifact graph, right? It can link into your logs and your wikis and any data source, right? [00:32:27]Alessio: How do you guys think about the importance of, it's almost like data pre-processing in a way, which is bring it all together, tie it together, make it ready. Any thoughts on how to actually make that good? Some of the innovation you guys have made. [00:32:40]Steve: We talk a lot about the context fetching, right? I mean, there's a lot of ways you could answer this question. But, you know, we've spent a lot of time just in this podcast here talking about context fetching. But stuffing the context into the window is, you know, the bin packing problem, right? 
Because the window is not big enough, and you've got more context than you can fit. You've got a ranker maybe. But what is that context? Is it a function that was returned by an embedding or a graph call or something? Do you need the whole function? Or do you just need, you know, the top part of the function, this expression here, right? You know, so that art, the golf game of trying to, you know, get each piece of context down into its smallest state, possibly even summarized by another model, right, before it even goes to the LLM, becomes this is the game that we're in, yeah? And so, you know, recursive summarization and all the other techniques that you got to use to like stuff stuff into that context window become, you know, critically important. And you have to test them across every configuration of models that you could possibly need. [00:33:32]Beyang: I think data preprocessing is probably the like unsexy, way underappreciated secret to a lot of the cool stuff that people are shipping today. Whether you're doing like RAG or fine tuning or pre-training, like the preprocessing step matters so much because it's basically garbage in, garbage out, right? Like if you're feeding in garbage to the model, then it's going to output garbage. Concretely, you know, for code RAG, if you're not doing some sort of like preprocessing that takes advantage of a parser and is able to like extract the key components of a particular file of code, you know, separate the function signature from the body, from the doc string, what are you even doing? Like that's like table stakes. It opens up so much more possibilities with which you can kind of like tune your system to take advantage of the signals that come from those different parts of the code. Like we've had a tool, you know, since computers were invented that understands the structure of source code to a hundred percent precision. The compiler knows everything there is to know about the code in terms of like structure. 
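Beyang's point about parser-aware preprocessing (separating a function's signature, docstring, and body instead of chunking the file as flat text) can be sketched with Python's own stdlib ast module. This is only a toy illustration, not Sourcegraph's actual pipeline, and the add function is invented for the example:

```python
# Toy parser-aware preprocessing: split a function into signature,
# docstring, and body using Python's stdlib ast module (3.9+).
import ast
import textwrap

source = textwrap.dedent('''
    def add(a: int, b: int) -> int:
        """Return the sum of a and b."""
        return a + b
''')

tree = ast.parse(source)
fn = tree.body[0]  # the FunctionDef node for `add`

# Reconstruct each component separately instead of treating the
# file as one flat string.
signature = f"def {fn.name}({ast.unparse(fn.args)}) -> {ast.unparse(fn.returns)}"
docstring = ast.get_docstring(fn)
body = "\n".join(ast.unparse(stmt) for stmt in fn.body[1:])  # skip the docstring

print(signature)  # def add(a: int, b: int) -> int
print(docstring)  # Return the sum of a and b.
print(body)       # return a + b
```

Once the pieces are separated like this, each one can be ranked, trimmed, or summarized independently before being packed into the context window, which is what makes the split useful.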
Like why would you not want to use that in a system that's trying to generate code, answer questions about code? You shouldn't throw that out the window just because now we have really good, you know, data-driven models that can do other things. [00:34:44]Steve: Yeah. When I called it a data moat, you know, in my cheating post, a lot of people were confused, you know, because data moat sort of sounds like data lake because there's data and water and stuff. I don't know. And so they thought that we were sitting on this giant mountain of data that we had collected, but that's not what our data moat is. It's really a data pre-processing engine that can very quickly and scalably, like basically dissect your entire code base in a very small, fine-grained, you know, semantic unit and then serve it up. Yeah. And so it's really, it's not a data moat. It's a data pre-processing moat, I guess. [00:35:15]Beyang: Yeah. If anything, we're like hypersensitive to customer data privacy requirements. So it's not like we've taken a bunch of private data and like, you know, trained a generally available model. In fact, exactly the opposite. A lot of our customers are choosing Cody over Copilot and other competitors because we have an explicit guarantee that we don't do any of that. And that we've done that from day one. Yeah. I think that's a very real concern in today's day and age, because like if your proprietary IP finds its way into the training set of any model, it's very easy both to like extract that knowledge from the model and also use it to, you know, build systems that kind of work on top of the institutional knowledge that you've built up. [00:35:52]Alessio: About a year ago, I wrote a post on LLMs for developers. And one of the points I had was maybe the depth of like the DSL. I spent most of my career writing Ruby and I love Ruby. It's so nice to use, but you know, it's not as performant, but it's really easy to read, right? 
And then you look at other languages, maybe they're faster, but like they're more verbose, you know? And when you think about efficiency of the context window, that actually matters. [00:36:15]Swyx: Yeah. [00:36:15]Alessio: But I haven't really seen a DSL for models, you know? I haven't seen like code being optimized to like be easier to put in a model context. And it seems like your pre-processing is kind of doing that. Do you see in the future, like the way we think about the DSL and APIs and kind of like service interfaces be more focused on being context friendly, where it's like maybe it's harder to read for the human, but like the human is never going to write it anyway. We were talking on the Hacks podcast. There are like some data science things like spin up the spandex, like humans are never going to write again because the models can just do very easily. Yeah, curious to hear your thoughts. [00:36:51]Steve: Well, so DSLs, they involve, you know, writing a grammar and a parser and they're like little languages, right? We do them that way because, you know, we need them to compile and humans need to be able to read them and so on. The LLMs don't need that level of structure. You can throw any pile of crap at them, you know, more or less unstructured and they'll deal with it. So I think that's why a DSL hasn't emerged for sort of like communicating with the LLM or packaging up the context or anything. Maybe it will at some point, right? We've got, you know, tagging of context and things like that that are sort of peeking into DSL territory, right? But your point on do users, you know, do people have to learn DSLs like regular expressions or, you know, pick your favorite, right? XPath. I think you're absolutely right that the LLMs are really, really good at that. And I think you're going to see a lot less of people having to slave away learning these things. They just have to know the broad capabilities and the LLM will take care of the rest. 
[00:37:42]Swyx: Yeah, I'd agree with that. [00:37:43]Beyang: I think basically like the value proposition of a DSL is that it makes it easier to work with a lower level language, but at the expense of introducing an abstraction layer. And in many cases today, you know, without the benefit of AI code generation, like that's totally worth it, right? With the benefit of AI code generation, I mean, I don't think all DSLs will go away. I think there's still, you know, places where that trade-off is going to be worthwhile. But it's kind of like how much of source code do you think is going to be generated through natural language prompting in the future? Because in a way, like any programming language is just a DSL on top of assembly, right? And so if people can do that, then yeah, like maybe for a large portion of the code [00:38:21]Swyx: that's written, [00:38:21]Beyang: people don't actually have to understand the DSL that is Ruby or Python or basically any other programming language that exists. [00:38:28]Steve: I mean, seriously, do you guys ever write SQL queries now without using a model of some sort? At least a draft. [00:38:34]Swyx: Yeah, right. [00:38:36]Steve: And so we have kind of like, you know, passed that bridge, right? [00:38:39]Alessio: Yeah, I think like to me, the long-term thing is like, is there ever going to be, you don't actually see the code, you know? It's like, hey, the basic thing is like, hey, I need a function to sum two numbers and that's it. I don't need you to generate the code. [00:38:53]Steve: And the following question, do you need the engineer or the paycheck? [00:38:56]Swyx: I mean, right? [00:38:58]Alessio: That's kind of the agent's discussion in a way where like you cannot automate the agents, but like slowly you're getting more of the atomic units of the work kind of like done. I kind of think of it as like, you know, [00:39:09]Beyang: do you need a punch card operator to answer that for you? 
And so like, I think we're still going to have people in the role of a software engineer, but the portion of time they spend on these kinds of like low-level, tedious tasks versus the higher level, more creative tasks is going to shift. [00:39:23]Steve: No, I haven't used punch cards. [00:39:25]Swyx: Yeah, I've been talking about like, so we kind of made this podcast about the sort of rise of the AI engineer. And like the first step is the AI enhanced engineer. That is that software developer that is no longer doing these routine, boilerplate-y type tasks, because they're just enhanced by tools like yours. So you mentioned OpenCodeGraph. I mean, that is a kind of DSL maybe, and because we're releasing this as you go GA, you hope for other people to take advantage of that? [00:39:52]Beyang: Oh yeah, I would say so OpenCodeGraph is not a DSL. It's more of a protocol. It's basically like, hey, if you want to make your system, whether it's, you know, chat or logging or whatever accessible to an AI developer tool like Cody, here's kind of like the schema by which you can provide that context and offer hints. So I would, you know, compare it to LSP, which obviously did this for kind of like standard code intelligence. It's kind of like a lingua franca for providing find-references and go-to-definition. There's kind of like analogs to that. There might be also analogs to kind of like the original OpenAI plugins API. There's all this like context out there that might be useful for an LLM-based system to consume. And so at a high level, what we're trying to do is define a common language for context providers to provide context to other tools in the software development lifecycle. Yeah. Do you have any critiques of LSP, by the way, [00:40:42]Swyx: since like this is very much, very close to home? [00:40:45]Steve: One of the authors wrote a really good critique recently. Yeah. I don't think I saw that. Yeah, yeah. LSP could have been better. 
It just came out a couple of weeks ago. It was a good article. [00:40:54]Beyang: Yeah. I think LSP is great. Like for what it did for the developer ecosystem, it was absolutely fantastic. Like nowadays, like it's much easier now to get code navigation up and running in a bunch of editors by speaking this protocol. I think maybe the interesting question is like looking at the different design decisions comparing LSP basically with Kythe. Because Kythe has more of a... How would you describe it? [00:41:18]Steve: A storage format. [00:41:20]Beyang: I think the critique of LSP from a Kythe point of view would be like with LSP, you don't actually have an actual symbolic model of the code. It's not like LSP models like, hey, this function calls this other function. LSP is all like range-based. Like, hey, your cursor's at line 32, column 1. [00:41:35]Swyx: Yeah. [00:41:35]Beyang: And that's the thing you feed into the language server. And then it's like, okay, here's the range that you should jump to if you click on that range. So it kind of is intentionally ignorant of the fact that there's a thing called a reference underneath your cursor, and that's linked to a symbol definition. [00:41:49]Steve: Well, actually, that's the worst example you could have used. You're right. But that's the one thing that it actually did bake in is following references. [00:41:56]Swyx: Sure. [00:41:56]Steve: But it's sort of hardwired. [00:41:58]Swyx: Yeah. [00:41:58]Steve: Whereas Kythe attempts to model [00:42:00]Beyang: like all these things explicitly. [00:42:02]Swyx: And so... [00:42:02]Steve: Well, so LSP is a protocol, right? And so Google's internal protocol is gRPC-based. And it's a different approach than LSP. It's basically you make a heavy query to the back end, and you get a lot of data back, and then you render the whole page, you know? So we've looked at LSP, and we think that it's a little long in the tooth, right? 
I mean, it's a great protocol, lots and lots of support for it. But we need to push into the domain of exposing the intelligence through the protocol. Yeah. [00:42:29]Beyang: And so I would say we've developed a protocol of our own called SCIP, which is at a very high level trying to take some of the good ideas from LSP and from Kythe and merge that into a system that in the near term is useful for Sourcegraph, but I think in the long term, we hope will be useful for the ecosystem. Okay, so here's what LSP did well. LSP, by virtue of being like intentionally dumb, dumb in air quotes, because I'm not like ragging on it, allowed language server developers to kind of like bypass the hard problem of like modeling language semantics precisely. So like if all you want to do is jump to definition, you don't have to come up with like a universally unique naming scheme for each symbol, which is actually quite challenging because you have to think about like, okay, what's the top scope of this name? Is it the source code repository? Is it the package? Does it depend on like what package server you're fetching this from? Like whether it's the public one or the one inside your... Anyways, like naming is hard, right? And by just going from kind of like a location-to-location-based approach, you basically just like throw that out the window. All I care about is jump to definition, just make that work. And you can make that work without having to deal with like all the complex global naming things. The limitation of that approach is that it's harder to build on top of that to build like a true knowledge graph. Like if you actually want a system that says like, okay, here's the web of functions and here's how they reference each other. And I want to incorporate that like semantic model of how the code operates or how the code relates to each other at like a static level. You can't do that with LSP because you have to deal with line ranges. 
And like concretely the pain point that we found in using LSP for Sourcegraph is like in order to do like a find references [00:44:04]Swyx: and then jump definitions, [00:44:04]Beyang: it's like a multi-hop process because like you have to jump to the range and then you have to find the symbol at that range. And it just adds a lot of latency and complexity of these operations where as a human, you're like, well, this thing clearly references this other thing. Why can't you just jump me to that? And I think that's the thing that Kythe does well. But then I think the issue that Kythe has had with adoption is because it has a more sophisticated schema, I think. And so there's basically more things that you have to implement to get like a Kythe implementation up and running. I hope I'm not like, correct me if I'm wrong about any of this. [00:44:35]Steve: 100%, 100%. Kythe also has a problem, all these systems have the problem, even SCIP, or at least the way that we implemented the indexers, that they have to integrate with your build system in order to build that knowledge graph, right? Because you have to basically compile the code in a special mode to generate artifacts instead of binaries. And I would say, by the way, earlier I was saying that XREFs were in LSP, but it's actually, I was thinking of LSP plus LSIF. [00:44:58]Swyx: Yeah. That's another. [00:45:01]Steve: Which is actually bad. We can say that it's bad, right? [00:45:04]Steve: Like SCIP or Kythe, it's supposed to be sort of a model serialization, you know, for the code graph, but it basically just does what LSP needs, the bare minimum. LSIF is basically if you took LSP [00:45:16]Beyang: and turned that into a serialization format. So like you build an index for language servers to kind of like quickly bootstrap from cold start. But it's a graph model [00:45:23]Steve: with all of the inconvenience of the API without an actual graph. And so, yeah. 
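The contrast being drawn here, position-based lookups versus an explicit symbol graph, might be modeled with toy data structures like these. All names and shapes are invented for illustration; neither matches the real LSP or Kythe wire formats:

```python
# Toy models of the two lookup styles. The point is the shape of the
# data, not protocol fidelity.

# LSP-style: everything is addressed by (file, line, column) positions.
# The server never names the symbol under the cursor.
lsp_definitions = {
    ("main.py", 32, 1): ("utils.py", 10, 0),  # cursor position -> definition location
}

def lsp_goto_definition(file, line, col):
    """Position in, position out: a location-to-location hop."""
    return lsp_definitions.get((file, line, col))

# Kythe/SCIP-style: symbols get stable global names, and the edges
# between them form an explicit graph you can traverse directly.
symbol_graph = {
    "pkg/utils/parse_config().": {
        "defined_at": ("utils.py", 10),
        "referenced_by": ["pkg/main/run()."],
    },
}

def find_references(symbol):
    """Symbol in, symbols out: no multi-hop position round-trips."""
    return symbol_graph[symbol]["referenced_by"]

print(lsp_goto_definition("main.py", 32, 1))         # ('utils.py', 10, 0)
print(find_references("pkg/utils/parse_config()."))  # ['pkg/main/run().']
```

The multi-hop pain described above shows up in the first model: to go from a reference to its other references, you have to resolve a position, find the symbol there, and query again, whereas the second model answers in one lookup.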
[00:45:29]Beyang: So like one of the things that we try to do with SCIP is to try to capture the best of both worlds. So like make it easy to write an indexer, make the schema simple, but also model some of the more symbolic characteristics of the code that would allow us to essentially construct this knowledge graph that we can then make useful for both the human developer through Sourcegraph and the AI developer through Cody. [00:45:49]Steve: So anyway, just to finish off the graph comment, we've got a new graph, yeah, that's SCIP-based. We call it BFG internally, right? It's a beautiful something graph. A big friendly graph. [00:46:00]Swyx: A big friendly graph. [00:46:01]Beyang: It's a blazing fast. [00:46:02]Steve: Blazing fast. [00:46:03]Swyx: Blazing fast graph. [00:46:04]Steve: And it is blazing fast, actually. It's really, really interesting. I should probably have to do a blog post about it to walk you through exactly how they're doing it. Oh, please. But it's a very AI-like iterative, you know, experimentation sort of approach. We're building a code graph based on all of our 10 years of knowledge about building code graphs, yeah? But we're building it quickly with zero configuration, and it doesn't have to integrate with your build. And through some magic tricks that we have. And so what just happens when you install the plugin is that it'll be there, indexing your code and providing that knowledge graph in the background without all that build system integration. This is a bit of secret sauce that we haven't really advertised very much lately. But I am super excited about it because what they do is they say, all right, you know, let's tackle function parameters today. Cody's not doing a very good job of completing function call arguments or function parameters in the definition, right? Yeah, we generate those thousands of tests, and then we can actually reuse those tests for the AI context as well. 
So fortunately, things are kind of converging on, we have, you know, half a dozen really, really good context sources, and we mix them all together. So anyway, BFG, you're going to hear more about it probably in the holidays? [00:47:12]Beyang: I think it'll be online for December 14th. We'll probably mention it. BFG is probably not the public name we're going to go with. I think we might call it like Graph Context or something like that. [00:47:20]Steve: We're officially calling it BFG. [00:47:22]Swyx: You heard it here first. [00:47:24]Beyang: BFG is just kind of like the working name. And so the impetus for BFG was like, if you look at like current AI inline code completion tools and the errors that they make, a lot of the errors that they make, even in kind of like the easy, like single line case, are essentially like type errors, right? Like you're trying to complete a function call and it suggests a variable that you defined earlier, but that variable is the wrong type. [00:47:47]Swyx: And that's the sort of thing [00:47:47]Beyang: where it's like a first year, like freshman CS student would not make that error, right? So like, why does the AI make that error? And the reason is, I mean, the AI is just suggesting things that are plausible without the context of the types or any other like broader files in the code. And so the kind of intuition here is like, why don't we just do the basic thing that like any baseline intelligent human developer would do, which is like click jump to definition, click some find references and pull in that like Graph Context into the context window and then have it generate the completion. So like that's sort of like the MVP of what BFG was. And turns out that works really well. Like you can eliminate a lot of type errors that AI coding tools make just by pulling in that context. Yeah, but the graph is definitely [00:48:32]Steve: our Chomsky side. [00:48:33]Swyx: Yeah, exactly. 
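That MVP intuition (collect the definitions a developer would jump to, then prepend them to the completion prompt) can be sketched as follows. All identifiers here are hypothetical, and this is only a sketch of the idea, not Cody's implementation:

```python
# Toy sketch: before asking the model to complete a line, pull in the
# definitions of symbols in scope, as a developer would by clicking
# jump-to-definition. Everything here is invented for illustration.

definitions = {
    "UserId": "class UserId(int): ...",
    "fetch_user": "def fetch_user(uid: UserId) -> 'User': ...",
}

def graph_context(symbols_near_cursor):
    """Collect definition snippets for the symbols in scope."""
    return [definitions[s] for s in symbols_near_cursor if s in definitions]

def build_prompt(file_prefix, symbols_near_cursor):
    """Prepend graph context so the model sees real types, not guesses."""
    context = "\n".join(graph_context(symbols_near_cursor))
    return f"# Context from the code graph:\n{context}\n\n{file_prefix}"

prompt = build_prompt("user = fetch_user(", ["fetch_user", "UserId"])
print(prompt)
```

With the signatures in the prompt, a completion model has the argument types in front of it instead of having to guess a plausible variable, which is exactly the class of type error described above.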
[00:48:34]Beyang: So like this like Chomsky-Norvig thing, I think pops up in a bunch of differ

The Patrick Madrid Show
The Patrick Madrid Show: August 07, 2023 - Hour 1

The Patrick Madrid Show

Play Episode Listen Later Aug 7, 2023 50:44


Patrick shares the remarkable story about the many Civil War soldiers' lives that were saved by a phenomenon they called 'Angel's Glow.' Ray - Do the sperm and egg have separate souls that combine to make the human soul? What does the 'you fool' mean in Matthew 5:22? Andrea - In Father Brown, a TV show about a priest detective, he will take a cloth and press it to the dead victims. Why is that? Gill – I'm a Calvinist and I believe in predestination Timothy - Are different types of miracles considered dogmas that Catholics must believe in order to be Catholic? i.e. Multiplication of the loaves in the Bible Email - What do you think of the Cursillo movement? Email - Did Mary experience pain while giving birth to Jesus? Steve - Can you explain Deuteronomy 4:16? We are not supposed to have idols, so why can Catholics have statues?

Land Academy Show
How to Make a Million Dollars a Year is the Career Path Number 5 Number One Request (LA 1875)

Land Academy Show

Play Episode Listen Later Oct 19, 2022 16:29


How to Make a Million Dollars a Year is the Career Path Number 5 Number One Request (LA 1875) Transcript: Steve: Steve and Jill here. Jill: Hi. Steve: Welcome to the Land Academy Show, entertaining land investment talk. I'm Steven Jack Butala. Jill: And I'm Jill DeWit, broadcasting from the natural springs area of Arkansas. Yes, Eureka Springs right now and we are loving it. Steve: Can't say enough positive stuff about this area, with the state and the country. Jill: You know what's funny, we had a day this last week where it was like, started off, it got sunny but it started off kind of dark and weird and it rained a little bit and it was, you know the show the Ozarks, how we all know it has that funky tint. They make the, there's like a filter on the cameras or something and make it kind of a dark, interesting blue? Would you call it blue background kind of thing? I swear it felt like that. Steve: It was that color. Jill: I was running around taking pictures, I'm like, "That's real." Now I get the Ozarks show and I always thought, why do they make it look so dark? But by the way, right now we're sitting in a beautiful, sunny day. Blue sky, it's gorgeous so it's not always like that, but I get it. It was really cool. Steve: Today Jill and I are going to talk about how making a million dollars a year is the number one discussion point and request for this new career path class. Jill: There's kind of two things that came up because I want to share, to tell you, there's one other thing that everybody asks for. So it's important to know. I want to leave a little teaser here. So when we went around the room the other day, everybody brought up two things and one is the money and I'll tell you the other one in a minute. Steve: Before we get into it, let's take a question posted by one of our members on the landinvestors.com online community. It's free. 
And last year a ton of people came to us, came to Jill and I, requesting help getting their first mailer out or just getting consistency in doing mailers or mailer type stuff. Well, enough people came to us. Jill and I decided to turn over our own mailer department to our employees to allow them to do that. So we call it concierge data and now it's called concierge data plus, you can completely and entirely outsource doing a mailer to this department and it's a subdepartment of offers2owners.com. Or if you're having trouble getting the first one out, check it out. It's very, very efficient. And now we process a ton of orders, more and more every single month. Jill: All right, Chief wrote, "As a seller, how do you ask a buyer to close through your selected title company? Or is it just customary for the seller to choose? It's a little late for me to be inquiring about this, but I've been letting, or should I say making the buyer's agent find one. Now that I say it out loud pretty sure I'm doing it completely wrong. I need to be sending them somewhere specific, right? My thought was it doesn't really matter to me and it's all about the same price so let them have it in case they have people they like to close through. It was one less thing I had to set up. Now I see why all the buyer's agents' attitudes seemed to change a little bit after their clients have signed. I'm making them do my job accidentally but sure enough, they'd sign and I'd been letting them take it from there. I'd been asking them where they want to close and wait for an email. How messy. This must be what Jack means by doing it all wrong and still being able to pull it off. Embarrassing but enlightening." All right, so let me back up here. Steve: Embarrassing but enlightening. Jill: So here's what I do, Chief, and this, I'm going to make it easy. 
This will make it easy for you actually, because I bet you putting it on them could slow you down too. I want you to stay in control for the whole transaction. Steve: Yeah,

Land Academy Show
How to Make a Million Dollars a Year is the Career Path Number 5 Number One Request (LA 1875)

Land Academy Show

Play Episode Listen Later Oct 19, 2022 16:29



Land Academy Show
How to Make a Million Dollars a Year is the Career Path Number 5 Number One Request (LA 1875)

Land Academy Show

Play Episode Listen Later Oct 19, 2022 16:29


How to Make a Million Dollars a Year is the Career Path Number 5 Number One Request (LA 1875) Transcript: Steve: Steve and Jill here. Jill: Hi. Steve: Welcome to the Land Academy Show, entertaining land investment talk. I'm Steven Jack Butala. Jill: And I'm Jill DeWit, broadcasting from the natural springs area of Arkansas. Yes, Eureka Springs right now and we are loving it. Steve: Can't say enough positive stuff about this area, with the state and the country. Jill: You know what's funny, we had a day this last week where it was like, started off, it got sunny but it started off kind of dark and weird and it rained a little bit and it was, you know the show the Ozarks, how we all know it has that funky tint. They make the, there's like a filter on the cameras or something and make it kind of a dark, interesting blue? Would you call it blue background kind of thing? I swear it felt like that. Steve: It was that color. Jill: I was running around taking pictures, I'm like, "That's real." Now I get the Ozarks show and I always thought, why do they make it look so dark? But by the way, right now we're sitting in a beautiful, sunny day. Blue sky, it's gorgeous so it's not always like that, but I get it. It was really cool. Steve: Today Jill and I are going to talk about how making a million dollars a year is the number one discussion point and request for this new career path class. Jill: There's kind of two things that came up because I want to share, to tell you, there's one other thing that everybody asks for. So it's important to know. I want to leave a little teaser here. So when we went around the room the other day, everybody brought up two things and one is the money and I'll tell you the other one in a minute. Steve: Before we get into it, let's take a question posted by one of our members on the landinvestors.com online community. It's free. 
And last year a ton of people came to Jill and me requesting help getting their first mailer out, or just getting consistency in doing mailers or mailer type stuff. Well, enough people came to us that Jill and I decided to turn over our own mailer department to our employees to allow them to do that. So we called it concierge data, and now it's called concierge data plus: you can completely and entirely outsource doing a mailer to this department, and it's a subdepartment of offers2owners.com. Or if you're having trouble getting the first one out, check it out. It's very, very efficient. And now we process a ton of orders, more and more every single month. Jill: All right, Chief wrote, "As a seller, how do you ask a buyer to close through your selected title company? Or is it just customary for the seller to choose? It's a little late for me to be inquiring about this, but I've been letting, or should I say making the buyer's agent find one. Now that I say it out loud, I'm pretty sure I'm doing it completely wrong. I need to be sending them somewhere specific, right? My thought was it doesn't really matter to me and it's all about the same price, so let them have it in case they have people they like to close through. It was one less thing I had to set up. Now I see why all the buyers' agents' attitudes seemed to change a little bit after their clients have signed. I'm making them do my job accidentally, but sure enough, they'd sign and I'd been letting them take it from there. I'd been asking them where they want to close and waiting for an email. How messy. This must be what Jack means by doing it all wrong and still being able to pull it off. Embarrassing but enlightening." All right, so let me back up here. Steve: Embarrassing but enlightening. Jill: So here's what I do, Chief, and this, I'm going to make it easy. 
This will make it easy for you actually, because I bet putting it on them could slow you down too. I want you to stay in control for the whole transaction. Steve: Yeah,

ScaleUpRadio's podcast
We Just Want To Help Businesses Get The Right Finance

ScaleUpRadio's podcast

Play Episode Listen Later May 10, 2021 52:04


It’s the thing that makes the world go round, not just in business - money.
That’s the focus of this week’s featured business on ScaleUp Radio.
Steve Harris formed Central Business Finance in 2010 with two other financial experts with the aim of helping businesses across the West Midlands get access to finance.
Cut to 11 years later and the aim of the business hasn’t changed - just the scope of the business, and the clients that they deal with.
Steve Harris from Central Business Finance
This episode with Steve is a fascinating one, and we cover quite a bit of ground, including:
- How trust needs to be a big part of your business, and the way that you deal with clients and other businesses
- Accepting that you will definitely make mistakes in the early days of your business; it’s the learning from them that is the important lesson
- The process of letting go of the reins, and allowing staff - the right staff - to take more control of the company
- Just what Business Rhythms are, and how they can benefit your business
Obviously finance and financial matters take up quite a bit of our time, but I think that there are some general business lessons to take away from this episode.
Steve can be found here:
linkedin.com/in/steve-harris-36b6731a
steveh@central-finance.com
https://www.central-finance.com/
ScaleUp Radio cannot be held responsible for the content of third party websites.
Scaling up your business isn't easy, and can be a little daunting. Let ScaleUp Radio make it a little easier for you. With guests who have been where you are now, and can offer their thoughts and advice on several aspects of business. ScaleUp Radio is the business podcast you've been waiting for.
You can get in touch with Kevin here: kevin@biz-smart.co.uk

Podcast For Hire
Franciscan Spirituality Center - Sister Karen Lueck

Podcast For Hire

Play Episode Listen Later Dec 30, 2020 28:27


Franciscan Spirituality Center, 920 Market Street, La Crosse, WI 54601, 608-791-5295. Steve Spilde: Welcome. I’m excited today [because] my guest is Sister Karen Lueck. She has experience as an educator, as a pastoral counselor, [and] as an author. She has been involved in leadership of the FSPA community, most recently serving as president. It’s my pleasure to introduce today Sister Karen Lueck. Sister Karen Lueck: Thank you. I’m glad to be here, too. Steve: As I often begin with guests, Sister Karen, could you tell me about your family’s religious tradition? Let’s get that grounded so we kind of know where you’re coming from. Sister Karen: My family is Catholic, [and is] very grounded in being Catholic. We were from a small town, a German town, in Iowa. Everybody in the town was Catholic, except a stray Lutheran here or there. The town and the church were pretty much one, so everybody was both. We were very dedicated Catholics, I would say – just being very, very much Catholic. Steve: When you were young, how would you have described the word “God?” Sister Karen: I think it was always, for me, it was hard to do because I think I knew that God was somebody who really cared about me at some level. But what I heard from homilies and from the church a lot of the time was that I was a sinner [and] I was bad. Therefore, the only way I would be able to get close to God is by confessing my sins and being this perfect person. I think that messed with my head and my heart for a long, long, long, long time, and it still comes up at times, I think. On one hand I was very proud of being Catholic. I was very happy [and] was inspired by the rituals and that kind of thing. But it didn’t feed my soul in other ways. Steve: People who know me know that I’m a big fan of Brene Brown, and I find her content on shame to be very helpful. But I know that you were studying shame at a deep level long before Brene Brown came along. Sister Karen: Yes. I could have been the first Brene Brown. 
My professor said I should publish my work, but I never did. Steve: Society wasn’t quite ready for it at the time. Sister Karen: That’s it. That’s right. Steve: Can you connect some of that work you did later in your academic study to some of those experiences you had as a kid, because it sounds like that was ambivalent. There were good things about that background, but there were also things that weren’t helpful. Sister Karen: As I went through life, starting in my 20s or probably before that, [I was] always feeling a sense that I wasn’t good enough. As I look back now, I do say that a lot of it came from the religious beliefs or what we were hearing in church. I spent a long time – and at times I still need to revisit that – working on that shame because I kept saying to myself, ‘No, it can’t be that. I want to be feeling like I love myself, and that other people love me, too.’ I spent a lot of time doing therapy, reading books about shame. Then, like you said, eventually going to graduate school and having that be my main focus to look at especially women and their psychology and spirituality or theology. [I] therefore got into the feminist movement more, again feeling like going into that it would sever me from the church because I knew that that wasn’t something the church was really advocating – then or now, probably. I think I had to do that for myself to feel like I was OK. And as I grew through that, my relationship with God also changed because all of a sudden, now I was a good person, so God must be loving me all the time because that goodness is inherent in me, and it’s inherent in others, too. I started seeing myself as a good person and God loving me. It’s all combined, and to this day I am still learning how that is – what does that mean when I believe that I’m good, [and] that other people are basically good and how that contributes. I’ve come to believe or know that God is with me all the time. 
God is the one who is loving me all the time – even sometimes when I’m not able to do that. That’s a real comforting thing, and it’s something I think that other people need to know. That’s why I’m glad Brene Brown is doing a lot of her work. It’s bringing it to the ordinary people about how we are good inside. We have to fight against anything that tells us that we’re not. Steve: When you were young, that message … Certainly I know myself, [and] I know a lot of people heard the negative elements of that message. But part of what made it so confusing is that there were also positives. We were hearing these mixed messages that you’re loved [and] you’re good, but also [that] you’re bad [and] you’re evil. Which is it? Where did you experience the positive side to that? When or where did you feel closest to God when you were younger, or feel the love of God? Sister Karen: I think even though my family was not demonstrative – we’re staunch Germans, and so [we’re] not as demonstrative with love – I did feel centered and loved in that way. I don’t know if I could have expressed it at that time, but I think where I felt close to God, not knowing that that was really spirituality at the time … I would go outside sometimes and go out in our pasture, which had a creek running through it. I would just sit there and just be aware of nature. And sometimes just laying on the grass in the sun, and with my face to the sun, feeling peaceful and whole. I think later on when I was more of a teenager, I remember going into church sometimes when there was no service or anything going on, but just sitting there in the quiet and feeling something [and] feeling like, I don’t know what it was, but I think it was in awe. I don’t even know if I could have defined that as God at the time, although I think I did believe that. That’s where I found God, and where I still find God today mainly is in awe of nature – sitting in nature, walking. 
Now I find it in journaling, now that I know that what comes out of my mouth is really hopefully what God has already put there. Every morning I ask God to let me speak God’s words. That has brought me closer, too. Steve: Where did you become acquainted with the Franciscan perspective? Was it just simply a matter of meeting Sisters over time [and] learning deeper what their perspective was? Or were you really drawn to a Franciscan perspective? Sister Karen: At first, I think I was really close to the Franciscan Sisters, our community Franciscan Sisters of Perpetual Adoration. I have two aunts who were in the community, and so we would go up to La Crosse to visit them, so I was very familiar with the community. I was taught by them for 11 years; my last year we transferred high schools. I knew them, and I felt comfortable with them. When it was time to join the convent, then that’s where I was going to go. But ironically, at that time – I joined in the late 60s – Franciscanism was only starting to come to … There was more knowledge of that. Before that, I don’t think a lot of the Sisters even focused on Franciscanism a lot. But where it really struck me was when I was invited to go to Assisi with the leadership pilgrimage when I was in leadership the first time. Going there and going where Saint Francis walked and finding out what he said and how he loved God and God loved Francis, I remember sitting on a mountain there and saying, ‘I think I am Franciscan.’ Before that, I had probably been in the community for 25, 30 years already, knowing in my head that I was Franciscan, but here, all of a sudden, now knowing in my heart that I was Franciscan. It just struck me that this is what it means to be Franciscan. 
I think it also went along with all the work I had done on shame and goodness to recognize my goodness, and to realize that that is what Franciscanism is: to recognize that all creation is good. In the Middle Ages, when there were debates going on, Franciscans lost out in a lot of ways because the dominant message became that we’re sinners – that’s the basic thing. Francis always said everybody is good, and he saw that in everything – in nature, in people. I think that’s where I first really realized that I was Franciscan, that it wasn’t that I got it from being in the community. It was that I was that originally. Steve: To paraphrase, what I’m hearing is that somewhere inside you, you had this desire to find this source of goodness. It was kind of a pleasant surprise [to discover that], “That’s in the tradition I’ve already committed to.” That’s at the core of Francis’ message. Sister Karen: Yes. Somehow I stumbled into it, right?

10 Minutes to Save Your Marriage
Ep 128 - Follow Up! (Girlfriend Wants To Meet Up With Ex)

10 Minutes to Save Your Marriage

Play Episode Listen Later Dec 7, 2020 10:36


We got mail! Feedback from a previous episode filters into the TMTSYM Mailbag--and it's good news! Listen in and marvel at how James and Dr. Steve CAN give good advice, and how the letter-writer's relationship has been made better.

Land Academy Show
So You Made 100K on a Land Deal Now What (LA 1302)

Land Academy Show

Play Episode Listen Later Aug 6, 2020 20:44


So You Made 100K on a Land Deal Now What (LA 1302) Transcript: Steve: Steve and Jill here. Jill: Hey. Steve: Welcome to the Land Academy Show, entertaining land investment talk. I'm Steven Jack Butala. Jill: And I'm Jill DeWit, broadcasting from sunny Southern California. Steve: Today Jill and I talk about, so you made a hundred grand on that last land deal, now what? Jill: I know what. Steve: Well. Jill: Do it again. Steve: Celebrate. Yes, first you need to take 10 minutes and celebrate. Maybe do a shot of tequila or something. Jill: Yeah, ten's good. Ten minutes is good. Steve: Whatever works for you. Eat a piece of chocolate cake, I don't know, whatever works for you. Jill: What is yours? Steve: And that's the whole show. Jill: Quick. You want to celebrate? What? Can I have a budget? I made a hundred thousand dollars. How much money can I spend from my separation? Steve: You know, Jill and I made a huge amount of money one time on a real estate deal. You know what we did? Bought new computers. Jill: Yeah. It went to the business and it made us more effective and it, and I was just so happy. Yep. All right. Quick, you give yourself $500. What are you going to do? Steve: God, I haven't thought about something like this in a long time. Because usually I just go do whatever I want. Jill: I know but- Steve: For 500 bucks, what would I do? You know what I would do? Call up my buddies, probably bring you and your friends, girlfriends, and just pay for everybody's night out. Jill: That's very sweet. Well, now I feel a little bit like a heel because mine's different. Mine is I call no one. Steve: Oh my God. Is this a spa day at that MZ diamond acquisitions? Jill: No because I have $500, it's just a spa day. That's it. I call no one, I turn off my phone, I leave it in the car and I'm gone for several hours. That's how I celebrate. Steve: Jill, I speak frankly, here. You should be doing that once a week anyway. Jill: I know. I should, but spas are kind of closed right now. 
Steve: Why don't you schedule that? Jill: Because the spas are closed right now. That's, trust me, don't think I'm not, that's not on my list. Steve: Can't you have like a masseuse come to the house? Jill: I haven't really tried that hard but I could probably work on that. So, but thank you, that's not what this show's about. Thank you. Steve: Yes it is. This is about a hundred grand. It's totally about this. Jill: Okay, I guess so. Okay, yes, because I just learned, I didn't know of any that would come to the house. And I just heard from somebody recently that they know someone. So that's in the works. But do you know what? I still don't want to do it in my own house. I have to go somewhere because I don't want to have to hide. And you know, I want to just, I'd like to go and be treated. Steve: You want to go somewhere and do that? Huh? What if I leave the house? And then- Jill: It's still not that great. I want to go be treated. Steve: This is interesting. Jill: You know what I want to do? Steve: You learn new stuff about your mate every day. Jill: I want to go to Terranea, or something equivalent, and just really have a nice, nice time. Thank you. Steve: Before we get into it, let's take a question posted by one of our members on the landinvestors.com online community. It's free. Jill: Okay. Mohad wrote, I've been practicing using Real Quest Pro to pull data in an area I'm looking to send my first mailer. Once I enter all the criteria and submit, it seems like a lot of the data I pull has some sort of housing on it. I'm entering in zero to 0% improvement and I'm still getting many buildings slash houses. I don't want to waste money on records with houses. I've gone through each land use to figure out which is pulling the records with the houses, but it looks like they are just blended in with several uses. Any suggestions on how to get rid of the houses? Or am I doing something wrong? Steve:
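Steve's answer is cut off in the transcript above, but the kind of post-pull filtering Mohad is asking about can be done on the exported list itself. This is a rough, hypothetical sketch only: Real Quest Pro's actual export fields aren't shown in the episode, so the column names `improvement_percent` and `land_use` and the sample land-use labels here are invented for illustration.

```python
import csv
import io

# Hypothetical CSV export; the column names are assumptions, not
# Real Quest Pro's real field names.
sample = """apn,improvement_percent,land_use
100-01,0,Vacant Land
100-02,35,Single Family Residence
100-03,0,Residential Lot
100-04,5,Mobile Home
"""

# Land-use labels that imply a structure on the parcel (illustrative).
HOUSE_USES = {"Single Family Residence", "Mobile Home"}

rows = list(csv.DictReader(io.StringIO(sample)))

# Keep only parcels with zero improvement AND a non-house land use.
vacant = [
    r for r in rows
    if float(r["improvement_percent"]) == 0 and r["land_use"] not in HOUSE_USES
]

for r in vacant:
    print(r["apn"])
```

The same two-step idea works in a spreadsheet: filter on the improvement column first, then exclude any land-use codes that imply a building, rather than trying to get one query criterion to do both jobs.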

Podcast For Hire
Franciscan Spirituality Center - Sister Jolynn Brehm

Podcast For Hire

Play Episode Listen Later May 27, 2020 34:17


Franciscan Spirituality Center, 920 Market Street, La Crosse, WI 54601. Steve Spilde: Welcome. My guest today is Sister Jolynn Brehm. She is a Sister with the Franciscan Sisters of Perpetual Adoration. I’ve been blessed to know Jolynn for many years. She was a longtime supervisor in our Spiritual Direction Preparation Program. Welcome, Jolynn. Jolynn Brehm: Thank you. Thank you. Steve: Today we are joining by Zoom, and this is kind of a new experience for both of us. But it’s a testament that even old dogs can learn new tricks. Jolynn: Exactly. We’ll either prove it or disprove it. Steve: Sounds good. Sounds good. So tell me, Jolynn, how long have you been a Sister? Jolynn: Well, I entered the community in 1954. So actually, that’s about 66 years ago. This year I would be celebrating my 61st year of profession. It’s been an interesting and wonderful long life. Steve: Amazing. Tell me where you grew up, and tell me how the desire to become a sister evolved. Jolynn: I was born and raised on a dairy farm in Colby, Wisconsin, in 1939, and I was the first liveborn member of my family. The sister who was to be a year older than me, my mom and dad’s first child, was stillborn. So when I came along, I think they kind of favored me. Obviously they were happy I was alive, so that was my experience as the firstborn of my family. My family heritage, which is very interesting that I’ve kind of thought about with these questions and this opportunity … But I realize that on both sides of my family – paternal and maternal – we are four generations of farmers. So that is very significant, and it has allowed me now to really delve into why it seems my genetics are deeply, deeply grounded in the land. Everything about nature, everything about creation, everything about how I relate to life, to people, to living is all pretty much in the framework of anything that has to do with nature and creation. 
It would be understandable because having great-grandparents and great-great-grandparents who worked the land, they understood how to live with the land. Steve: Were all of your relatives kind of around the farm there in Colby? Jolynn: Yes, they were. I would say probably within maybe a one-hour radius we had at least five or six different homesteaded farms on both my father’s side, and my mother’s side of the family was near Auburndale, Wisconsin. That’s the central Wisconsin area – all the same kind of area. I was led to think about religious life when I entered Saint Mary’s Catholic School in Colby, Wisconsin. The principal at that time was Sister Alice McMullen. For some reason or other, we already made a connection when I was in the first grade. She would let me go over to be with my friends or whatever. Then she left Saint Mary’s in Colby when I graduated from the eighth grade. But the year of eighth grade, after afternoon recess, since I was tall and in the last seat, she would stand behind my desk and pray a prayer to the Sacred Heart. Then we also had a family in Colby who had three sisters as FSPAs. The dad invited some of us to go and visit La Crosse, Wisconsin, Saint Rose Convent, which I did. From there on, I just sensed from Alice McMullen that I admired who she was, who the sisters were, and so she kind of facilitated my entering. I also had an aunt, a sister of my mother’s, who was also a member of the community. So I had a connection with FSPA, and that’s where I entered. Steve: The Sister would stand behind you and pray to the Sacred Heart. Jolynn: Yes. Steve: For those who are not Catholic, please explain what that means. Jolynn: That way of taking time out in our school day to have some quiet and to be attentive to the idea that we weren’t just who we were, that God was an entity in our lives. And as eighth graders, I don’t think we had any clue about what that really meant. 
But in our Catholic tradition, we have a whole, whole long history of people who created ways of connecting with the divine or God or that entity. So we have a long, long history of many, many prayers. The Sacred Heart Prayer has to do with connecting with Jesus. Jesus revealed in his earthly experience how to be a human being, how to be present to people. He really showed that he had a heart for all of humanity. His heart was really the focus for a lot of people to feel connected to our God. So for her to call upon that aspect of Jesus to bless us as kids and to help our lives be better. Somehow or another, that touched my heart. Steve: She was just praying that you would be guided? Or that you would be guided specifically to become a Sister in La Crosse? Jolynn: It was just a general prayer for all of us in our class. And it just so happened that because I was right in front of her – she was right behind me – somehow or another it felt we were more connected simply because we were that close in space to each other. That’s how I sensed that it felt almost like it was a personal prayer for me. Steve: Nice. So then you went to La Crosse, and that’s where you went to high school? Jolynn: I was a sophomore in high school when I entered the community. And at that time, yes, we had Saint Rose High School, which was located in the north portion of the Viterbo University building. That was our way of finishing our high school, plus we also went to summer school to make the process happen faster, and to give us something to keep us out of trouble. We went up to Modena, Wisconsin, where we had a school for the Native Americans, the Ojibwa Tribe up there. We helped out in the summer. We helped out with the farming that was there. We helped with creating the things to get ready for the school year, and we just basically had a feeling for one another, and kind of a sense of community. Steve: I know you. 
I know that nature and connection to the earth, to the universe, is an important part of your spirituality. And it sounds like it was being formed in you already in some of these summer experiences. Jolynn: Definitely, definitely. And because we were on that plot of land that was farmland as well, we would talk about our farming background. I think at that time there were seven or eight of us in that process of needing the summer school, and as far as I know all of us were from farms. Some were from Iowa, so of course we would often talk about our farms and our families and things like that. I think it always entered into how we appreciated even the opportunity to work up there on the farm. Steve: Talk about your experience with native spirituality as a result of that experience. Were you able to pick up any of that? Jolynn: Not at that time specifically. But I’ve had the privilege of ministering in the Woodruff Minocqua area, which is eight miles from Lac du Flambeau, Wisconsin, which is also the Ojibwa Native Reservation. Through the years that I was up there – about 30 years I lived up in the Minocqua Woodruff area and worked up there – I became acquainted with several of the people from Lac du Flambeau. We would have book discussions, we would have conversations. And one of the things I did was when one of the women especially from Lac du Flambeau would offer programs through the technical institute where I went in Minocqua, I would always attend. There was one very specific time when, Rochelle was her name, she offered the way in which the rituals of the native people were so significant, and we participated in some of those rituals. That really grounded me in the fact that, you know, my religious tradition and my spirituality, we have a lot of rituals, so there was a link. It felt like a link between how rituals in their tradition, in their life, and in mine, rituals almost always had something to do with something from nature. 
Either stones or grasses or incense or sage or oils or animals – something like that. Steve: Is it fair to say those experiences learning the native perspective on spirituality helped you to have a better understanding of your own Catholic tradition? Jolynn: I would say yes, because what it allowed me to do was to say, ‘OK, what is the underlying way that all peoples are invited to be aware of the greater dynamic in their lives? How are all peoples invited to sense that spirit world?’ So when I sensed how the native people were so in touch with the creator and everything that was created and how all of earth was gift, I began to realize that in our Catholic tradition, yes, we too – all from way, way back when – use earth items as ways of entering into a ritual that seemed to connect us with the divine. So it felt like the two of them came together. Also then, during that same time, I began to be much more attentive to, what are some of the other traditions? What are some of the other kinds of ways people connect with the divine, or that essence, if you will. I heard some things about the Buddhist tradition. I heard some things about other cultures, the Spanish-speaking people. I think connecting with the native people opened my eyes to, how broad and how fundamental is ritual? And for me, how fundamental is ritual using things of nature as the visuals or the ways to connect? Steve: Can you give me some examples of that within the Catholic tradition? Jolynn: In the Catholic tradition, in many of the Old Testament stories, the ways that some of the people understood that God was present to them was usually in some walking that they did. And very often there were stones, or if there is a prophet who realized he is sensing God’s presence, he would lie down and put his head on the stone. It seemed to me that was a way of that. We have the whole reality of way back in the Book of Kings, the leader of that day was encouraged to feed thousands of people with minimal, minimal. 
It was the use of wheat in the bread and the wine of the grapes. In our Catholic tradition, that particular piece of bread and wine from the earth is very central to our wa… https://www.fscenter.org/

Aprenda Inglês Online com Cambly
Ep.62 - What are the best job opportunities in Canada?

Aprenda Inglês Online com Cambly

Play Episode Listen Later Feb 21, 2020 7:56


What are the job opportunities in Canada? Use the code TeacherCa to take a free English lesson on Cambly.* Link: bit.ly/cambly_subcribe Want to take a lesson with Canadian teacher Steve-Can? https://www.cambly.com/en/student/tutors/5b3e77f38860910029244be5 Or download our app: Apple Store: bit.ly/cambly-applestore Google Play: bit.ly/cambly-googleplay You can also find our podcast on YOUTUBE: www.youtube.com/watch?v=YC7JrFYEW…pek9DTfOJhqWSvJDI Ep.62 - What are the best job opportunities in Canada? Do you know the best job openings in Canada? Learn English with Canadian tutors! Learning English with native speakers can be fun! Listen to our podcast to improve your listening and pick up tips on reducing your accent in English. The Aprenda Inglês Online podcast is for students who want to improve their knowledge and their listening in English. In each new episode, you can learn from Teacher Ca's tips and the best native teachers on Cambly. Leave your comment here! We'll help you really learn English! * the code teacherca is valid for new students only.

英语每日一听 | 每天少于5分钟
Episode 382: English Movie Trailers

英语每日一听 | 每天少于5分钟

Play Episode Listen Later Jan 31, 2019 2:31


For more English-learning content, follow the WeChat official account: VOA英语每日一听

Steve: So do you go by Andrew, or is Andy OK?
Andy: Ah, Andy, please.
Steve: Can you tell us about the work you do with your website?
Andy: Sure. I've made a website for people to study English using movie trailers. (OK) Movie commercials. (OK) On the site, you can see lots of different trailers, over 100, (Wow) and you can do various activities (Right) you can read a summary. You can do a cloze exercise (OK) you can read a script, you can pop, you can click on a vocabulary word and it will show you definitions, example sentences. (OK, OK) yeah, I added little quizzes and stuff. (OK) Yeah, it's pretty fun.
Steve: So you have the subtitles on there?
Andy: Ah, no subtitles. It's all English. (OK, yeah) English trailers with English voices only.
Steve: OK, so do you have the transcripts in English? You have the English subtitles in there?
Andy: Yes, yes, yes. The students can watch the trailers while they read.
Steve: Right. OK.
Andy: So, yes.
Steve: And what kind of movies have you been able to put into use?
Andy: Oh, gosh, recently we've added "War of the Worlds".
Steve: Oh, the H.G. Wells classic.
Andy: Yes, the new Spielberg version.
Steve: Oh, OK. OK.
Andy: Yes. Yes. Um, the new "Hitchhiker's Guide to the Galaxy" is going to be up there shortly.
Steve: Right. OK.
Andy: It looks like a fun trailer.
Steve: Excellent. Yeah.
Andy: Ah, various genres, too, like computer animation or love stories or action (OK. OK) Yeah something for everybody.
Steve: Yeah, and who, which type of people have been accessing the website, and who's getting the most from it?
Andy: Ah, people from all over the world are going now. I've had people e-mail about this site from over 25 different countries, so it's very international.
Steve: Excellent. Excellent. And how long have you had this website up?
Andy: Almost three years now. (Yeah) I guess, yeah about three years.
Steve: Wow. OK. It sounds extremely interesting. I'll be very much looking forward to having a look at it myself. How can I access this website?
Andy: The URL is www.english-trailers.com
Steve: OK, sounds great, Andy. Good luck with it.

Double Penetration Radio
Double Penetration Radio #11

Double Penetration Radio

Play Episode Listen Later Nov 1, 2016 60:22


Playlist:
1. Galantis & Hook N Sling – Love On Me (CID Remix)
2. DNCE, Lucas & Steve - Can't Get Enough Cake By the Ocean (SHAFTUNE Edit)
3. Quintino x Cheat Codes - Can't Fight It (Ale Mora Remix)
4. Mark Bale – Sikk [PROMO OF THE MONTH]
5. Cheat Codes & Dante Klein - Let Me Hold You (Turn Me On) (Curbi Remix)
6. Paul Vinx - Try
7. Tommie Sunshine, Chocolate Puma - Take The Ride
8. BROHUG - Marshall
9. Bad Nelson, Jerry Joxx – Like This [BANGER OF THE MONTH]
10. Wreckless – In Your Eyes
11. L'Tric, Miles Graham - 1994 (Don Diablo Edit)
12. John Dahlback, Young Rebels & Diaz feat. Terri B! - Can't Slow Down (Jean Elan Remix) [CLASSIC OF THE MONTH]
13. MRVLZ - Give U My Fire
14. EDX – High On You
15. Stefan Vilijn & Simon Kidzoo – Every Night
16. The Ashton Shuffle - Make A Wrong Thing Right (Dom Dolla Remix)
17. Ferreck Dawn – Mad Love
18. Jones & Brock ft. Anica – Join Me (Vol2Cat Remix) [EMOTIONAL TUNE OF THE MONTH]

Double Penetration Radio
Double Penetration Radio #10

Double Penetration Radio

Play Episode Listen Later Oct 1, 2016 60:34


Playlist:
1. Galantis & Hook N Sling – Love On Me (CID Remix)
2. DNCE, Lucas & Steve - Can't Get Enough Cake By the Ocean (SHAFTUNE Edit)
3. Quintino x Cheat Codes - Can't Fight It (Ale Mora Remix)
4. Mark Bale - Sikk
5. Cheat Codes & Dante Klein - Let Me Hold You (Turn Me On) (Curbi Remix)
6. Paul Vinx - Try
7. Tommie Sunshine, Chocolate Puma - Take The Ride [BANGER OF THE MONTH]
8. BROHUG - Marshall [PROMO OF THE MONTH]
9. Bad Nelson, Jerry Joxx – Like This
10. Wreckless – In Your Eyes [CLASSIC OF THE MONTH]
11. L'Tric, Miles Graham - 1994 (Don Diablo Edit)
12. John Dahlback, Young Rebels & Diaz feat. Terri B! - Can't Slow Down (Jean Elan Remix)
13. MRVLZ - Give U My Fire
14. EDX – High On You
15. Stefan Vilijn & Simon Kidzoo – Every Night
16. The Ashton Shuffle - Make A Wrong Thing Right (Dom Dolla Remix)
17. Ferreck Dawn – Mad Love
18. Jones & Brock ft. Anica – Join Me (Vol2Cat Remix) [EMOTIONAL TUNE OF THE MONTH]

NRJ GLOBAL DANCE
NRJ GLOBALDANCE v.58 - 2016 (16 July) RADIOSHOW

NRJ GLOBAL DANCE

Play Episode Listen Later Jul 16, 2016 56:22


Cabu & Akacia - Gold (Fabich x Ferdinand Weber Remix)
King Arthur feat. TRM - Right Now (Sam Feldt Extended Edit)
Bingo Players - Cry (Just A Little) (A-Trak & Phantoms Remix)
Keanu Silva - Me & You (Extended Mix)
Gregor Salto - Verao
Khrebto - Sydney (Original Mix)
Pep & Rash x Lucas & Steve - Enigma
David Guetta Ft. Zara Larsson - This One's For You (STVCKS Remix)
Florian Picasso - Final Call (Original Mix)
Ftampa feat. Amanda Wilson - Stay (Original Mix)
DJ PERETSE NRJ Megamix #004
---
001. Headhunterz & KSHMR - Dharma
002. Toyboy & Robin feat Finnli - Can't Love Me
003. Jolyon Petch - U Sure Do (M-1 Remix)
004. Drone In Ibiza - Here Without You
005. Clean Bandit - Tears
006. Tiesto feat John Legend - Summer Nights
007. John Dahlback, Bullysongs - Walking With Shadows (Happy Love Remix)
008. Kap Slap feat M Bronx - Felt This Good
009. Volt & State - Anthems
010. ilLegal Content - Drop That
011. Junior Jack - E Samba 2016 (Freejack Remix)
012. Lee Cabrera feat Tommie Sunshine - Shake It (Roelbeat & Sharapov Remix)
013. Mari Ferrari - Hello Hello
014. Quintino x Cheat Codes - Can't Fight It
015. Arash, Snoop Dogg - OMG
016. Assix - Hearts
017. Inache & Kari - Don't Stop
018. Me & My Toothbrush - Air Miles
019. Hugel & Jasmine Thompson - Where We Belong
020. Carla's Dream - Sub Pielea Mea (Rakurs Remix)
021. Calvin Harris feat Rihanna - This Is What You Came For
022. Merk & Kremont, Steffen Morrison - Don't Need No Money
023. Navidal feat Kasai - Thinking About You
024. Lucas & Steve - Can't Get Enough
025. Dirty Freek - Love For You
026. Ryos feat Karra - Where We Are
027. Swanky Tunes, Going Deeper - Drowning
028. Aluna George - I'm In Control (Klosman 'Summer' Remix)
029. Bodybangers feat Victoria Kern - All That She Wants
---
Ice MC feat. DJ Peretse - Think About The Way
Same K & Stendahl - Body Language (LTN Extended Remix)
Chicane - Poppiholla (Anniversary Extended Remix)

Episodes by iBelko
iBelko - Episode 32

Episodes by iBelko

Play Episode Listen Later Jul 1, 2016 57:24


0:00 KSHMR - Dadima
3:41 Curbi - Triple Six (Original Mix)
7:15 Dave Winnel - Old School (Original Mix)
10:55 EC Twins - Stepping Up (Original Mix)
15:14 Bassjackers - F_CK (Dimitri Vegas & Like Mike Edit)
19:14 Bassjackers & Jay Hardway - El Mariachi (Extended Mix)
23:07 DBSTF - AFREAKA (Extended Mix)
26:33 Firebeatz & Fafaq - Sir Duke (Festival Extended Mix)
30:18 Sagan - Happiness (Extended Mix)
34:17 Lucas & Steve - Can't Get Enough (Extended Mix)
38:18 Toby Green - Everytime (Extended Mix)
41:38 DVBBS & Shaun Frank feat. Delaney Jane - La La Land (Breathe Carolina Remix)
45:12 Yves V vs Dimitri Vangelis & Wyman - Daylight (With You) (Extended Mix)
50:30 Sam Feldt x Lucas & Steve - Summer on You (feat. Wulf)
53:11 Michael Calfan - Brothers (Extended Mix)

Nerja Records Presents - iNerja
Nerja Records Presents - iNerja 089

Nerja Records Presents - iNerja

Play Episode Listen Later Jun 27, 2016 60:32


[0:00] Manse - Freeze Time (Felicity Remix)
[3:00] StadiumX - Mombasa (Camarda Remix)
[5:22] Sander Van Doorn - You’re Not Alone (Original Mix)
[7:52] HIIO & Sevag - Anzulu (Original Mix)
[11:52] Dirty Ducks - Apache (Original Mix)
[15:52] Rehab - Sakura (Extended Mix)
[19:00] Ashley Wallbridge - Summertime (Extended Mix)
[23:00] Robbie Rivera - Bang 2k16 (Robbie Rivera Dub Mix)
[25:26] Dave202 - Shake It! (Alexander Som Remix)
[28:41] Tom Staar & NEW_ID - Disappear (Extended Mix)
[34:11] Merk & Kremont - Don’t Need No Money (Original Mix)
[37:00] Lucas & Steve - Can’t Get Enough (Original Mix)
[40:52] Aluna George - I’m In Control (Klosman ’Summer’ Remix)
Top 3 From Last Week:
[43:52] 3) Ale Mora - Party All Night (Original Mix)
[47:00] 2) Paris & Simo - Reunite (Original Mix)
[50:22] 1) Tritonal - Get Away (Extended Mix)
Blast From The Past:
[54:30] Sander Kleinenberg - Can You Feel It (Original Mix)
[57:51] Inpetto - How We Used To Do (Original Mix)

NRJ GLOBAL DANCE
NRJ GLOBALDANCE v.53 - 2016 (11 June) RADIOSHOW

NRJ GLOBAL DANCE

Play Episode Listen Later Jun 11, 2016 55:31


Fabio Vuotto - La isla Bonita (tropical house edit)
Alex Hook feat. Rene - Show Me Your Love (Original Mix)
Bodybangers Feat Victoria Kern - All That She Wants
EMBRZ - Breathe (Original Mix)
Secret Sinz - Chase You Down (Original Mix)
Bob Marley, LVNDSCAPE & Bolier - Is This Love (Original Mix)
John Dahlback, BullySongs - Walking With Shadows (Happy Love Remix)
Inache & Kari - Don't Stop (Original Mix)
Martin Solveig feat. Tkay Maidza - Do It Right (Club Mix)
Lucas & Steve - Can't Get Enough (Extended Mix)
Gigo'n'Migo, Stoker, Phaksy Phillips - Love with the Wind (Open Beatz Festival)
Markus Schulz feat. Ethan Thompson - Love Me Like You Never Did (Radio Edit)
Nicola Fasano & Miami Rockets - I Like to Move it (Radio Mix)
Volt & State - Anthems (Extended Mix)
Assix - Hearts
JL & Afterman - Black Betty (Remix 2016)

Sir Art (Sweet Beats)
DJ Sir Art - Sweet May 2016 Promo Mix

Sir Art (Sweet Beats)

Play Episode Listen Later Jun 6, 2016 56:50


The Sweet Beats music label presents a monthly podcast of the 20 best releases on labels worldwide from the past month. May 2016.
00:00 Calvin Harris feat. Rihanna vs. Alexx Slam & Relanium - This Is What You Came (Sir Art & Tony Sky Mashup)
04:06 EDX ft. Mingue - Missing (Joe Stone Remix)
08:42 Sick Individuals feat. jACQ - Take It On
12:42 Bojac - Six Million
14:46 Lucas & Steve - Can't Get Enough
18:22 Eva Shaw Feat. Mally Mall & Sonny Wilson - U (Club Mix)
19:46 Tritonal feat. Angel Taylor - Getaway
23:42 Curbi - Triple Six
26:12 Rain Man - Bring Back The Summer feat. OLY (Arpex Remix)
28:42 Galantis - No Money (MOTi Remix)
32:31 Don Diablo feat. DYU - Drifter
34:06 DVDG ft. Frae - Need U
37:14 Imany - Don't Be So Shy (Patrick Hagenaar Remix)
40:52 Madison Mars - Doppler (Extended Mix)
42:22 Fifth Harmony - Work From Home (Rowen Reecks & Chasner Remix)
43:46 Tom & Jame - What Goes Around
45:44 Anton Powers - Love You Better
48:56 Patrick Hagenaar - Disarm feat. Sweedish
52:50 Headhunterz & Skytech - Kundalini
54:21 Dave Winnel - Old School

ChowCast
ChowCast 005

ChowCast

Play Episode Listen Later May 31, 2016 66:26


--CLASSIC--
1. Coldplay - Midnight (MITS Extended Mix)
2. Jennifer Rene, Diversion - Wishing (Original Mix)
3. Nora En Pure - Lake Arrowhead (Original Mix)
4. Nora En Pure - Zambia (Original Mix)
5. Roddy Reynaert - Bethalize (Extended Mix)
6. CamelPhat - Light Night (Original Mix)
7. Michael Calfan - Brothers (Original Mix)
8. Justin Jay & Friends feat. Josh Taylor & Benny Bridges - Karma (Club Mix)
9. Years & Years - Desire (Jerry Folk Remix)
10. Eric Prydz - Liberate (Lane 8 Remix)
11. Dirty South & Those Usual Suspects - Walking Alone [Spencer Brown Mix]
12. Lost Kings ft. Katelyn Tarver - You (Halogen x Niko The Kid Remix)
13. Niko The Kid - Easy Street
14. Michael Jackson - Don't Stop 'Til You Get Enough (Gigamesh Remix)
15. Alternative Kasual - OH!
16. Offaiah - Trouble (Original Mix)
17. Watermat - Empire (Extended Mix)
18. Lucas & Steve - Can't Get Enough (Extended Mix)
19. Arty - Poison For Lovers

Universe of Sound - Deep, Future, Progressive, Big Room House, Trance, Trap, DnB. FRESH HOT DANCE MIX.

1 Bingo Players – Tom's Diner (Bingo Players 2016 Re-Work) (Extended Mix)
2 Remy Cooper – Fill Your Hope (Extended Mix)
3 MK & Becky Hill – Piece Of Me (Keep That Dub)
4 Vicetone – Hawt Stuff (Original Mix)
5 Lucas & Steve – Can't Get Enough (Extended Mix)
6 D.O.D – Honey (Original Mix)
7 Ape Drums & Major Lazer – The Way We Do This (feat. Busy Signal)
8 Daft Punk – High Life (Olin Batista Bootleg)
9 Swanky Tunes & Going Deeper – Drownin (Extended Mix)
10 Skytech & Fafaq – Flat Beat (2016 Bootleg)
11 Cash Cash feat. Sofia Reyes – How To Love (Fawks Flip)
12 That Matters vs. Jenia x Mr.Styles – IWI (Original Mix)
13 Maestro Harrell – Siren (Extended Mix)
14 Martin Garrix & Third Party – Lions in the Wild
15 Lenx & Denx – El Toro (Original Mix)
16 DBSTF – AFREAKA (Extended Mix)
17 Bassjackers – F*CK (Dimitri Vegas & Like Mike Edit)
Sem Vox – Mumbai (DLDK India 2016 Anthem) (Extended Mix)
18 Matt Nash – From Here (Extended Mix)
19 Fifth Harmony ft. Ty Dolla $ign – Work From Home (Brooks Remix)
20 Olivia Sebastianelli – Lighting Fires (Martell Bootleg)
21 Tritonal feat. Angel Taylor – Getaway (Extended Mix)
22 Orjan Nilsen & KhoMha – Los Capos (Extended Mix)
