Podcasts about one x

One X is a 2006 studio album by Three Days Grace.

  • 293 PODCASTS
  • 437 EPISODES
  • 54m AVG DURATION
  • 1 MONTHLY NEW EPISODE
  • Mar 11, 2025 LATEST
POPULARITY (chart, 2017–2024)


Best podcasts about one x

Latest podcast episodes about one x

The Latter-day Disciples Podcast
Ep. 161 | Restoring the Divine Family, with Nick + Austin of One x One

Mar 11, 2025 · 78:19


6:39 One x One's mission
16:21 Teaching our true identity and purpose
18:54 How do we 'become one' and build Zion?
22:31 The wedding day
27:34 Working together after the wedding day
30:49 Putting our hands in Christ's hands
Healing as individuals and couples through the Atonement of Jesus Christ
41:40 Husbands and wives stepping into their divine roles
1:04:39 Nick's experience with the Lord
1:07:54 Austin's experience with the Lord

The Latter-day Disciples are SO excited to be partnering with One x One! One x One features Christ-centered therapeutic programs built to help men and women develop their divine masculinity and divine femininity, cultivate healthy and joyful marriage relationships, and restore the divine family. One x One is pleased to invite YOU, our BROTHERS, to the upcoming DESTINED TO RISE men's retreat! Join us March 30-April 2 in Toquerville, UT for another TRANSFORMATIVE event. Learn more at: https://onexone.ai/

Have Feedback? Send the LDD team a text!

"Consider Yourself as Eve: A Guide to Spiritual Development for Women (and the Men Who Love Them)" is available on Amazon in paperback, hardback, and ebook formats. Get your copy today!

Scripture Notes - the PERFECT scripture study companion! Sign up today at: https://www.scripturenotes.com?via=meghan

TechCentral Podcast
TCS | 'Activist CEO' Adam Craker on iqbusiness, the GNU and fixing Joburg

Jan 29, 2025 · 35:56


Adam Craker has strong views on what's needed to turn around South Africa's fortunes and fix its biggest city, Johannesburg, which has fallen into a state of disrepair. The CEO of iqbusiness, a digital integrator in the Reunert stable formed recently through the merger of IQbusiness and +OneX, is our guest in this episode of the TechCentral Show. Craker – whose career has seen him working for the likes of Accenture, Merchants, Dimension Data and Super Group – tells TechCentral editor Duncan McLeod about his plans for iqbusiness post-merger, how it fits in with Reunert's overall growth plans and why the transaction made sense. He also unpacks:
• His take on the government of national unity and why he remains bullish about South Africa's prospects;
• The news that government is considering listing some of South Africa's state-owned enterprises on the JSE;
• His biggest concerns about the country's future; and
• What needs to be done to save Joburg – and the role of the Jozi My Jozi initiative.
Don't miss a great conversation! TechCentral

Vertigo - La 1ere
L'escamotage de Madame Irma

Jan 24, 2025 · 3:50


The son of a famous illusionist tries to perform his father's tricks with the mediumistic assistance of his mother, who died without passing on a crucial secret. Between farce and magic, an odd Freudian duo imagined by Claude-Inga Barbey and Pierric Tenthorey. A review by Thierry Sartoretti. Lausanne, Boulimie, until 1 February. Onex, salle communale, 12 and 13 February. Yverdon-les-Bains, L'Echandole, 14 and 15 February.

Troubled Minds Radio
The UFO Zeitgeist - Blind Spots in Ufology - Part One *X Space*

Dec 24, 2024 · 97:44


Modern UFOlogy overlooks the symbolic and psychological layers of encounters, fixating on physical evidence and tech. What if the truth hides in what we dismiss, not what we can measure? These blind spots might hold the key to the UFO mystery.

If you are having a mental health crisis and need immediate help, please go to https://troubledminds.org/help/ and call somebody right now. Reaching out for support is a sign of strength.

LIVE ON Digital Radio! http://bit.ly/40KBtlW
http://www.troubledminds.net or https://www.troubledminds.org

Support The Show!
https://www.spreaker.com/podcast/troubled-minds-radio--4953916/support
https://ko-fi.com/troubledminds
https://rokfin.com/creator/troubledminds
https://patreon.com/troubledminds
https://www.buymeacoffee.com/troubledminds
https://troubledfans.com

Friends of Troubled Minds! - https://troubledminds.org/friends

Show Schedule: Sun-Mon-Tues-Wed-Thurs 7-10pst

iTunes - https://apple.co/2zZ4hx6
Spotify - https://spoti.fi/2UgyzqM
TuneIn - https://bit.ly/2FZOErS
Twitter - https://bit.ly/2CYB71U

The Street Smart Podcast
FROM MEDICAL SPA TO OILFIELD: JEN GUERRA'S JOURNEY INTO OIL AND GAS | 2024 DEP BBQ

Dec 3, 2024 · 21:07


Join us at the Street Smart Podcast as we hit the road to the Permian Basin BBQ Cookoff in Midland, Texas! In this episode, Justin Overstreet sits down with Jennifer Guerra, Executive Assistant at OneX and a newcomer to the oil and gas industry. Jennifer shares her journey of transitioning from managing a medical spa in The Woodlands to navigating the dynamic world of oilfield services. We explore her impressions of oil and gas, the challenges of stepping into a rebranded company during a transitional period, and her unique role in assisting the CEO of OneX. Jennifer also discusses her experiences at her first industry networking event, her approach to tackling diverse tasks in her position, and her thoughts on the industry's reputation and necessity. Whether you're curious about career transitions, the oilfield services world, or just looking for some insightful and candid reflections, this episode has something for everyone.

Lifetime at Work: Career Advice Podcast
From Toronto Private Equity to Launching His Own Fund with Eugene Polevoy

Dec 2, 2024 · 49:07


Episode 70. In this episode of the Lifetime at Work podcast, host Greg Martin interviews Eugene Polevoy, co-founder of Blue C Capital, a private equity firm focused on investing in service-based businesses. Eugene shares his career journey from being an automotive mechanic to starting his own private equity fund. They discuss Eugene's early days in investment banking at BMO, his extensive experience in private equity at ONCAP and Imperial Capital, and his strategic approach to investing in service-based businesses. Eugene highlights the importance of having a unique angle, leveraging digital marketing skills, and making data-driven decisions. He also talks about the challenges he faced and the factors that led him to establish his own firm.

00:00 Introduction to the Podcast and Guest
01:11 Eugene's Current Role and Focus
03:25 Early Career and Background
07:05 Transition to Private Equity
11:16 Learning and Growing in Private Equity
14:49 Insights on Investment Banking
19:10 Developing Investment Strategies
22:52 Balancing Fund Mandates and Flexibility
25:38 Leveraging Digital Marketing in Business Growth
27:22 Transition from ONCAP and New Opportunities
27:53 Exploring the Auto Aftermarket and Housing Industries
30:46 Challenges and Decisions in Private Equity
33:15 Building Frontier and Entrepreneurial Ventures
38:55 Raising Capital and Investment Strategies
46:35 Finding Your Niche in Service-Based Industries
50:27 Conclusion and Contact Information

Freemusicempire
State of The Game vol.228-Is This A Safe Space? w/ Psalm One x Optiks

Dec 1, 2024 · 87:33


ATTENDEES: Psalm One, Optiks, Keith Rollins, Daniel Olney

AGENDA (New Business):
• Talk about the relationship between producer and emcee that created Is This A Safe Space?
• Talk about going from book tour back to rap music and the way it has changed/improved the creative process.

Intro and outro by Andrew.

TyfloPodcast
One X Player Mini Pro

Oct 24, 2024


Here is another handheld computer, whose capabilities are demonstrated by Patryk Chojnacki, an enthusiast of this hardware, with probing questions asked by Robert Rosiejka. The broadcast is also available in an automatically generated text version.

Living Inside Out with Toks
Ep #102 Forging Her Own Path: Shadé Akande on Resilience, Leadership, and the Power of ONE X LEAGUE

Oct 12, 2024 · 59:02


In this episode, I had an incredible conversation with Shadé Akande, founder of ONE X LEAGUE. Shadé, raised as a first-generation American by Nigerian parents, shares her experiences of navigating culture clashes, embracing risk, and learning to trust herself—lessons that shaped her personal and professional journey. You can watch this episode on YouTube.

We dive into:
• Growing Up Nigerian-American: Shadé talks about the culture clashes of being raised by Nigerian parents in the United States and how those experiences influenced her career choices and personal growth.
• Career Evolution: From the fashion industry to HR at some of the biggest global companies like Google and Verizon, Shadé explains how each step in her career taught her more about people and herself.
• Launching ONE X LEAGUE: Shadé shares the mission behind ONE X LEAGUE, a private membership for Black and Afro-Latina executive women, and the upcoming ONE SUITE RETREAT, a retreat for Black executives, both male and female, happening in February next year.
• Risk-Taking and Resilience: We discuss the importance of taking risks, learning from failures, and building resilience in the face of entrepreneurship challenges.
• Building Your Tribe: A major theme of our conversation revolves around creating a supportive network. Shadé shares insights on how to build a tribe of mentors and supporters and reflects on our own trip to the Amalfi Coast, where we strengthened our community of women.
• Mentorship: We also explore how mentorship plays a vital role in personal and professional growth, and how to find the right mentors if you're just starting out.

Shadé's story is filled with practical insights and powerful lessons for anyone looking to take charge of their career, build resilience, and create a supportive network.

Next Steps: If you loved this episode, make sure to subscribe and leave a review! You can connect with Shadé and learn more about ONE X LEAGUE at onexleague.com. Stay tuned for more insightful conversations, and follow us on @toksa to never miss an episode!

Radio Cité Genève
Culture - AGENDA - Monday 23 to Sunday 29 September 2024

Sep 16, 2024 · 4:47


This week's agenda: Monday 23 to Sunday 29 September 2024. EXIT ABOVE - Anne Teresa De Keersmaeker: From 25 to 27 September, at the Comédie de Genève, choreographer Anne Teresa De Keersmaeker presents her new show EXIT ABOVE, an immersive experience blending walking, blues and electronic music for an unforgettable artistic journey.

The Podcast That Rocked
Rocked X The Metal Meltdown | One X One | The Podcast That Rocked

Jun 11, 2024 · 72:03


Live stream with two YouTube channels talking about anything that comes up. Rocked X The Metal Meltdown One-On-One.

Subscribe to @themetalmeltdown here = https://www.youtube.com/@themetalmeltdownofficial
TOP 10 WORST METAL ALBUMS OF 2023! = https://www.youtube.com/watch?v=H-NwX0HoSuc

Check out more of our videos below:
10 CRINGE Rock And Metal Collaborations = https://youtu.be/5mjWBzRW-Zw
Bring Me The Horizon Discography | Tier List = https://youtu.be/-hB0GMXaQNk
10 Bands Metal Elitists HATE = https://youtu.be/fKGpsunbGxo
10 AWESOME Metal Cover Songs = https://youtu.be/bGK-z_kqa-0

“The Metal Meltdown is a popular YouTube channel dedicated to heavy metal music and culture. With engaging video content, insightful commentary, and a passionate community, it has become a go-to destination for metalheads worldwide.” (Perplexity)

Late Confirmation by CoinDesk
MARKETS DAILY: Live from Consensus 2024 | Coexistence of Crypto and Traditional Finance

May 29, 2024 · 14:59


The growth and focus of the ONIX Team at JP Morgan with their CEO and Co-Head of Global Payment Sales, Umar Farooq.

Sponsored by BitGo. To get the show every day, follow the podcast here.

Today's "Markets Daily" segment is “LIVE” from Consensus 2024 in Austin, Texas. It covers a range of topics including the energy and innovation at the event, market perspectives, the Onyx project at JP Morgan, tokenization, the role of crypto in traditional finance, partnerships with crypto-native companies, and the future of ONEX.

Takeaways |
The Onyx project at JP Morgan has been a decade-long journey, focusing on tokenization and synchronized payments.
The conversation explores the coexistence of crypto and traditional finance, the role of partnerships with crypto-native companies, and the future of ONEX.
Challenges and opportunities in the crypto and blockchain space are discussed, along with the growth and focus of the ONIX team at JP Morgan.

Chapters |
00:00 Exploring Innovation and Market Perspectives at Consensus 2024
02:21 The Onyx Project: Tokenization and Synchronized Payments
05:05 Coexistence of Crypto and Traditional Finance
08:16 Partnerships with Crypto Native Companies and the Future of ONEX
11:32 Challenges and Opportunities in the Crypto and Blockchain Space
13:22 Growth and Focus of the ONIX Team at JP Morgan

LINKS |
Onyx by J.P. Morgan
BitGo

This episode was hosted by Jennifer Sanasie and Helene Braun. “Markets Daily” is produced by the CoinDesk team: production assistant Victor Chen, senior producer Michele Musso, executive producer Jared Schwartz, and senior booker Melissa Montañez. See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Markets Daily Crypto Roundup
Live from Consensus 2024 | Coexistence of Crypto and Traditional Finance

May 29, 2024 · 14:59


The growth and focus of the ONIX Team at JP Morgan with their CEO and Co-Head of Global Payment Sales, Umar Farooq.

Sponsored by BitGo. To get the show every day, follow the podcast here.

Today's "Markets Daily" segment is “LIVE” from Consensus 2024 in Austin, Texas. It covers a range of topics including the energy and innovation at the event, market perspectives, the Onyx project at JP Morgan, tokenization, the role of crypto in traditional finance, partnerships with crypto-native companies, and the future of ONEX.

Takeaways |
The Onyx project at JP Morgan has been a decade-long journey, focusing on tokenization and synchronized payments.
The conversation explores the coexistence of crypto and traditional finance, the role of partnerships with crypto-native companies, and the future of ONEX.
Challenges and opportunities in the crypto and blockchain space are discussed, along with the growth and focus of the ONIX team at JP Morgan.

Chapters |
00:00 Exploring Innovation and Market Perspectives at Consensus 2024
02:21 The Onyx Project: Tokenization and Synchronized Payments
05:05 Coexistence of Crypto and Traditional Finance
08:16 Partnerships with Crypto Native Companies and the Future of ONEX
11:32 Challenges and Opportunities in the Crypto and Blockchain Space
13:22 Growth and Focus of the ONIX Team at JP Morgan

LINKS |
Onyx by J.P. Morgan
BitGo

This episode was hosted by Jennifer Sanasie and Helene Braun. “Markets Daily” is produced by the CoinDesk team: production assistant Victor Chen, senior producer Michele Musso, executive producer Jared Schwartz, and senior booker Melissa Montañez. See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Screenwriters Need To Hear This with Michael Jamin
Ep 124 - December 8th Webinar Q&A

Mar 13, 2024 · 56:16


On December 8th, I hosted a webinar called “What Do Showrunners Look For In A Script?”, where I talked about how to come up with interesting and unique characters, as well as how tapping into your everyday life interactions with people can help with this. This episode addresses questions you asked in our Q&A session that we didn't have time to answer. There's lots of great info here, make sure you watch.

Show Notes
A Paper Orchestra on Website: https://michaeljamin.com/book
A Paper Orchestra on Audible: https://www.audible.com/ep/creator?source_code=PDTGBPD060314004R&irclickid=wsY0cWRTYxyPWQ32v63t0WpwUkHzByXJyROHz00&irgwc=1
A Paper Orchestra on Amazon: https://www.amazon.com/Audible-A-Paper-Orchestra/dp/B0CS5129X1/ref=sr_1_4?crid=19R6SSAJRS6TU&keywords=a+paper+orchestra&qid=1707342963&sprefix=a+paper+orchestra%2Caps%2C149&sr=8-4
A Paper Orchestra on Goodreads: https://www.goodreads.com/book/show/203928260-a-paper-orchestra
Free Writing Webinar - https://michaeljamin.com/op/webinar-registration/
Michael's Online Screenwriting Course - https://michaeljamin.com/course
Free Screenwriting Lesson - https://michaeljamin.com/free
Join My Newsletter - https://michaeljamin.com/newsletter

Autogenerated Transcript

Michael Jamin: Well, no one cares that you took my course, so zero. No one's going to be. That's why we don't give a diploma out, because the diploma is worthless. No one really cares where you went, where you studied, or who taught you. All they care about is: is the script good or not? Does it make them want to turn the page or not? Do they want to find out what happens next or not?

Michael Jamin: You are listening to What the Hell is Michael Jamin talking about, conversations in writing, art, and creativity. Today's episode is brought to you by my debut collection of true stories, A Paper Orchestra, available in print, ebook and audiobook. To purchase, and to support me and this podcast, please visit michaeljamin.com/book, and now on with the show.

Michael Jamin: Hey everyone, welcome to a very special episode of What the Hell is Michael Jamin talking about. I'm here with my guest host Kevin Lewandowski, who helps out a lot with the podcast and all my social stuff. By trade he's a writer's assistant and script coordinator, which is actually one step higher than writer's assistant, and he's worked on a bunch of shows. Kevin, welcome to the show.

Kevin Lewandowski: Thank you for having me, Michael. For those of you, sorry I'm not Phil, I'm just kind of filling in for Phil for a couple days, but I'm excited to be here. And yeah, I hope to tell you all a little bit about script coordinating as well and what that all entails.

Michael Jamin: Fill in and fulfill, fill...

Kevin Lewandowski: ...in and fulfill.

Michael Jamin: What shows were you script coordinator on?

Kevin Lewandowski: So the big one was Why Women Kill.

Michael Jamin: Did we ever figure out why?

Kevin Lewandowski: I mean, depending on who you ask, a lot of women will say because of men.

Michael Jamin: They kill for ratings.

Kevin Lewandowski: Right? Okay, that's better. But yeah, I forgot how long ago that was, but unfortunately we got canceled four or five days before we were supposed to start filming. Our actors had just landed in Canada, and then the next day they announced they were pulling the plug on the show.

Michael Jamin: Why?

Kevin Lewandowski: It could be many reasons.
I think a lot of it had to do with we were a little bit behind on scripts and then budgeting and we were still kind of in the midst of covid precautions and things like that.Michael Jamin:Covid, people don't realize, especially new showrunners, you don't mess with the budget. You get things done on time, Ross, you're screwed. What other shows did you work on then?Kevin Lewandowski:So the first show I ever worked on was in 2015. It was the Muppets, and it was funny. I thought if anyone ever caught a break, this is my break. I was like, it's the Muppets, it's going to go on for five or six years and I'm just going to notch up every year. And after 16 episodes, that one got canceled.Michael Jamin:What's Ms. Piggy really like?Kevin Lewandowski:I mean, she is who she is. Difficult. Yeah, she's difficult. She's a bit of a diva. We have to had to cater to all of her needs.Michael Jamin:What about, I'm sorry, and what were the other shows? Screw Miss Piggy. Yeah,Kevin Lewandowski:Screw Miss Piggy. So after that, a bunch of pilots that never got picked up, and then I worked for a show on Netflix called The Ranch with AshleyMichael Jamin:ElementKevin Lewandowski:That was a live audience show and I was there for two seasons. I'm trying to think after that. It's all becoming a blur. I did two seasons of Why Women Kill. Actually the first year I was a line producer's assistant, and so that was interesting to kind of see the financial side of things and see where they decide to put the money in. And then for season three, they moved me to Script coordinator,Michael Jamin:But the Branch was a legit show. That was a big show.Kevin Lewandowski:That was a lot of fun because I'd always wanted to work in the Multicam world. There's just something about show night and it's just kind of a big party for everyone and you get to see the audience's instant gratification. It's just a lot of fun. A lot of fun to work on those shows.Michael Jamin:Yeah. Well now the next thing for us to do is try to get you into one of these jobs so you don't have to co-host with me all the time on thisKevin Lewandowski:Podcast. I don't mind co-hosting with you.Michael Jamin:Oh, all right. Well, we'll see if you feel that way at the end. Okay, that's fair. So we are doing, this is a special q and a. We do these monthly webinars or whatever, every three weeks actually, and we have a lot of questions we can't answer. And so we save 'em for the podcast. And now Kevin's going to feed them to me. He's going to regurgitate them to me. He's going to baby bird them into my mouth, and then I'm going to try to answer them as best I can.Kevin Lewandowski:Early Bird gets the worm or something like that.Michael Jamin:Gross. Kevin Gross.Kevin Lewandowski:And I apologize in advance for anyone's name I might butcher.Michael Jamin:It's okay. They don't need to. I mean whatever if you get 'em wrong. Okay,Kevin Lewandowski:So these first few questions are going to be kind of course related questions. The first one is from Dat Boy, D-A-T-B-O-I. And that person's asking, what are the best tips for making my script shine more than the rest?Michael Jamin:Oh boy. Well, I wish he would. Well, he was already at my free webinar. I wish he would sign up for my course. I mean, that's what the course is. The best tips for making it shine is making sure your act breaks pop, making sure the dialogue feels fresh, your characters are original. I mean, there's no tips. It's not a tips thing. It's 14 hours of, let me tell you how to do it. That boy, I wish. 
What do you think, Kevin? What's your answer for him?Kevin Lewandowski:I think it's one of the things you always say on your webinars is after taking my course, you'll just hear me yelling in your head all the time about this is your end of act two moment, this is this, this is that. And I can vouch for that and say, anytime I'm looking through a script or even watching a TV show, because of your course and just understanding the story structure, you get those spider senses like, oh, the raising the stake should be coming very soon. Now we're about halfway through the episode, so something better be changing here. And I think it's just, again, everything you say in your course of just knowing those beats when they need to hit how they need to pop will help set your script ahead of amateur writers.Michael Jamin:You're a good student, Kevin.Kevin Lewandowski:Yeah. Thanks.Michael Jamin:Alright, what's next?Kevin Lewandowski:So km phs, when I say I don't have experience, but I have a killer pilot and I took Michael Jamin's course. How much of a difference is the course going to make in terms of being a desirable hire?Michael Jamin:No one cares that you took my course. So zero no one's. That's why we don't give a diploma out because the diploma is worthless. No one really cares if you went where you studied, who taught you all they care about, is the script good or not? Does it make them want to turn the page or not? Do they want to find out what happens next or not? So I wish I could give you a better answer than that, but it's not the degree. The degree isn't worth anything. Hopefully the knowledge is worth something.Kevin Lewandowski:I think the analogy I have in my head of your courses, I look at scripts I wrote before taking your course, and it's like when you look back at high school photos and I had the Frosted tips, the pca, shell, necklace, hoop earring, and at the time it was cool. And now you look back and it's like it's pretty cringe-worthy. It's pretty cringe-worthy to see those photos. And now after taking your course, I feel like it's like now I'm wearing a suit and I don't have the poop hearing and I don't have the frosted tips, and I'm not as cringe-worthy when I look back at some of the scripts I wrote a year or so ago.Michael Jamin:Good, good. All right, good. Very good. Impressing me more and more, Kevin.Kevin Lewandowski:Right? Next question. Ous. I'm butchering that one. Nope,Michael Jamin:Perfectly. That's how he says his name.Kevin Lewandowski:Yeah. What are the most important things an inspiring writer should be aware of while reviewing one script before sending it to an established executive or writer?Michael Jamin:God, it's pretty much the same answer as all the other ones. It's like, do your act breaks, pop? Is it fresh? The dialogue, I'm sorry, but it's the same answer, so I don't really have anything to say. Yeah, yeah.Kevin Lewandowski:Next question, mal. Yay.Michael Jamin:Exactly.Kevin Lewandowski:In a 26 page pilot is page 11 two, late for the first act break, second act break or second act being on page 20.Michael Jamin:On the 26 page script, the first back page is on 11, is that what they said?Kevin Lewandowski:Yeah.Michael Jamin:It's not terrible. I've seen worse things. I'm assuming it's a single space. It's not terrible. Yeah.Kevin Lewandowski:Colin Miller, what is a good system to practice writing every day? I like this question.Michael Jamin:A good system, a good system. I don't know why you like it, because I'm stumped. 
I mean, I would just say write a good system is to, I'm most creative in the morning, so that's when I want to write and I try to do my busy work in the evening stuff that's easier, but you might be a night owl, but I would just carve out time every day and just sit down at the computer and write. And don't be so precious that no one's going to look at your first draft. That first draft can be terrible, so don't just get it on paper. Yeah.Kevin Lewandowski:Yeah. I think a lot of maybe misconceptions people have is writing every day isn't necessarily open up final draft and typing something. Sometimes it's going on a walk for an hour and a half and thinking about the story you're trying to tell and laying out the beats in, I live in Glendale and there's a outdoor mall. It's fun to kind of just walk around there and people watch a little bit. And sometimesMichael Jamin:The Americana, that's where you go.Kevin Lewandowski:Yep. Right By the Americana.Michael Jamin:Are you in walking distance to thatKevin Lewandowski:Few blocks?Michael Jamin:Interesting. Okay. Alright. You'd like to go on the trolley.Kevin Lewandowski:I've never been on that trolley. I'm always afraidMichael Jamin:You like to ring the bell on trolley, Kevin. Yeah.Kevin Lewandowski:I'm always afraid it's going to hit someone.Michael Jamin:Yeah, I know. I know.Kevin Lewandowski:I think takes up a lot of the bottom of the path.Michael Jamin:Yeah. AllKevin Lewandowski:Right. Next question. So NRS creates, I guess this is a question, it's more of a comment. It said, agreed. The course is changing the way I see all of my stories. Good, great.Michael Jamin:Great.Kevin Lewandowski:Christina Sini, who's a current student, and Michael Jamin's course, we learned to break and structure story well before writing those bits and pieces of a script glued together that we won't have to cling to anyone to make them fit. We basically learned how to build in order. I think that goes back to your analogy of laying the foundation first and doing, starting with the characters in beat sheets and then outlining and eventually getting to the physical writing of the script.Michael Jamin:Yeah, she's doing great, Christina. She's having a good amount of success early on, so I'm impressed.Kevin Lewandowski:Another very active person in the course, Laurie. John Michael's course is amazing. When you take the class, you also become of the Jam and Facebook community. We do table reads and give each other notes twice a month. Writer sprints, Wednesday nights and mock writer's room. So anyone that's thinking about getting the course, we have this private Facebook group and it's a bunch of great people in there and we are all just trying to build each other up.Michael Jamin:It really is. It's impressive because when you look at some of the other Facebook groups, the screenwriting groups or on Reddit or groups, it's mostly people trying to tear each other down. But because this is private, I think they're not like that at all. It's a community, I think.Kevin Lewandowski:Yeah, I think that was a big thing for you because you said you were in some of those groups, and I think you even said you sometimes as a professional working writer, you would say something that people would attackMichael Jamin:You. Yeah. You don't, what are you talking about? Oh, alright. I happened once or twice. I was say, I'm done. Yeah.Kevin Lewandowski:All right. Next question. VV oral, is it worth it? 
And parentheses story structure is very detailed in your course, so I think maybe it's worth it, not is it worth it? Yeah. I think it's just more people praising about your course.Michael Jamin:Okay.Kevin Lewandowski:Let's see. Okay, now we have some craft questions. Good. From Mal mavey, they, again, is it okay to end a pilot on a cliffhanger?Michael Jamin:Yeah, it's okay, but better not. You're really counting on the fact that anyone's going to care, so you're better. I think what the danger is, you may be writing towards this cliffhanger thinking that everyone's going to be so, oh my God, what's going to happen if you don't write? If all those pages beforehand aren't so great, no one's going to care what happens. And so a lot of people write towards this cliffhanger thinking, oh, aren't you going to be enthralled? And the answer is no, we don't care.Kevin Lewandowski:Yeah. Yeah. I think trying to work backwards from that I think can be a disservice. And I think it's just you definitely don't want that cliffhanger to be more exciting necessarily than your act one break, because that's what we know what we're following. Lex Macaluso, once I have a great script, what are the practical steps to do?Michael Jamin:Well, once you have a great script, write another one for sure. And then you want to make sure you actually do have a great script. And you do that by showing it to people. And it doesn't have to be somebody in the industry. It could be a friend or a mother or someone whose opinion you trust. What do you think? And if they love it and they say, this is amazing, show me something else. You're onto something. But if they say, well, I like this part, or I like when this happened, or This is a good storyline, then that's not a great script. So you have to be honest with yourself. It's really, look, it's really hard to write a great script. Everyone assumes they have it and I don't assume I have it. So when I do my job really well, I might have a good script. A great script is really, you got to really hit it out of the park.Kevin Lewandowski:And I think just that idea of what is a great script, so arbitrary, and I think it's sticking to the story structure of what you teach in your course can help set your script apart from others.Michael Jamin:Yeah. And honestly, it is those things that I'm looking for. All the things that I say that when I'm reading a script, what I'm looking for and what I'm really looking for is I want a really good script. It doesn't even have to be great because a really good script stands out great or amazing is very rare. I mean, how often do you see a movie that's been made or a TV show and you go, this is a great script. Most of the time you're like, oh, this is really good.Kevin Lewandowski:So if you were reading a script, and let's say maybe the structure wasn't where you think it should be, but the characters were very compelling and the characters were witty with what they were saying. Would you still be okay with that? Or vice versa if maybe the characters was a little bit too much speaking on the nose, but the structure and everything was spot on with that.Michael Jamin:Years ago we hired on a show, we were running a show and we were reading a ton of scripts, and we got to one where Act one was really good. Act two was really good, and Act three was not very good. And we hired him anyway because we were thought at that point, I was like, he did the first two parts really well, I could fix, or we could fix Act three, not a problem. 
And so I think that says a lot. You do act one, walk two. That's a big deal. He's a young writer.Kevin Lewandowski:Do you see a pattern with a lot of writers starting out is Act two where they struggle the most? Or is it act three or is it,Michael Jamin:Listen, I don't make it to act two. If Act one isn't good, I don't read further. I get another script. If I get a stack of scripts, who cares about Act two? Fact One sucks.Kevin Lewandowski:Yeah. Ben Miller, what screenplays are the best to read, to learn from perhaps the West Wing pilot, which I read in a screenwriting class?Michael Jamin:Well, it depends what you want to write. If you want to write drama, then maybe West Wing pilot, I haven't read it, but you can also learn from reading band scripts. You can say to yourself, if long as you're honest, why am I not interested in this? And if you know what to look for, why is the script not compelling? Is the dialogue, is it the act breaks? Do they now you'll know what to look for? And then the trick is to be honest with yourself. There's been times even in my early career where I might pitch something to my partner and he'll say, if you read that in a script and someone else's script, you'd say, that sucks. And I go, really? I thought it was good. He goes, no, no, you would say it sucks. So then at that point, you got to go, okay, you got to back off. And you don't fight for it. You got to be honest with yourself.Kevin Lewandowski:I think another amazing thing in today's world that didn't really exist when you start out is pretty much any show that's out there right now, you can get access to some version of the script, whether it was a writer's draft or a production draft. IsMichael Jamin:That true? How do you find them?Kevin Lewandowski:I mean, if you just go to Google and you type in Breaking Bad Pilot script, there's going to be versions that you can download. It's always interesting to read those scripts and then watch the first episode and see how much did they change? Because I doubt you'll be able to find necessarily the final shooting draft online, but those first couple writer's drafts are available. And it's always interesting just to see you're reading it and you really, really like this part, but then you watch the episode and they took it out. You're like, oh, okay. That's interesting thatMichael Jamin:If you really wanted up your game, you could also watch the pilot of Breaking Bag and type out the script while you're watching it and then read it later and look for what are the act breaks, literally, what are the act breaks? How do they work? What's the dialogue on that? What's the last line of every scene? What's the dialogue? At the last line,Kevin Lewandowski:When I was doing writer's assistant script coordinate stuff, that's what I used to do to type faster just sit and watch TV and just type out the script as it was happening.Michael Jamin:Wow, good forKevin Lewandowski:You. Because in the room, they don't like it when you say, Hey, can you slow down a little bit? Can I hear that again? No, you got to go.Michael Jamin:Yeah.Kevin Lewandowski:Okay. Part, what advice would you offer writers to adapt to the inevitable changes in developments expected in the screenwriting field and then years to come? I'm assuming that's in the context of chat, GPT, ai, that kind of stuff.Michael Jamin:Right now, that stuff is being regulated. I don't know of anybody who's using it in a writer's room. That's not to say I could easily be out of the loop, so I don't know. 
But right now, as far as I know, chat, GPT wasn't a tool. Any writer that I knew was clamoring for, because we all knew if it works, it's going to put us out of a job. So any changes? I don't know. I really don't know. I would just say maybe I'm naive, but stay the course. Figure out how to write without using a computer program or else, because if you're using the computer program, what do we need you for?Kevin Lewandowski:Right. Have you ever just to see what it would look like, just prompt, Chappie, just to write you a random scene just to see what it would look like, and then compare it to your knowledge you have of being a professional writer forMichael Jamin:Many years. Well, a couple of months ago, my partner decided to put some prompts into chat, GPT to come up with story ideas for Come FD for the show we were on. He just read 'em to me. We were both laughing at how terrible they were. It was like a paragraph of what's going to happen in this episode. And it was interesting how it was able to glean what the show was and what it was like, but it was just such an oversimplification of what the show, it lacked any nuance. It was kind of stupid. It was like, nah, that's not, I know. That's what it was almost like asking a 4-year-old what you think the show is and the four year olds. Yeah. Okay. You're right. It's about firemen. Okay, sure. But other than that, the ideas were terrible.Kevin Lewandowski:Yeah. Another question from NRS creates, what are your thoughts on screenwriting competition websites like Cover Fly and the Blacklist? Is that a good way to get a script into people's hands? Thoughts on one act, scripts, one act plays? Do they have three acts?Michael Jamin:A lot of questions. I think you're the better person to answer the first part.Kevin Lewandowski:Yeah. So I've definitely submitted to some of those contests just to see A, if I would get any more B, what kind of feedback they would give. And a lot of times it's not very helpful feedback. And you've talked about, you have to question who these people are that are giving feedback, because chances are, they're not professional working writers right now. They would not have the time to go through 20, 30 scripts to give feedback. So chances are these could potentially be recent college graduates that are just doing what they think, what they learned in film school. And interestingly enough, I think Phil, he went through one competition. He sent me what the feedback was, and just reading it, I was like, this sounds very Chat, GPT ai. It was just very, because he sent me other ones he got, and I was like, okay, this feels like a person actually read this. This feels like it could have been put in chat, GPT, write a response based on what you think. And then when I said that to him, he was like, you might be right. He's like, you might be right. Interesting.Michael Jamin:Back when I was writing my book and I submitted to some publishers, whatever, a couple wrote back why they didn't like it, why they didn't want to option the book or whatever, and whatever. A couple of them, their feedback was like, no, it's clear to me you barely read it. Which I understand because these were low level publishing types editors. And on their weekend read, they probably had to read a couple dozen books, manuscripts, they're not going to give it full attention. And I was like, so some of the criticism, I was like, okay, that's a fair criticism. But no, but that is not, there's literally no truth in what you're saying there. 
You just phoned it in because you have to read so much over the weekend. So I don't know. Got to take, no one's going. I mean, it's the same thing for these websites. Are they really going to put their heart and soul into it? No. Why would TheyKevin Lewandowski:Don't care. They just want theMichael Jamin:Money. Yeah. Why would they? Yeah.Kevin Lewandowski:You think about someone in your position giving feedback to a fellow writer that might take you two and a half hours, read the script, think about your notes, and then put 'em in a format to be able to explain them to the writer. And I don't think these people in those competitions are doing that. They probably just read it once and write down what they think. And it's funny how some of them, it's what would you rank the character dialogue on a one to 10, and they write six and a half. It's like,Michael Jamin:Where are you gettingKevin Lewandowski:That from? One is six and half. So then what would've gotten me an eight or an half or a nine?Michael Jamin:One of the things we just started doing on their website, if you have the course, our screenwriting course, I have a couple of friends who are high level writers who are willing to give notes. But here's the thing, you're going to pay. It's not cheap. You're going to pay these people to sit down and read your damn script for two or three hours and they're not getting $10 an hour. That's not what they're going to get. I don't know what you get paid for,Kevin Lewandowski:I guess. So is this a good way to get your script into people's hands? So I think, yeah, mean it's technically people's hands, but I don't know ifMichael Jamin:I don't think they're the right hands.Kevin Lewandowski:Feedback is going to be any valuable. And then thoughts on one X Scripts. One X plays, do they have three x inherently?Michael Jamin:That's an interesting question. Do they have three acts? I would say yes, in terms of the structure, in terms of what makes something compelling, but not necessarily, I guess I've written some stories in my book that don't fall into the traditional three Acts structure, but they come close. They definitely come close to it. And that's just because, well, it doesn't really matter why, but you can't go wrong. You really can't go wrong if you structure something like the way we teach.Kevin Lewandowski:So in your opinion, because heard, sometimes people use a five act structure, and I think for me, I think it's basically the same three act structure, but so act one will be act one, and then Act two isMichael Jamin:ActKevin Lewandowski:Two A and then Act two B. And so it's kind of broken up like that. So for me,Michael Jamin:Well, Shakespeare wrote that way. Yeah.Kevin Lewandowski:And he's all right. He did.Michael Jamin:Yeah. I mean, I just think it's easier not to write. I just think three is easier to get your head around. Yeah.Kevin Lewandowski:Yeah. I think just the thought of hearing the words, so writing five acts, that just sounds like it can be a lot, but if you could be like, oh, three acts, okay, I can do that.Michael Jamin:Yeah. Right. Anyone could do that. Yeah.Kevin Lewandowski:Next topic, breaking in. DJ asked when starting out to obtain that experience, what sort of job should one be searching for, staff, writer, assistant, et cetera?Michael Jamin:You should be searching for the production assistant job anywhere, and eventually, after a season or two, see if you can move to a job that's closer to the writer's room. Physically, let's do what Kevin did. 
That's what he did.Kevin Lewandowski:And I think there's a staff writer that's obviously not entry level assistant. There's various assistant positions you could do production assistant, you can do showrunners, assistant executive assistant. I think one of the, or the terminologies people may get confused is writer's production assistant and then writer's assistant. And the writer's production assistant is the one that's responsible for getting the lunches, stocking the kitchen, making copies, things like that. And the writer's assistant is the one that sits in the room, types up the notes and the jokes that are being pitched. And they work closely with the script coordinator. And as you've said, many times, the writer's assistant is not an entry level job. It can be very intensive times.Michael Jamin:And for what's worth, I've worked with several assistants, either writer's, assistant production assistants, who've since gone on to become staff writers have had successful careers. So it's not like many. So Kevin, hopefully you'll be next.Kevin Lewandowski:Yeah, I'm hoping so too. Next question, Sammy. ak. So the best way to get a foot in the door to support and learn the biz write in assistant or pa, we kind of just answer that. Yeah. Production assistant is that entry level. You're kind of just the gopher and you're the whatever they kind of need you go do, and you prove yourself to those people above you. And they notice. Notice people notice when you're either calling it in or you're really going above and beyond to make whoever's ahead of you life a little bit easier. Yeah. All right. Now we got some miscellaneous. Oh, here's a fun question. Tulio, how close are you to officially publishing your book, Michael,Michael Jamin:It's already out tulio. You can go get it. You can find it. Sign copies are available@michaeljamin.com slash book. Or you could search for a paper orchestra on Amazon or Barnes and Noble, or the audio book on Audible or Spotify or Apple. How about that?Kevin Lewandowski:Get the book. Everyone get the book. The comment to address from Jonathan Loudon, real world dilemma. I like this. Can't get experience without getting hired. Can't get hired without experience. That's why, who is such a reality?Michael Jamin:Well, but if you're starting off in an entry level position, you don't need to know anybody. You just have to put yourself out there. And then in terms of knowing someone later in your job, well, now you already know people. Now you broke because entry levels, literally, you have a pulse in a car. So I find that it's a convenient excuse. Put yourself out there, and Kevin, you didn't have any contacts when you broke into Hollywood. None. So there you go.Kevin Lewandowski:You just got to knock on some doors. I think people that work in the industry, they know kind of how it works. Once you break in, you become a pa, and you make those network connections with production coordinators that you've worked with and people on the show, and you build those genuine relationships and you do good. Then when they go to the next show and they're like, Hey, we need someone, then they'll reach out to you andMichael Jamin:They're not reaching out for you because they're as a favor to you. They're reaching out to you because we need to hire someone. And I don't really want to spend days interviewing.Kevin Lewandowski:I already know you can do the job. It's so much easier just to bring you aboard.Michael Jamin:Yeah, right. It's not like a favor to you. 
It's a favor to them.Kevin Lewandowski:Yeah.Michael Jamin:You are listening to, what the Hell is Michael Jamin talking about? Today's episode is brought to you by my new book, A Paper Orchestra, A Collection of True Stories. John Mayer says, it's fantastic. It's multi timal. It runs all levels of the pyramid at the same time, his knockout punches are stinging, sincerity, and Kirker View says, those who appreciate the power of simple stories to tell us about human nature or who are bewitched by a storyteller who has mastered his craft, will find a delightful collection of vignettes, a lovely anthology that strikes a perfect balance between humor and poignancy. So my podcast is not advertiser supported. I'm not running ads here. So if you'd like to support me or the podcast, come check out my book, go get an ebook or a paperback, or if you really want to treat yourself, check out the audio book. Go to michael jamin.com/book, and now back to our show.Kevin Lewandowski:Next question, all nighters cinema, what makes your script stand out? If it's a book adaptation and the story isn't your original story,Michael Jamin:Well, do you have the rights to adapt? A book is one question. So if you don't, I probably wouldn't adapt it. And that's not to say that when people think you adapt a book, you still have to have these act break pops. These scenes have to unfold. It's not like books are a slam dunk to adapt. I mean, there's definitely some art and craft that has to be applied to turning into a script. So that's how you make it stand out.Kevin Lewandowski:And I think one of the other things you like to say is if you have a book, there might be a few different stories happening throughout that book. And in your paper orchestra, one of the examples you get, oh, I forget what it was called about the swing dance, and I forgot that chapterMichael Jamin:Was called Yes, swing and a Miss.Kevin Lewandowski:Yeah. As you said, there was other stuff happening at that point in your life, but it was just this story was the one you wanted to tell. Of course you were going to work and doing stuff like that, but this was the story you wanted to tell.Michael Jamin:Right. And also, how many times have you seen they've adapted a book, I don't know, a popular book into a TV show movie? And sometimes it's good and sometimes it's bad. It's because it's not as simple as simply typing the book.Kevin Lewandowski:And a lot of times people say the book was even better or the book was better anyways. And I mean, it's hard to take 300 pages of a book and consented toMichael Jamin:An hour and a half movie. Right.Kevin Lewandowski:David Sallow, what if you a show idea that you have done the work on and think it uniquely speaks to the present moment? Are there any shortcuts possible there or noMichael Jamin:Shortcuts to what? You got to write a script. Yeah. There's no shortcuts to write in a good script, and there's no shortcuts to selling it. There's no shortcuts anywhere. Shortcuts. When does shortcuts ever work? I don't know. Where are the shortcuts? Yeah, little Ed riding Hood. Other than that, in real life, you got to put the work in. Right.Kevin Lewandowski:Do you ever watch the, there's a documentary about the South Park creators and how from they, from blank page to delivering the episode, how many days do you think,Michael Jamin:Well, I know they're super fast, so I would say five,Kevin Lewandowski:Six.Michael Jamin:Six.Kevin Lewandowski:Okay. Six days. That's very fast. 
They are delivering it like a half hour before it's supposed to. Yeah.Michael Jamin:And that's because the animation process is so crude that they can do it so quickly, but that's fast,Kevin Lewandowski:And we've just gotten used to it that way. So I think with them in an interesting way, that's why their shows seem like their current and present, because something could have happened in the news last week, and then that episode could air next week. Whereas other animation shows, and I know you've worked in animation, sometimes it's seven, eight months before that episode,Michael Jamin:Or it could be nine months, nine months animated show. So yeah, you don't do anything top of one within in an animated show, not the ones I've done.Kevin Lewandowski:Yeah. Next question. What if I wrote lyrics to the theme song? Is that okay to include? I think this might be in the context of one of the things you say in your scripts, don't write music cues. Don't write, don't put song lyrics in there, or something like that.Michael Jamin:I mean, if you think you got fantastic lyrics and you're going to really impress the hell out of someone, but you still have to, when I'm reading the script, I have to imagine what the music is, and I'm not going to imagine the music. And I suppose you can write the lyrics and maybe some people will read it and some won't. So it's up to you. Do you really think it's fantastic or not?Kevin Lewandowski:I had a couple scripts that I put part of a song in there and then listening to, I'm like, no, it's coming out, taking it out.Michael Jamin:In my opinion, there's really no, I'm not crazy about reading that.Kevin Lewandowski:Yeah,Michael Jamin:I mean, maybe others are, I don't know.Kevin Lewandowski:Well, I think, I think back to my script, it was I just kind of being lazy. Could I take that three eighths of a page and add something in there that's going to help move the storyline further, or was I just looking for a, what's a funny moment I could have right now?Michael Jamin:Right. Okay.Kevin Lewandowski:Let's see. From Aaron, in terms of recognizing good writing, writing, what is considered too much in terms of providing direction to actors, description of character, thoughts and emotions, et cetera?Michael Jamin:The less the better, in my opinion. You don't want let the actors do their job, and if you feel you can't convey the anger in a scene or the love in a scene with dialogue and you're yelling at the actors, do it this way, then you haven't done your job as the writer do your job. Not everyone else's. As far as action lines go, I am of the camp that the shorter the better because most writers or most people reading do not want to read your action line. I suppose one day, if get, I think when you get more successful, if you're Aaron Sorkin, you can write whatever the hell you want. You're, because he writes his actions line. I imagine poetry, it's probably his action lines are probably just as interesting as his dialogue because he's such a great writer, but don't count on it when you're starting off.Kevin Lewandowski:I was reading something, I forgot who the actor was, but they said, the actor always requested that their script have commas and apostrophes taken out of dialogue because they felt like they didn't want someone telling them how to say things. And I was like, I can respect as an actor, but I was like, that poor script coordinator, they have to go through that whole script again and take everything out.Michael Jamin:That's a little bit much to me. 
It seems like putting a comma there is like that's just grammar. And if they wanted to take it out, I think they should do it themselves, but whatever,Kevin Lewandowski:From Jonathan Loudon, again, how many feature films have you written, pitched, but never sold?Michael Jamin:Well, we wrote one completely as a spec, and that did not sell, but that got us a producer interested in our writing, and then we wrote two more that did sell as pitches. We pitched them first, then we got paid to write the script. And as far as I can remember, I don't think we wrote any other feature scripts. I think we maybe had some ideas that were batting around, but we never actually pitched or wrote, but we work mostly in tv.Kevin Lewandowski:So do you know, because from what I can recall, you've never sold a feature that actually went into production, correct. Right,Michael Jamin:Right. They they never do.Kevin Lewandowski:Yeah. And how do you think you would feel, because as you say, tv, the showrunner head writer has the final say, and on a feature, it's the director that has the final say. I worked with someone, his name's Steve Rudnick, and he wrote Space Jam and the Santa Clause movies with Tim Allen, and he told me this story how he was at a baseball game and he saw someone walking down the aisle and it had a Space jam cast and crew jacket. And he asked the guy and he was like, can I ask you where you got that jacket? That's a really cool jacket. And he's like, oh, I worked on production. This was all our rap gifts, and Steve never got one because writers usually aren't part of the production aspect onMichael Jamin:Feature, and he was accredited writer on it. Right. That's what an actor thought he was. Yeah. Yeah. I think that's probably common. I don't know why people want to become writers on movies. I mean, it would be cool, but maybe he was heavily rewritten. Maybe he was, I don't know.Kevin Lewandowski:He was so bummed. Yeah.Michael Jamin:Yeah. He wasn't invited to anything.Kevin Lewandowski:Yeah. Right. Geo, could you elaborate on the things not to say to executives or some examples of what the producer said?Michael Jamin:What the producer said? I'm not sure I answered the question.Kevin Lewandowski:So can you elaborate on the things, so I guess as a writer, and maybe you gave your script to an executive and they were giving you feedback or said, Hey, maybe do this, do this. How would you respond to those notes?Michael Jamin:Yeah, you want to be positive. Great. We'll work on that. Thank you. Good idea. Interesting thought. We'll definitely do our best with that, and then later, hopefully you can take 90% of the notes and the ones you can't take, you say, I think we address the spirit of your note. Even if we couldn't address your notes or this one, we couldn't make it work occasionally, but you're doing 90% of the notes. Yeah.Kevin Lewandowski:I think the phrase I would always hear on notes calls is, okay, well, yeah, we'll take a look at it. We'll take a look at that. Yeah,Michael Jamin:We'll take a look at it. Yeah. We,Kevin Lewandowski:Next question from Cody, with short seasons, freelance opportunities have mostly gone away, but are there still opportunities for freelance, and if so, how are writers polled in for those?Michael Jamin:I don't know. That's a good question because that's a question. You'd have to look that up with the Writer's Guild. I don't remember on our last show there, I don't recall ever having those guys doing freelance, giving off freelance episodes to anyone. 
So it used to be a Writer's Guild mandate if the show was a certain length that they had to give out a certain number of freelancers. And now maybe they don't have to, but I wouldn't either way get it out of your head that you're ever going to sell a freelance episode because it's just so over my 28 years, I think I've sold maybe three freelance episodes and I would do more. It's not a problem. It's just that they're really hard to get.Kevin Lewandowski:And I think a lot of times what happens in writer's rooms is those writer's assistants and script coordinators that have proved their worth for a couple of seasons. If that opportunity comes for them to get a freelance episode, the showrunner helps 'em out with that, and that helps them get into the Writer's Guild and things likeMichael Jamin:That. That's usually a bone you throw those support staff after they've been there a couple of years.Kevin Lewandowski:That's a nice bonus. It's a nice check to get. Next question, David Campbell. Does the creator continue to have involvement or do you teach them on the job?Michael Jamin:If someone creates the show and they are not the showrunner, which just happened on a couple shows we've done. We were not the showrunner and the creator had involved. They were on the writing staff, but they didn't have any say. They didn't have the final say or anything. If we are the showrunner, whoever's the runner has final say. Yeah.Kevin Lewandowski:Next question, nerds and friends, how many writers' rooms are virtual remote nowadays? What is the path to becoming a showrunner? Is it a writer pivoting into that role? I can imagine producing experience helps.Michael Jamin:No, so a showrunner is the head writer. The way you become a showrunner is by being a writer on many shows and being good at writing, and then the producing aspect of the job. You kind of learn on the job as you rise up the ranks. You don't have to take a course or there's no certification, and it's something you can fake.Kevin Lewandowski:For me, I never really understood what the word producer meant. No one in the context of television, because it's working in the industry, you learn, okay, writers can be producers, but then sometimes accountants, if they're high enough, they can also be producers. And not every producer is necessarily like the creative vision. Some of them deal with the money aspect of it.Michael Jamin:Yeah. They're non-writing producers or non-writing executive producers, they'reKevin Lewandowski:Called. Yeah. Next question, K with an asterisk next to it. Are series filmed for streaming services similar to TV regarding creative control for the show runner?Michael Jamin:Yeah. Yes.Kevin Lewandowski:Easy question. Yeah, all-nighter cinema. How different is trying to greenlight a serial TV show versus a mini series?Michael Jamin:It just depends on what the network, usually they're buying series. They're not buying mini series there. Sometimes they're buying limited series. It just depends on the network. And I wouldn't even approach, again, your goal is to write one great script as a writing sample, and it's not to time the market and figure out who's buying what. Can you write a script? Answer that question first,Kevin Lewandowski:Right? If a studio buys your pilot but ends up passing and an exec at another studio is interested, how realistic is it that they'll buy it againMichael Jamin:If the first one will buy it?Kevin Lewandowski:I don't know. 
I'm wondering if they're asking just because one studio passes on your script, does that mean every studio is going to pass on it?Michael Jamin:No. No. Usually if you're lucky, you pitch to five studios and one buys it. That's how they don't all want to buy it. You're lucky if one wants to buy it. But again, what's frustrating about all these questions that we're hearing is everyone's saying, how do I make money selling a script? And no one's saying, how do I write a good script? Everyone is already assuming that. It's just so damn frustrating. It's like, guys, what do you think? How do you think this is going to work? It's not about the meeting. It's about writing a damn good script. First thing's first. So I don't know, what are you going to do? I yell into the wind. People don't listen to me on this.Kevin Lewandowski:I listen. They'll listen. They'll listen. Yeah. I mean, I think there's almost this weird delusion that people think they're going to move out here within a year. They're going to have their own show. And I was just talking to someone the other day that they're going to USC, and she was talking about kind of her timeline with things, and she said, I want to give myself five years from when I graduate in 2025 to try to get into a writer's room. And when she said that to me, I said, very realistic. That's not too quick that, because there's a lot of luck of, IMichael Jamin:Thought you were going to say have her own show on the air.Kevin Lewandowski:No, no. She was very much, if I can be in a writer's room in five years. So I thought, yeah, because tough, because if you can get on that show that season one, it's not a hit yet, then it becomes a hit that can definitely fast track you a little bit. And my struggle has been, none of the pilots I've worked on have gotten picked up and shows have gotten canceled. And I'd like to believe that's not my fault, but it's hard to look at the No, I'm kidding. I'm kidding.Michael Jamin:But yeah. But it's a little frustrating when people ask these questions sound to me like when I hit a grand Slam, who do I high five first? They're like, dude, can you get on base? Do you know how to get a base hit? What are you talking about? Just get a base hit first. So that's what it sounds like to me. And I wish people would just have more realistic expectations and would take a little more, everyone's assuming they already knew how to do the hard part.Kevin Lewandowski:Yeah. Next question, given that streaming has changed the face of sitcom series writing, how do you feel about the future of the industry? Are there days of having full writer's room and staff over?Michael Jamin:It certainly seems that way, but who knows right now, if you follow what's going on, it seems like, it seems like everything's becoming, we're slowly moving back to the old days. There's going to be fewer streamers. They're going to be consolidation. They're already talking about these big streamers merging. And when that happens, things will change, but we don't really know. Right now, the industry's at a crossroads. They're not picking up a lot of shows. Now. They will pick up start. That will happen. And imagine a couple of, it can't go on much longer. They got to have to start pulling the trigger and start making TV shows again. So we don't know. We're at the crossroads,Kevin Lewandowski:Because I think you said back when you were working on, just Shoot Me In, I think you said King of the Hill, there was more than 15 writers on King. KingMichael Jamin:Of the Hill. 
We had 20 writers in King of the Hill, and we were do 22 episodes in a season.Kevin Lewandowski:And how many were on Just Shoot Me?Michael Jamin:Well, let's see. In the beginning, I would say it's closer to maybe 10 or so, maybe 12 at some point.Kevin Lewandowski:And in your experience, do you think comedy rooms always have more writersMichael Jamin:Than drama? I don't know. I mean, it just really depends on the budget of the show and how many episodes you're going to be doing.Kevin Lewandowski:I think I was watching something about Breaking Bad, and I think they had six writers.Michael Jamin:Oh, really? That's it.Kevin Lewandowski:Wow. On why Women Kill. We had five.Michael Jamin:The thing about drama is that you don't have to, it is easier in the sense that when you're writing a comedy, you still need to have that structure. You still need to come up with a story that is engaging, but it also has to be funny. But when you're doing the drama, you just need to come up with an engaging story, and it doesn't have to be funny, and you don't have to punch up the lines. And in that sense, I do think it's a little easier, but that's not to say writing Breaking Bad is easy. I mean, what a great show that works.Kevin Lewandowski:Right, right. Next question from maybe, are there tutorials and Final Draft, a proper guide for making your script presentation acceptable?Michael Jamin:What do you think? I don't know. I haven't looked at the tutorials.Kevin Lewandowski:Yeah. I mean, I think the nice thing about Final Draft is they have pre-built templates that you can use. So if you're writing a Multicam, it'll prebuilt that template and everything will automatically be capitalized for you. And same thing with Single Cam. And I think one of the things you always say is when you hand your script to someone, they're not going to know you use Final Draft or one of these other programs to write the script. They're just going to get a printed out version. And I think there's minimal things you need to do, make sure the dialogue is in the middle of the page and certain things are capitalized, and there's a certain format formatting of that. But Final Draft can take care of all that too. So when you're done writing, you just hit file, export as PDF, and that's it. You're done. All the four is done.Michael Jamin:I mean, final Draft, like you said, has those templates, and it'll make your script look like a script, which is great. You got a script, you got something that looks like a script, but does it read like a script?Kevin Lewandowski:Right. Har Draft does not do that for you. Yeah, it won'tMichael Jamin:Do that.Kevin Lewandowski:Michael's course does.Michael Jamin:Yeah. I hope,Kevin Lewandowski:Lorenzo, given your friendship with the late David Bellini, have you got any insights on Italian films, TV industry, in your opinion? Is there any difference? Thank you.Michael Jamin:From what I knew from David. David when he was a lot, the difference is enormous. It's a whole different film structure over there. It's not so much of an industry as it is. I don't know. It sounded like really hard. And he was pretty successful. He worked on a bunch of shows, and he moved to LA to Hollywood because he was like, this is too crazy here. This is just not enough work. So I think it was a miracle that he was as successful as he was there, but it's a whole different ballgameKevin Lewandowski:If the script doesn't have scenes in it. How should it be written? Is it just dialogue and descriptions? 
Do you have any advice for someone who wants to be a script doctor?Michael Jamin:Okay. The script does have to have scenes in it. It can't be all one scene. That's not going to be acceptable. A script doctor is not really, that's some bullshit that people say on the internet. No one I've ever met ever called themselves a script doctor. We're all screenwriters. And sometimes you sell your own work, and sometimes you're brought in to rewrite somebody else's, and there's no script doctor. You don't get a degree and you don't wear a stethoscope. And that's not a job. It's just sometimes will get paid to rewrite someone else's script, but you'll only get that job if you're a really good writer and you've written some really good scripts on your own. And then when you do, usually you're like, hell, I'll just write. I want to write my own stuff. And you're brought in to change someone else's script because it's like, all right, someone's giving me money and here's a job, and I'm in between jobs, so I'll do it.Kevin Lewandowski:There's no shortcuts. A couple more questions, Aaron. How many followers, subscribers would someone need to have on social media for that to be interesting and asset to a studio or showrunner?Michael Jamin:Literally have no idea. And I'm not sure it would be interesting to a showrunner at all as far as the studio, in terms of being a writer. You're not expected to have a social media following at all. I just happen to have one, but it's not right. No one's, no one ever asked me, no one really cares. The benefit is I can promote my own stuff. I have a following, but for a writer, you don't need that.Kevin Lewandowski:Yeah. And then our last question, is it okay to make the size of the words on the title page a little bit bigger?Michael Jamin:I suppose it is. I don't try to do anything fancy, but I don't know why you want to. It's okay if you want to. It's not desperate, but I don't know. I try to make it, I want my script to look like just an ordinary script. I want the pages themselves, the dialogue to stand out. I'm not really trying to make the cover page stand out.Kevin Lewandowski:Yeah, I think it's like when writing any paper you did in college or whatever the title is, 18 font, and then the stuffy writing is 12 font or whatever.Michael Jamin:Yeah, you can do that.Kevin Lewandowski:Yeah. I think one of the things you said is the title page. No one necessarily cares about that. If you put a fancy image on there, that's not going to, people aren't going to be like, oh, we got to hire this person. We got to hire this person right now.Michael Jamin:Yeah. Don't even give any thought to the title. I mean, really. You're not going to fool anybody. So yeah.Kevin Lewandowski:Well, that is all the questions we have from that webinar.Michael Jamin:Wow. Excellent. Kevin, you did really well. You're a natural here. Thanks. Yeah. Alright, everyone. Thank you. Please continue coming to our webinars. We do 'em every few weeks. To sign up, go to michael jamin.com/webinar. I got a book out. I hope you all get it. Sign copies are available @michaeljamin.com slash book. And if you want to come see me on tour, go to michael jamin.com/upcoming. Kevin, where can people find you?Kevin Lewandowski:I'm on social media, Kevin Lewandowski. Sorry it's a very long last name. It gets butchered a lot, but I'm there. And yeah, I occasionally make appearances with Michael on these webinars and things like that. So yeah. 
Thank you to all who've been coming to the webinars and checking out Michael's stuff. Just go to michaeljamin.com and just start clicking around. There's a bunch of stuff: you can get his free scripts, stuff he's written. There's free lessons up there. Every podcast we do gets uploaded there. You can spend hours on that website. Just go there, click around, buy the book by Michael Jamin.Michael Jamin:The book. Thank you so much, buddy. Alright. You're just going to stick around. Kevin's going to be back next week for another episode. I believe it's next week. We will see when it drops, but he's going to be back around for another one. Alright, everyone, until then, keep writing, keep being creative and all that stuff. Thanks so much.Michael Jamin:Wow. I did it again. Another fantastic episode of What the Hell Is Michael Jamin Talking About? How do I do it week after week? Well, I don't do it with advertiser-supported money. I'll tell you how I do it. I do it with my book. If you'd like to support the show, if you'd like to support me, go check out my new book, A Paper Orchestra. It asks the question: what if it's the smallest, almost forgotten moments that are the ones that shape us most? Laura Sanoma says good storytelling also leads us to ourselves, our memories, our beliefs, personal and powerful. I loved the journey. And Max Munic, who was on my show, says, as the father of daughters, I found Michael's understanding of parenting and the human condition to be spot on. This book is a fantastic read. Go check it out for yourself. Go to michaeljamin.com/book. Thank you all and stay tuned. More great stuff coming next week.

Latent Space: The AI Engineer Podcast — CodeGen, Agents, Computer Vision, Data Science, AI UX and all things Software 3.0

Speaker CFPs and Sponsor Guides are now available for AIE World's Fair — join us on June 25-27 for the biggest AI Engineer conference of 2024!

Soumith Chintala needs no introduction in the ML world — his insights are incredibly accessible across Twitter, LinkedIn, podcasts, and conference talks (in this pod we'll assume you'll have caught up on the History of PyTorch pod from last year and cover different topics). He's well known as the creator of PyTorch, but he's more broadly the Engineering Lead on AI Infra, PyTorch, and Generative AI at Meta.

Soumith was one of the earliest supporters of Latent Space (and more recently AI News), and we were overjoyed to catch up with him on his latest SF visit for a braindump of the latest AI topics, reactions to some of our past guests, and why Open Source AI is personally so important to him.

Life in the GPU-Rich Lane

Back in January, Zuck went on Instagram to announce their GPU wealth: by the end of 2024, Meta will have 350k H100s. By adding all their GPU clusters, you'd get to 600k H100-equivalents of compute. At FP16 precision, that's ~1,200,000 PFLOPS. If we used George Hotz's (previous guest!) "Person of Compute" measure, Meta now has 60k humans of compute in their clusters. Occasionally we get glimpses into the GPU-rich life; on a recent ThursdAI chat, swyx prompted PaLM tech lead Yi Tay to write down what he missed most from Google, and he commented that UL2 20B was trained by accidentally leaving the training job running for a month, because hardware failures are so rare in Google.

Meta AI's Epic LLM Run

Before Llama broke the internet, Meta released an open source LLM in May 2022, OPT-175B, which was notable for how “open” it was - right down to the logbook! They used only 16 NVIDIA V100 GPUs and Soumith agrees that, with hindsight, it was likely under-trained for its parameter size.

In Feb 2023 (pre Latent Space pod), Llama was released, with a 7B version trained on 1T tokens alongside 65B and 33B versions trained on 1.4T tokens. The Llama authors included Guillaume Lample and Timothée Lacroix, who went on to start Mistral.

July 2023 was Llama2 time (which we covered!): 3 model sizes, 7B, 13B, and 70B, all trained on 2T tokens. The three models accounted for a grand total of 3,311,616 GPU hours for all pre-training work. CodeLlama followed shortly after, a fine-tune of Llama2 specifically focused on code generation use cases. The family had models in the 7B, 13B, 34B, and 70B size, all trained with 500B extra tokens of code and code-related data, except for 70B which is trained on 1T.

All of this on top of other open sourced models like Segment Anything (one of our early hits!), Detectron, Detectron 2, DensePose, and Seamless, and in one year, Meta transformed from a company people made fun of for its “metaverse” investments to one of the key players in the AI landscape and its stock has almost tripled since (about $830B in market value created in the past year).

Why Open Source AI

The obvious question is why Meta would spend hundreds of millions on its AI efforts and then release them for free. Zuck has addressed this in public statements. But for Soumith, the motivation is even more personal:

“I'm irrationally interested in open source. I think open source has that fundamental way to distribute opportunity in a way that is very powerful. Like, I grew up in India… And knowledge was very centralized, but I saw that evolution of knowledge slowly getting decentralized. And that ended up helping me learn quicker and faster for like zero dollars.
And I think that was a strong reason why I ended up where I am. So like that, like the open source side of things, I always push regardless of like what I get paid for, like I think I would do that as a passion project on the side…

…I think at a fundamental level, the most beneficial value of open source is that you make the distribution to be very wide. It's just available with no friction and people can do transformative things in a way that's very accessible. Maybe it's open source, but it has a commercial license and I'm a student in India. I don't care about the license. I just don't even understand the license. But like the fact that I can use it and do something with it is very transformative to me…

…Like, okay, I again always go back to like I'm a student in India with no money. What is my accessibility to any of these closed source models? At some scale I have to pay money. That makes it a non-starter and stuff. And there's also the control issue: I strongly believe if you want human aligned AI, you want all humans to give feedback. And you want all humans to have access to that technology in the first place. And I actually have seen, living in New York, whenever I come to Silicon Valley, I see a different cultural bubble.”

We like the way Soumith put it last year: Closed AI “rate-limits against people's imaginations and needs”!

What It Takes For Open Source AI to Win

However Soumith doesn't think Open Source will simply win by popular demand. There is a tremendous coordination problem with the decentralized nature of the open source AI development right now: nobody is collecting the valuable human feedback in the way that OpenAI or Midjourney are doing.

“Open source in general always has a coordination problem. If there's a vertically integrated provider with more resources, they will just be better coordinated than open source. And so now open source has to figure out how to have coordinated benefits. And the reason you want coordinated benefits is because these models are getting better based on human feedback. And if you see with open source models, like if you go to the /r/localllama subreddit, like there's so many variations of models that are being produced from, say, Nous research. I mean, like there's like so many variations built by so many people. And one common theme is they're all using these fine-tuning or human preferences datasets that are very limited and they're not sufficiently diverse. And you look at the other side, say front-ends like Oobabooga or like Hugging Chat or Ollama, they don't really have feedback buttons. All the people using all these front-ends, they probably want to give feedback, but there's no way for them to give feedback… So we're just losing all of this feedback. Maybe open source models are being as used as GPT is at this point in like all kinds of, in a very fragmented way, like in aggregate all the open source models together are probably being used as much as GPT is, maybe close to that. But the amount of feedback that is driving back into the open source ecosystem is like negligible, maybe less than 1% of like the usage.
So I think like some, like the blueprint here I think is you'd want someone to create a sinkhole for the feedback… I think if we do that, if that actually happens, I think that probably has a real chance of the open source models having a runaway effect against OpenAI, I think like there's a clear chance we can take at truly winning open source.”

If you're working on solving open source coordination, please get in touch!

Show Notes

* Soumith Chintala Twitter
* History of PyTorch episode on Gradient Podcast
* The Llama Ecosystem
* Apple's MLX
* Neural ODEs (Ordinary Differential Equations)
* AlphaGo
* LMSys arena
* Dan Pink's "Drive"
* Robotics projects:
* Dobb-E
* OK Robot
* Yann LeCun
* Yangqing Jia of Lepton AI
* Ed Catmull
* George Hotz on Latent Space
* Chris Lattner on Latent Space
* Guillaume Lample
* Yannic Kilcher of OpenAssistant
* LMSys
* Alex Atallah of OpenRouter
* Carlo Sferrazza's 3D tactile research
* Alex Wiltschko of Osmo
* Tangent by Alex Wiltschko
* Lerrel Pinto - Robotics

Timestamps

* [00:00:00] Introductions
* [00:00:51] Extrinsic vs Intrinsic Success
* [00:02:40] Importance of Open Source and Its Impact
* [00:03:46] PyTorch vs TinyGrad
* [00:08:33] Why PyTorch is the Switzerland of frameworks
* [00:10:27] Modular's Mojo + PyTorch?
* [00:13:32] PyTorch vs Apple's MLX
* [00:16:27] FAIR / PyTorch Alumni
* [00:18:50] How can AI inference providers differentiate?
* [00:21:41] How to build good benchmarks and learnings from AnyScale's
* [00:25:28] Most interesting unexplored ideas
* [00:28:18] What people get wrong about synthetic data
* [00:35:57] Meta AI's evolution
* [00:38:42] How do you allocate 600,000 GPUs?
* [00:42:05] Even the GPU Rich are GPU Poor
* [00:47:31] Meta's MTIA silicon
* [00:50:09] Why we need open source
* [00:59:00] Open source's coordination problem for feedback gathering
* [01:08:59] Beyond text generation
* [01:15:37] Osmo and the Future of Smell Recognition Technology

Transcript

Alessio [00:00:00]: Hey everyone, welcome to the Latent Space podcast. This is Alessio, partner and CTO in residence at Decibel Partners, and I'm joined by my co-host Swyx, founder of Smol AI.Swyx [00:00:15]: Hey, and today we have in the studio Soumith Chintala, welcome.Soumith [00:00:17]: Thanks for having me.Swyx [00:00:18]: On one of your rare visits from New York where you live. You got your start in computer vision at NYU with Yann LeCun. That was a very fortuitous start. I was actually listening to your interview on the Gradient podcast. So if people want to know more about the history of Soumith, history of PyTorch, they can go to that podcast. We won't spend that much time there, but I just was marveling at your luck, or I don't know if it's your luck or your drive to find AI early and then find the right quality mentor because I guess Yan really sort of introduced you to that world.Soumith [00:00:51]: Yeah, I think you're talking about extrinsic success, right? A lot of people just have drive to do things that they think is fun, and a lot of those things might or might not be extrinsically perceived as good and successful. I think I just happened to like something that is now one of the coolest things in the world or whatever. But if I happen, the first thing I tried to become was a 3D VFX artist, and I was really interested in doing that, but I turned out to be very bad at it. So I ended up not doing that further. But even if I was good at that, whatever, and I ended up going down that path, I probably would have been equally happy.
It's just like maybe like the perception of, oh, is this person successful or not might be different. I think like after a baseline, like your happiness is probably more correlated with your intrinsic stuff.Swyx [00:01:44]: Yes. I think Dan Pink has this book on drive that I often refer to about the power of intrinsic motivation versus extrinsic and how long extrinsic lasts. It's not very long at all. But anyway, now you are an investor in Runway, so in a way you're working on VFX. Yes.Soumith [00:02:01]: I mean, in a very convoluted way.Swyx [00:02:03]: It reminds me of Ed Catmull. I don't know if you guys know, but he actually tried to become an animator in his early years and failed or didn't get accepted by Disney and then went and created Pixar and then got bought by Disney and created Toy Story. So you joined Facebook in 2014 and eventually became a creator and maintainer of PyTorch. And there's this long story there you can refer to on the gradient. I think maybe people don't know that you also involved in more sort of hardware and cluster decision affair. And we can dive into more details there because we're all about hardware this month. Yeah. And then finally, I don't know what else, like what else should people know about you on a personal side or professional side?Soumith [00:02:40]: I think open source is definitely a big passion of mine and probably forms a little bit of my identity at this point. I'm irrationally interested in open source. I think open source has that fundamental way to distribute opportunity in a way that is very powerful. Like, I grew up in India. I didn't have internet for a while. In college, actually, I didn't have internet except for GPRS or whatever. And knowledge was very centralized, but I saw that evolution of knowledge slowly getting decentralized. And that ended up helping me learn quicker and faster for zero dollars. And I think that was a strong reason why I ended up where I am. So the open source side of things, I always push regardless of what I get paid for, like I think I would do that as a passion project on the side.Swyx [00:03:35]: Yeah, that's wonderful. Well, we'll talk about the challenges as well that open source has, open models versus closed models. Maybe you want to touch a little bit on PyTorch before we move on to the sort of Meta AI in general.PyTorch vs Tinygrad tradeoffsAlessio [00:03:46]: Yeah, we kind of touched on PyTorch in a lot of episodes. So we had George Hotz from TinyGrad. He called PyTorch a CISC and TinyGrad a RISC. I would love to get your thoughts on PyTorch design direction as far as, I know you talk a lot about kind of having a happy path to start with and then making complexity hidden away but then available to the end user. One of the things that George mentioned is I think you have like 250 primitive operators in PyTorch, I think TinyGrad is four. So how do you think about some of the learnings that maybe he's going to run into that you already had in the past seven, eight years almost of running PyTorch?Soumith [00:04:24]: Yeah, I think there's different models here, but I think it's two different models that people generally start with. Either they go like, I have a grand vision and I'm going to build a giant system that achieves this grand vision and maybe one is super feature complete or whatever. Or other people say they will get incrementally ambitious, right? 
And they say, oh, we'll start with something simple and then we'll slowly layer out complexity in a way that optimally applies Huffman coding or whatever. Like where the density of users are and what they're using, I would want to keep it in the easy, happy path and where the more niche advanced use cases, I'll still want people to try them, but they need to take additional frictional steps. George, I think just like we started with PyTorch, George started with the incrementally ambitious thing. I remember TinyGrad used to be, like we would be limited to a thousand lines of code and I think now it's at 5,000. So I think there is no real magic to which why PyTorch has the kind of complexity. I think it's probably partly necessitated and partly because we built with the technology available under us at that time, PyTorch is like 190,000 lines of code or something at this point. I think if you had to rewrite it, we would probably think about ways to rewrite it in a vastly simplified way for sure. But a lot of that complexity comes from the fact that in a very simple, explainable way, you have memory hierarchies. You have CPU has three levels of caches and then you have DRAM and SSD and then you have network. Similarly, GPU has several levels of memory and then you have different levels of network hierarchies, NVLink plus InfiniBand or Rocky or something like that, right? And the way the flops are available on your hardware, they are available in a certain way and your computation is in a certain way and you have to retrofit your computation onto both the memory hierarchy and like the flops available. When you're doing this, it is actually a fairly hard mathematical problem to do this setup, like you find the optimal thing. And finding the optimal thing is, what is optimal depends on the input variables themselves. So like, okay, what is the shape of your input tensors and what is the operation you're trying to do and various things like that. Finding that optimal configuration and writing it down in code is not the same for every input configuration you have. Like for example, just as the shape of the tensors change, let's say you have three input tensors into a Sparstar product or something like that. The shape of each of these input tensors will vastly change how you do this optimally placing this operation onto the hardware in a way that will get you maximal throughput. So a lot of our complexity comes from writing out hundreds of configurations for each single PyTorch operator and templatizing these things and symbolically generating the final CUDA code or CPU code. There's no way to avoid it because mathematically we haven't found symbolic ways to do this that also keep compile time near zero. You can write a very simple framework, but then you also should be willing to eat the long compile time. So if searching for that optimal performance at runtime, but that's the trade off. There's no, like, I don't think unless we have great breakthroughs George's vision is achievable, he should be thinking about a narrower problem such as I'm only going to make this for work for self-driving car connets or I'm only going to make this work for LLM transformers of the llama style. Like if you start narrowing the problem down, you can make a vastly simpler framework. 
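To make the trade-off Soumith is describing concrete, here is a deliberately simplified sketch of why a general framework ends up carrying many pre-written configurations per operator: the best kernel parameters depend on the input shapes and dtypes, so a framework that wants good performance at near-zero compile time ships a table of tuned configurations and picks one at runtime instead of searching. This is not PyTorch's actual dispatch code; the registry, bucketing heuristic, and tile sizes below are invented purely for illustration.

```python
# Illustrative sketch only: a toy "kernel registry" that picks a pre-tuned
# configuration based on input shape and dtype, rather than searching for the
# optimum at runtime (which would blow up compile time).
from dataclasses import dataclass


@dataclass(frozen=True)
class KernelConfig:
    tile_m: int          # tile sizes a (hypothetical) generated kernel would use
    tile_n: int
    vector_width: int


# A hand-written table of tuned configurations, keyed by coarse shape buckets.
# A real framework template-generates hundreds of these per operator.
_TUNED_CONFIGS = {
    ("small", "float16"): KernelConfig(tile_m=32, tile_n=32, vector_width=4),
    ("small", "float32"): KernelConfig(tile_m=32, tile_n=32, vector_width=2),
    ("large", "float16"): KernelConfig(tile_m=128, tile_n=256, vector_width=8),
    ("large", "float32"): KernelConfig(tile_m=128, tile_n=128, vector_width=4),
}


def _bucket(shape):
    # Collapse exact shapes into coarse buckets; real heuristics are far finer-grained.
    return "large" if shape[0] * shape[1] >= 1 << 20 else "small"


def pick_config(shape, dtype="float16"):
    """Return a pre-tuned config for this (shape, dtype) instead of tuning at runtime."""
    return _TUNED_CONFIGS[(_bucket(shape), dtype)]


if __name__ == "__main__":
    print(pick_config((64, 64)))       # small problem -> small tiles
    print(pick_config((4096, 4096)))   # large problem -> large tiles
```

Multiplying this pattern across roughly a thousand operators, several dtypes, and multiple memory and network hierarchies is where the line count of a general-purpose framework comes from; narrowing to one model family, as described above, is what lets a framework stay tiny.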
But if you don't, if you need the generality to power all of the AI research that is happening and keep zero compile time and in all these other factors, I think it's not easy to avoid the complexity.Pytorch vs MojoAlessio [00:08:33]: That's interesting. And we kind of touched on this with Chris Lattner when he was on the podcast. If you think about frameworks, they have the model target. They have the hardware target. They have different things to think about. He mentioned when he was at Google, TensorFlow trying to be optimized to make TPUs go brr, you know, and go as fast. I think George is trying to make especially AMD stack be better than ROCm. How come PyTorch has been such as Switzerland versus just making Meta hardware go brr?Soumith [00:09:00]: First, Meta is not in the business of selling hardware. Meta is not in the business of cloud compute. The way Meta thinks about funding PyTorch is we're funding it because it's net good for Meta to fund PyTorch because PyTorch has become a standard and a big open source project. And generally it gives us a timeline edge. It gives us leverage and all that within our own work. So why is PyTorch more of a Switzerland rather than being opinionated? I think the way we think about it is not in terms of Switzerland or not. We actually the way we articulate it to all hardware vendors and software vendors and all who come to us being we want to build a backend in core for PyTorch and ship it by default is we just only look at our user side of things. Like if users are using a particular piece of hardware, then we want to support it. We very much don't want to king make the hardware side of things. So as the MacBooks have GPUs and as that stuff started getting increasingly interesting, we pushed Apple to push some engineers and work on the NPS support and we spend significant time from Meta funded engineers on that as well because a lot of people are using the Apple GPUs and there's demand. So we kind of mostly look at it from the demand side. We never look at it from like oh which hardware should we start taking opinions on.Swyx [00:10:27]: Is there a future in which, because Mojo or Modular Mojo is kind of a superset of Python, is there a future in which PyTorch might use Mojo features optionally?Soumith [00:10:36]: I think it depends on how well integrated it is into the Python ecosystem. So if Mojo is like a pip install and it's readily available and users feel like they can use Mojo so smoothly within their workflows in a way that just is low friction, we would definitely look into that. Like in the same way PyTorch now depends on Triton, OpenAI Triton, and we never had a conversation that was like huh, that's like a dependency. Should we just build a Triton of our own or should we use Triton? It almost doesn't, like those conversations don't really come up for us. The conversations are more well does Triton have 10,000 dependencies and is it hard to install? We almost don't look at these things from a strategic leverage point of view. We look at these things from a user experience point of view, like is it easy to install? Is it smoothly integrated and does it give enough benefits for us to start depending on it? If so, yeah, we should consider it. That's how we think about it.Swyx [00:11:37]: You're inclusive by default as long as it meets the minimum bar of, yeah, but like maybe I phrased it wrongly. 
Maybe it's more like what problems would you look to solve that you have right now?Soumith [00:11:48]: I think it depends on what problems Mojo will be useful at.Swyx [00:11:52]: Mainly a performance pitch, some amount of cross compiling pitch.Soumith [00:11:56]: Yeah, I think the performance pitch for Mojo was like, we're going to be performant even if you have a lot of custom stuff, you're going to write arbitrary custom things and we will be performant. And that value proposition is not clear to us from the PyTorch side to consider it for PyTorch. So PyTorch, it's actually not 250 operators, it's like a thousand operators. PyTorch exposes about a thousand operators and people kind of write their ideas in the thousand operators of PyTorch. Mojo is like, well, maybe it's okay to completely sidestep those thousand operators of PyTorch and just write it in a more natural form. Just write raw Python, write for loops or whatever, right? So from the consideration of how do we intersect PyTorch with Mojo, I can see one use case where you have custom stuff for some parts of your program, but mostly it's PyTorch. And so we can probably figure out how to make it easier for say Torch.compile to smoothly also consume Mojo subgraphs and like, you know, the interoperability being actually usable, that I think is valuable. But Mojo as a fundamental front end would be replacing PyTorch, not augmenting PyTorch. So in that sense, I don't see a synergy in more deeply integrating Mojo.Pytorch vs MLXSwyx [00:13:21]: So call out to Mojo whenever they have written something in Mojo and there's some performance related thing going on. And then since you mentioned Apple, what should people think of PyTorch versus MLX?Soumith [00:13:32]: I mean, MLX is early and I know the folks well, Ani used to work at FAIR and I used to chat with him all the time. He used to be based out of New York as well. The way I think about MLX is that MLX is specialized for Apple right now. It has a happy path because it's defined its product in a narrow way. At some point MLX either says we will only be supporting Apple and we will just focus on enabling, you know, there's a framework if you use your MacBook, but once you like go server side or whatever, that's not my problem and I don't care. For MLS, it enters like the server side set of things as well. Like one of these two things will happen, right? If the first thing will happen, like MLX's overall addressable market will be small, but it probably do well within that addressable market. If it enters the second phase, they're going to run into all the same complexities that we have to deal with. They will not have any magic wand and they will have more complex work to do. They probably wouldn't be able to move as fast.Swyx [00:14:44]: Like having to deal with distributed compute?Soumith [00:14:48]: Distributed, NVIDIA and AMD GPUs, like just like having a generalization of the concept of a backend, how they treat compilation with plus overheads. Right now they're deeply assumed like the whole NPS graph thing. So they need to think about all these additional things if they end up expanding onto the server side and they'll probably build something like PyTorch as well, right? Like eventually that's where it will land. And I think there they will kind of fail on the lack of differentiation. Like it wouldn't be obvious to people why they would want to use it.Swyx [00:15:24]: I mean, there are some cloud companies offering M1 and M2 chips on servers. 
I feel like it might be interesting for Apple to pursue that market, but it's not their core strength.Soumith [00:15:33]: Yeah. If Apple can figure out their interconnect story, maybe, like then it can become a thing.Swyx [00:15:40]: Honestly, that's more interesting than the cars. Yes.Soumith [00:15:43]: I think the moat that NVIDIA has right now, I feel is that they have the interconnect that no one else has, like AMD GPUs are pretty good. I'm sure there's various silicon that is not bad at all, but the interconnect, like NVLink is uniquely awesome. I'm sure the other hardware providers are working on it, but-Swyx [00:16:04]: I feel like when you say it's uniquely awesome, you have some appreciation of it that the rest of us don't. I mean, the rest of us just like, you know, we hear marketing lines, but what do you mean when you say NVIDIA is very good at networking? Obviously they made the acquisition maybe like 15 years ago.Soumith [00:16:15]: Just the bandwidth it offers and the latency it offers. I mean, TPUs also have a good interconnect, but you can't buy them. So you have to go to Google to use it.PyTorch MafiaAlessio [00:16:27]: Who are some of the other FAIR PyTorch alumni that are building cool companies? I know you have Fireworks AI, Lightning AI, Lepton, and Yangqing, you knew since college when he was building Coffee?Soumith [00:16:40]: Yeah, so Yangqing and I used to be framework rivals, PyTorch, I mean, we were all a very small close-knit community back then. Caffe, Torch, Theano, Chainer, Keras, various frameworks. I mean, it used to be more like 20 frameworks. I can't remember all the names. CCV by Liu Liu, who is also based out of SF. And I would actually like, you know, one of the ways it was interesting is you went into the framework guts and saw if someone wrote their own convolution kernel or they were just copying someone else's. There were four or five convolution kernels that were unique and interesting. There was one from this guy out of Russia, I forgot the name, but I remembered who was awesome enough to have written their own kernel. And at some point there, I built out these benchmarks called ConNet benchmarks. They're just benchmarking all the convolution kernels that are available at that time. It hilariously became big enough that at that time AI was getting important, but not important enough that industrial strength players came in to do these kinds of benchmarking and standardization. Like we have MLPerf today. So a lot of the startups were using ConNet benchmarks in their pitch decks as like, oh, you know, on ConNet benchmarks, this is how we fare, so you should fund us. I remember Nirvana actually was at the top of the pack because Scott Gray wrote amazingly fast convolution kernels at that time. Very interesting, but separate times. But to answer your question, Alessio, I think mainly Lepton, Fireworks are the two most obvious ones, but I'm sure the fingerprints are a lot wider. They're just people who worked within the PyTorch Cafe2 cohort of things and now end up at various other places.Swyx [00:18:50]: I think as a, both as an investor and a people looking to build on top of their services, it's a uncomfortable slash like, I don't know what I don't know pitch. Because I've met Yang Tsing and I've met Lin Chao. Yeah, I've met these folks and they're like, you know, we are deep in the PyTorch ecosystem and we serve billions of inferences a day or whatever at Facebook and now we can do it for you. And I'm like, okay, that's great. 
Like, what should I be wary of or cautious of when these things happen? Because I'm like, obviously this experience is extremely powerful and valuable. I just don't know what I don't know. Like, what should people know about like these sort of new inference as a service companies?Soumith [00:19:32]: I think at that point you would be investing in them for their expertise of one kind. So if they've been at a large company, but they've been doing amazing work, you would be thinking about it as what these people bring to the table is that they're really good at like GPU programming or understanding the complexity of serving models once it hits a certain scale. You know, various expertise like from the infra and AI and GPUs point of view. What you would obviously want to figure out is whether their understanding of the external markets is clear, whether they know and understand how to think about running a business, understanding how to be disciplined about making money or, you know, various things like that.Swyx [00:20:23]: Maybe I'll put it like, actually I will de-emphasize the investing bit and just more as a potential customer. Oh, okay. Like, it's more okay, you know, you have PyTorch gods, of course. Like, what else should I know?Soumith [00:20:37]: I mean, I would not care about who's building something. If I'm trying to be a customer, I would care about whether...Swyx [00:20:44]: Benchmarks.Soumith [00:20:44]: Yeah, I use it and it's usability and reliability and speed, right?Swyx [00:20:51]: Quality as well.Soumith [00:20:51]: Yeah, if someone from some random unknown place came to me and say, user stuff is great. Like, and I have the bandwidth, I probably will give it a shot. And if it turns out to be great, like I'll just use it.Benchmark dramaSwyx [00:21:07]: Okay, great. And then maybe one more thing about benchmarks, since we already brought it up and you brought up Confident Benchmarks. There was some recent drama around AnyScale. AnyScale released their own benchmarks and obviously they look great on their own benchmarks, but maybe didn't give the other... I feel there are two lines of criticism. One, which is they didn't test some apples for apples on the kind of endpoints that the other providers, that they are competitors with, on their benchmarks and that is due diligence baseline. And then the second would be more just optimizing for the right thing. You had some commentary on it. I'll just kind of let you riff.Soumith [00:21:41]: Yeah, I mean, in summary, basically my criticism of that was AnyScale built these benchmarks for end users to just understand what they should pick, right? And that's a very good thing to do. I think what they didn't do a good job of is give that end user a full understanding of what they should pick. Like they just gave them a very narrow slice of understanding. I think they just gave them latency numbers and that's not sufficient, right? You need to understand your total cost of ownership at some reasonable scale. Not oh, one API call is one cent, but a thousand API calls are 10 cents. Like people can misprice to cheat on those benchmarks. So you want to understand, okay, like how much is it going to cost me if I actually subscribe to you and do like a million API calls a month or something? And then you want to understand the latency and reliability, not just from one call you made, but an aggregate of calls you've made over several various times of the day and times of the week. 
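As a rough illustration of the rigor being asked for here, latency percentiles over many calls plus an estimated monthly bill rather than a single cherry-picked request, a minimal benchmarking sketch might look like the following. The endpoint stub, token counts, price per 1,000 tokens, and monthly call volume are placeholder assumptions, not any real provider's numbers.

```python
# Minimal sketch of benchmarking an inference endpoint the way described above:
# many calls, percentiles instead of one latency number, plus a rough
# total-cost-of-ownership estimate. Endpoint and prices are placeholders.
import random
import statistics
import time

PRICE_PER_1K_TOKENS = 0.0005   # hypothetical price, not a real provider's
CALLS_PER_MONTH = 1_000_000    # assumed workload


def call_endpoint(prompt: str) -> int:
    """Stand-in for a real API call; returns the number of tokens generated."""
    time.sleep(random.uniform(0.05, 0.4))   # pretend network + inference latency
    return random.randint(100, 600)


def benchmark(prompts, repeats=5):
    latencies, tokens = [], []
    for _ in range(repeats):                # in a real run, spread over times of day
        for p in prompts:
            start = time.perf_counter()
            tokens.append(call_endpoint(p))
            latencies.append(time.perf_counter() - start)
    p50 = statistics.median(latencies)
    p95 = statistics.quantiles(latencies, n=20)[18]   # 95th percentile cut point
    avg_tokens = statistics.mean(tokens)
    monthly_cost = CALLS_PER_MONTH * (avg_tokens / 1000) * PRICE_PER_1K_TOKENS
    return {"p50_s": round(p50, 3), "p95_s": round(p95, 3),
            "est_monthly_cost_usd": round(monthly_cost, 2)}


if __name__ == "__main__":
    print(benchmark(["summarize this paragraph...", "write a haiku about GPUs"]))
```

In a real comparison you would also repeat the run across different times of day and week and use realistic, non-cacheable workloads, which is exactly the gap being pointed out in the benchmark above.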
And the nature of the workloads, is it just some generic single paragraph that you're sending that is cashable? Or is it like testing of real world workload? I think that kind of rigor, like in presenting that benchmark wasn't there. It was a much more narrow sliver of what should have been a good benchmark. That was my main criticism. And I'm pretty sure if before they released it, they showed it to their other stakeholders who would be caring about this benchmark because they are present in it, they would have easily just pointed out these gaps. And I think they didn't do that and they just released it. So I think those were the two main criticisms. I think they were fair and Robert took it well.Swyx [00:23:40]: And he took it very well. And we'll have him on at some point and we'll discuss it. But I think it's important for, I think the market being maturing enough that people start caring and competing on these kinds of things means that we need to establish what best practice is because otherwise everyone's going to play dirty.Soumith [00:23:55]: Yeah, absolutely. My view of the LLM inference market in general is that it's the laundromat model. Like the margins are going to drive down towards the bare minimum. It's going to be all kinds of arbitrage between how much you can get the hardware for and then how much you sell the API and how much latency your customers are willing to let go. You need to figure out how to squeeze your margins. Like what is your unique thing here? Like I think Together and Fireworks and all these people are trying to build some faster CUDA kernels and faster, you know, hardware kernels in general. But those modes only last for a month or two. These ideas quickly propagate.Swyx [00:24:38]: Even if they're not published?Soumith [00:24:39]: Even if they're not published, the idea space is small. So even if they're not published, the discovery rate is going to be pretty high. It's not like we're talking about a combinatorial thing that is really large. You're talking about Llama style LLM models. And we're going to beat those to death on a few different hardware SKUs, right? Like it's not even we have a huge diversity of hardware you're going to aim to run it on. Now when you have such a narrow problem and you have a lot of people working on it, the rate at which these ideas are going to get figured out is going to be pretty rapid.Swyx [00:25:15]: Is it a standard bag of tricks? Like the standard one that I know of is, you know, fusing operators and-Soumith [00:25:22]: Yeah, it's the standard bag of tricks on figuring out how to improve your memory bandwidth and all that, yeah.Alessio [00:25:28]: Any ideas instead of things that are not being beaten to death that people should be paying more attention to?Novel PyTorch ApplicationsSwyx [00:25:34]: One thing I was like, you know, you have a thousand operators, right? Like what's the most interesting usage of PyTorch that you're seeing maybe outside of this little bubble?Soumith [00:25:41]: So PyTorch, it's very interesting and scary at the same time, but basically it's used in a lot of exotic ways, like from the ML angle, what kind of models are being built? And you get all the way from state-based models and all of these things to stuff nth order differentiable models, like neural ODEs and stuff like that. I think there's one set of interestingness factor from the ML side of things. And then there's the other set of interesting factor from the applications point of view. 
It's used in Mars Rover simulations, to drug discovery, to Tesla cars. And there's a huge diversity of applications in which it is used. So in terms of the most interesting application side of things, I think I'm scared at how many interesting things that are also very critical and really important it is used in. I think the scariest was when I went to visit CERN at some point and they said they were using PyTorch and they were using GANs at the same time for particle physics research. And I was scared more about the fact that they were using GANs than they were using PyTorch, because at that time I was a researcher focusing on GANs. But the diversity is probably the most interesting. How many different things it is being used in. I think that's the most interesting to me from the applications perspective. From the models perspective, I think I've seen a lot of them. Like the really interesting ones to me are where we're starting to combine search and symbolic stuff with differentiable models, like the whole AlphaGo style models is one example. And then I think we're attempting to do it for LLMs as well, with various reward models and search. I mean, I don't think PyTorch is being used in this, but the whole AlphaGeometry thing was interesting because again, it's an example of combining the symbolic models with the gradient based ones. But there are stuff like AlphaGeometry that PyTorch is used at, especially when you intersect biology and chemistry with ML. In those areas, you want stronger guarantees on the output. So yeah, maybe from the ML side, those things to me are very interesting right now.Swyx [00:28:03]: Yeah. People are very excited about the AlphaGeometry thing. And it's kind of like, for me, it's theoretical. It's great. You can solve some Olympiad questions. I'm not sure how to make that bridge over into the real world applications, but I'm sure people smarter than me will figure it out.Synthetic Data vs Symbolic ModelsSoumith [00:28:18]: Let me give you an example of it. You know how the whole thing about synthetic data will be the next rage in LLMs is a thing?Swyx [00:28:27]: Already is a rage.Soumith [00:28:28]: Which I think is fairly misplaced in how people perceive it. People think synthetic data is some kind of magic wand that you wave and it's going to be amazing. Synthetic data is useful in neural networks right now because we as humans have figured out a bunch of symbolic models of the world or made up certain symbolic models because of human innate biases. So we've figured out how to ground particle physics in a 30 parameter model. And it's just very hard to compute as in it takes a lot of flops to compute, but it only has 30 parameters or so. I mean, I'm not a physics expert, but it's a very low rank model. We built mathematics as a field that basically is very low rank. Language, a deep understanding of language, like the whole syntactic parse trees and just understanding how language can be broken down and into a formal symbolism is something that we figured out. So we basically as humans have accumulated all this knowledge on these subjects, either synthetic, we created those subjects in our heads, or we grounded some real world phenomenon into a set of symbols. But we haven't figured out how to teach neural networks symbolic world models directly. The only way we have to teach them is generating a bunch of inputs and outputs and gradient descending over them.
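A toy version of the pattern Soumith is describing: we already have an exact symbolic model (here, just addition), so we generate synthetic input/output pairs from it and use gradient descent to push that knowledge into an over-parameterized network. This is purely illustrative; the network size, value ranges, and training loop are arbitrary choices for the sketch, not anything any lab actually ships.

```python
# Toy illustration: distill a symbolic model we already have (integer-style addition)
# into a neural network by generating synthetic (input, output) pairs from it.
import torch
import torch.nn as nn


def symbolic_model(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    return a + b                      # the low-rank knowledge we want to impart


def make_synthetic_batch(n: int = 256):
    a = torch.rand(n, 1) * 10
    b = torch.rand(n, 1) * 10
    x = torch.cat([a, b], dim=1)
    y = symbolic_model(a, b)          # labels come from the symbolic model, not the web
    return x, y


net = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(2000):
    x, y = make_synthetic_batch()
    opt.zero_grad()
    loss = loss_fn(net(x), y)
    loss.backward()
    opt.step()

with torch.no_grad():
    print(net(torch.tensor([[3.0, 4.0]])))   # should land close to 7
```

The network ends up approximating a fact we could state in one line, which is the point being made above: synthetic data helps exactly where a good symbolic model already exists to generate it from.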
So in areas where we have the symbolic models and we need to teach all the knowledge we have that is better encoded in the symbolic models, what we're doing is we're generating a bunch of synthetic data, a bunch of input output pairs, and then giving that to the neural network and asking it to learn the same thing that we already have a better low rank model of in gradient descent in a much more over-parameterized way. Outside of this, like where we don't have good symbolic models, like synthetic data obviously doesn't make any sense. So synthetic data is not a magic wand where it'll work in all cases in every case or whatever. It's just where we as humans already have good symbolic models off. We need to impart that knowledge to neural networks and we figured out the synthetic data is a vehicle to impart this knowledge to. So, but people, because maybe they don't know enough about synthetic data as a notion, but they hear, you know, the next wave of data revolution is synthetic data. They think it's some kind of magic where we just create a bunch of random data somehow. They don't think about how, and then they think that's just a revolution. And I think that's maybe a gap in understanding most people have in this hype cycle.Swyx [00:31:23]: Yeah, well, it's a relatively new concept, so. Oh, there's two more that I'll put in front of you and then you can see what you respond. One is, you know, I have this joke that it's, you know, it's only synthetic data if it's from the Mistral region of France, otherwise it's just a sparkling distillation, which is what news research is doing. Like they're distilling GPT-4 by creating synthetic data from GPT-4, creating mock textbooks inspired by Phi 2 and then fine tuning open source models like Llama. And so I don't know, I mean, I think that's, should we call that synthetic data? Should we call it something else? I don't know.Soumith [00:31:57]: Yeah, I mean, the outputs of LLMs, are they synthetic data? They probably are, but I think it depends on the goal you have. If your goal is you're creating synthetic data with the goal of trying to distill GPT-4's superiority into another model, I guess you can call it synthetic data, but it also feels like disingenuous because your goal is I need to copy the behavior of GPT-4 and-Swyx [00:32:25]: It's also not just behavior, but data set. So I've often thought of this as data set washing. Like you need one model at the top of the chain, you know, unnamed French company that has that, you know, makes a model that has all the data in it that we don't know where it's from, but it's open source, hey, and then we distill from that and it's great. To be fair, they also use larger models as judges for preference ranking, right? So that is, I think, a very, very accepted use of synthetic.Soumith [00:32:53]: Correct. I think it's a very interesting time where we don't really have good social models of what is acceptable depending on how many bits of information you use from someone else, right? It's like, okay, you use one bit. Is that okay? Yeah, let's accept it to be okay. Okay, what about if you use 20 bits? Is that okay? I don't know. What if you use 200 bits? I don't think we as society have ever been in this conundrum where we have to be like, where is the boundary of copyright or where is the boundary of socially accepted understanding of copying someone else? We haven't been tested this mathematically before,Swyx [00:33:38]: in my opinion. Whether it's transformative use. Yes. 
So yeah, I think this New York Times OpenAI case is gonna go to the Supreme Court and we'll have to decide it, because I think we never had to deal with it before. And then finally, for synthetic data, the thing that I'm personally exploring is solving this stark paradigm difference between RAG and fine-tuning, where you can kind of create synthetic data off of your retrieved documents and then fine-tune on that. That's kind of synthetic. All you need is variation or diversity of samples for you to fine-tune on. And then you can fine-tune new knowledge into your model. I don't know if you've seen that as a direction for synthetic data.

Soumith [00:34:13]: I think what you're doing is you're saying, well, language, I know how to parametrize language to an extent. And I need to teach my model variations of this input data so that it's resilient or invariant to language uses of that data.

Swyx [00:34:32]: Yeah, it doesn't overfit on the wrong source documents.

Soumith [00:34:33]: So I think that's 100% synthetic. You understand, the key is you create variations of your documents, and you know how to do that because you have a symbolic model, or like some implicit symbolic model, of language.

Swyx [00:34:48]: Okay.

Alessio [00:34:49]: Do you think the issue with symbolic models is just the architecture of the language models that we're building? I think maybe the thing that people grasp onto is the inability of transformers to deal with numbers because of the tokenizer. Is it a fundamental issue there too? And do you see alternative architectures that will be better with symbolic understanding?

Soumith [00:35:09]: I am not sure if it's a fundamental issue or not. I think we just don't understand transformers enough. I don't even mean transformers as an architecture. I mean the use of transformers today, like combining the tokenizer and transformers and the dynamics of training, when you show math-heavy questions versus not. I don't have a good calibration of whether I know the answer or not. You know, there are common criticisms that, you know, transformers will just fail at X. But then when you scale them up to sufficient scale, they actually don't fail at that X. I think there's this entire subfield where they're trying to figure out these answers, called like the science of deep learning or something. So we'll get to know more. I don't know the answer.

Meta AI and Llama 2/3

Swyx [00:35:57]: Got it. Let's touch a little bit on just Meta AI and, you know, stuff that's going on there. Maybe, I don't know how deeply you're personally involved in it, but you're our first guest with Meta AI, which is really fantastic. And Llama 1 was, you know, you are such a believer in open source. Llama 1 was more or less the real breakthrough in open source AI. The most interesting thing for us covering it on this podcast was the death of Chinchilla, as people say. Any interesting insights there around the scaling laws for open source models or smaller models, or whatever that design decision was when you guys were doing it?

Soumith [00:36:31]: So Llama 1 was Guillaume Lample and team. There was OPT before, which I think I'm also very proud of, because we bridged the gap in understanding of how complex it is to train these models to the world. Like until then, no one really published it in gory detail.

Swyx [00:36:50]: The logs.

Soumith [00:36:51]: Yeah. Like, why is it complex? And everyone says, oh, it's complex. But no one really talked about why it's complex.
I think OPT was cool.

Swyx [00:37:02]: I met Susan and she's very, very outspoken. Yeah.

Soumith [00:37:05]: We probably, I think, didn't train it for long enough, right? That's kind of obvious in retrospect.

Swyx [00:37:12]: For a 175B. Yeah. You trained it according to Chinchilla at the time, or?

Soumith [00:37:17]: I can't remember the details, but I think it's a commonly held belief at this point that if we trained OPT longer, it would actually end up being better. Llama 1, I think, was Guillaume Lample and team. Guillaume is fantastic and went on to build Mistral. I wasn't too involved in that side of things. So I don't know what you're asking me, which is how did they think about scaling laws and all of that? Llama 2, I was more closely involved in. I helped them a reasonable amount with their infrastructure needs and stuff. And Llama 2, I think, was more like, let's get to the evolution. At that point, we kind of understood what we were missing from the industry's understanding of LLMs. And we needed more data and we needed to train the models for longer. And we made, I think, a few tweaks to the architecture and we scaled up more. And that was Llama 2. I think Llama 2, you can think of it as, after Guillaume left, the team kind of rebuilt their muscle around Llama 2. And Hugo, I think, who's the first author, is fantastic. And I think he did play a reasonably big role in Llama 1 as well.

Soumith [00:38:35]: And he overlaps between Llama 1 and 2. So in Llama 3, obviously, hopefully, it'll be awesome.

Alessio [00:38:42]: Just one question on Llama 2, and then we'll try and fish Llama 3 spoilers out of you. In the Llama 2 paper, the loss curves of the 34B and 70B parameter models, they still seem kind of steep. Like they could go lower. How, from an infrastructure level, how do you allocate resources? Could they have just gone longer or were you just, hey, this is all the GPUs that we can burn and let's just move on to Llama 3 and then make that one better?

Soumith [00:39:07]: Instead of answering specifically about that Llama 2 situation or whatever, I'll tell you how we think about things. Generally, I mean, Mark did already share some numbers, right?

Swyx [00:39:20]: So let's cite those things again. All I remember is like 600K GPUs.

Soumith [00:39:24]: That is by the end of this year, and 600K H100 equivalents. With 250K H100s, including all of our other GPU or accelerator stuff, it would be 600-and-something-K aggregate capacity.

Swyx [00:39:38]: That's a lot of GPUs.

Soumith [00:39:39]: We'll talk about that separately. But the way we think about it is we have a train of models, right? Llama 1, 2, 3, 4. And we have a bunch of GPUs. I don't think we're short of GPUs. Like-

Swyx [00:39:54]: Yeah, no, I wouldn't say so. Yeah, so it's all a matter of time.

Soumith [00:39:56]: I think time is the biggest bottleneck. It's like, when do you stop training the previous one and when do you start training the next one? And how do you make those decisions? The data, do you have net new data, better clean data for the next one, in a way that it's not worth really focusing on the previous one? It's just a standard iterative product. You're like, when is the iPhone 1? When do you start working on iPhone 2? Where is the iPhone? And so on, right? So mostly the considerations are time and generation, rather than GPUs, in my opinion.

Alessio [00:40:31]: So one of the things with the scaling laws, like Chinchilla is optimal to balance training and inference costs.
I think at Meta's scale, you would rather pay a lot more maybe at training and then save on inference. How do you think about that from an infrastructure perspective? I think in your tweet, you say you can try and guess at how we're using these GPUs. Can you just give people a bit of understanding? Because I've already seen a lot of VCs say, Llama 3 has been trained on 600,000 GPUs, and that's obviously not true, I'm sure. How do you allocate between the research, FAIR and the Llama training, the inference on Instagram suggestions that get me to scroll, like AI-generated stickers on WhatsApp and all of that?

Soumith [00:41:11]: Yeah, we haven't talked about any of this publicly, but as a broad stroke, it's like how we would allocate resources of any other kind at any company. You run a VC portfolio, how do you allocate your investments between different companies or whatever? You kind of make various trade-offs and you kind of decide, should I invest in this project or this other project, or how much should I invest in this project? It's very much a zero-sum set of trade-offs. And it also comes into play how your clusters are configured, like overall, what you can fit of what size and what cluster and so on. So broadly, there's no magic sauce here. I mean, I think the details would add more spice, but also wouldn't add more understanding. It's just gonna be like, oh, okay, I mean, this looks like they just think about this as I would normally do.

Alessio [00:42:05]: So even the GPU rich run through the same struggles of having to decide where to allocate things.

Soumith [00:42:11]: Yeah, I mean, at some point, I forgot who said it, but you kind of fit your models to the amount of compute you have. If you don't have enough compute, you figure out how to make do with smaller models. But no one as of today, I think, would feel like they have enough compute. I don't think I've heard any company within the AI space be like, oh yeah, we feel like we have sufficient compute and we couldn't have done better. So that conversation, I don't think I've heard from any of my friends at other companies.

Eleuther

Swyx [00:42:47]: Stella from Eleuther sometimes says that, because she has a lot of donated compute. She's trying to put it to interesting uses, but for some reason she's decided to stop making large models.

Soumith [00:42:57]: I mean, that's a cool, high-conviction opinion that might pay off.

Swyx [00:43:01]: Why?

Soumith [00:43:02]: I mean, she's taking a path that most people don't care to take in this climate, and she probably will have very differentiated ideas. I mean, think about the correlation of ideas in AI right now. It's so bad, right? So everyone's fighting for the same pie. In some weird sense, that's partly why I don't really directly work on LLMs. I used to do image models and stuff, and I actually stopped doing GANs because GANs were getting so hot that I didn't have any calibration of whether my work would be useful or not, because, oh yeah, someone else did the same thing you did. It's like, there's so much to do, I don't understand why I need to fight for the same pie. So I think Stella's decision is very smart.

Making Bets

Alessio [00:43:53]: And how do you reconcile that with how we started the discussion about intrinsic versus extrinsic kind of accomplishment or success? How should people think about that, especially when they're doing a PhD or early in their career?
I think in Europe, I walked through a lot of the posters and whatnot, and there seems to be mode collapse in the research, in a way: a lot of people working on the same things. Is it worth it for a PhD student to not take a bet on something that is maybe not as interesting, just because of funding and visibility and whatnot? Or yeah, what suggestions would you give?

Soumith [00:44:28]: I think there's a baseline level of compatibility you need to have with the field. Basically, you need to figure out if you will get paid enough to eat, right? Like whatever reasonable, normal lifestyle you want to have as a baseline. So you at least have to pick a problem within the neighborhood of fundable. Like you wouldn't wanna be doing something so obscure that people are like, I don't know, like you can work on it.

Swyx [00:44:59]: Would a limit on fundability be, I'm just observing, something like three months of compute, right? That's the top line, that's like the max that you can spend on any one project.

Soumith [00:45:09]: But I think that's very ill-specified, like how much compute, right? I think that the notion of fundability is broader. It's more like, hey, is this family of models within the acceptable set of, you're not crazy, or something, right? Even something like neural ODEs, which is a very boundary-pushing thing, or state-space models or whatever. All of these things I think are still in fundable territory. When you're talking about, I'm gonna do one of the neuromorphic models and then apply image classification to them or something, then it becomes a bit questionable. Again, it depends on your motivation. Maybe if you're a neuroscientist, it actually is feasible. But if you're an AI engineer, like the audience of these podcasts, then it's more questionable. The way I think about it is, you need to figure out how you can be in the baseline level of fundability just so that you can just live. And then after that, really focus on intrinsic motivation, and it depends on your strengths, like how you can play to your strengths and your interests at the same time. Like I try to look at a bunch of ideas that are interesting to me, but also try to play to my strengths. I'm not gonna go work on theoretical ML. I'm interested in it, but when I want to work on something like that, I try to partner with someone who is actually a good theoretical ML person and see if I actually have any value to provide. And if they think I do, then I come in. So I think you'd want to find that intersection of ideas you like that also play to your strengths. And I'd go from there. Everything else, like actually finding extrinsic success and all of that, the way I think about it is, it's somewhat immaterial. When you're talking about building ecosystems and stuff, slightly different considerations come into play, but that's a different conversation.

Swyx [00:47:06]: We're gonna pivot a little bit to just talking about open source AI. But one more thing I wanted to establish for Meta is this 600K number, just kind of rounding out the discussion, that's for all of Meta. So including your own inference needs, right? It's not just about training.

Soumith [00:47:19]: It's gonna be the number in our data centers for all of Meta, yeah.

Swyx [00:47:23]: Yeah, so there's a decent amount of workload serving Facebook and Instagram and whatever. And then is there interest in like your own hardware?

MTIA

Soumith [00:47:31]: We already talked about our own hardware. It's called MTIA.
Our own silicon. I think we've even shown the standard photograph of you holding the chip that doesn't work. Like, as in the chip that you basically just get, like-

Swyx [00:47:51]: As a test, right?

Soumith [00:47:52]: Yeah, a test chip or whatever. So we are working on our silicon and we'll probably talk more about it when the time is right, but-

Swyx [00:48:00]: Like what gaps do you have that the market doesn't offer?

Soumith [00:48:04]: Okay, I mean, this is easy to answer. So basically, remember how I told you about there's this memory hierarchy and like sweet spots and all of that? Fundamentally, when you build hardware, you make it general enough that a wide set of customers and a wide set of workloads can use it effectively, while trying to get the maximum level of performance they can. The more specialized you make the chip, the more hardware efficient it's going to be, the more power efficient it's gonna be, the easier it's going to be to find the software, like the kernels, to just map that one or two workloads to that hardware, and so on. So it's pretty well understood across the industry that if you have a sufficiently large volume, enough workload, you can specialize it and get some efficiency gains, like power gains and so on. So the way you can think about every large company building silicon, and I think a bunch of the other large companies are building their own silicon as well, is that each large company has a sufficient enough set of verticalized workloads that can be specialized, that have a pattern to them that, say, a more generic accelerator like an NVIDIA or an AMD GPU does not exploit. So there is some level of power efficiency that you're leaving on the table by not exploiting that. And you have sufficient scale and sufficient forecasted stability that those workloads will exist in the same form, so that it's worth spending the time to build out a chip to exploit that sweet spot. Obviously, something like this is only useful if you hit a certain scale and if your forecasted prediction that those kinds of workloads will stay specializable and exploitable in the same way is true. So yeah, that's why we're building our own chips.

Swyx [00:50:08]: Awesome.

Open Source AI

Alessio [00:50:09]: Yeah, I know we've been talking about a lot of different topics, and going back to open source, you had a very good tweet. You said that a single company's closed source effort rate-limits against people's imaginations and needs. How do you think about all the impact that some of the Meta AI work in open source has been having, and maybe the directions of the whole open source AI space?

Soumith [00:50:32]: Yeah, in general, I think first it's worth talking about this in terms of open and not just open source, because with the whole notion of model weights, no one even knows what source means for these things. But just for the discussion, when I say open source, you can assume I'm just talking about open. And then there's the whole notion of licensing and all that: commercial, non-commercial, commercial with clauses and all that. I think at a fundamental level, the biggest value of open source is that you make the distribution very wide. It's just available with no friction and people can do transformative things in a way that's very accessible. Maybe it's open source, but it has a commercial license and I'm a student in India. I don't care about the license. I just don't even understand the license.
But the fact that I can use it and do something with it is very transformative to me. Like I got this thing in a very accessible way. And then there are various degrees, right? And then if it's open source, but it's actually a commercial license, then a lot of companies are gonna benefit from gaining value that they didn't previously have, that they maybe had to pay a closed source company for. So open source is just a very interesting tool that you can use in various ways. So there are, again, two kinds of open source. One is some large company doing a lot of work and then open sourcing it. And that kind of effort is not really feasible for, say, a band of volunteers doing it the same way. So there's both a capital and operational expenditure that the large company just decided to forgo and give away to the world for some benefits of some kind. They're not as tangible as direct revenue. So in that part, Meta has been doing incredibly good things. They fund a huge amount of the PyTorch development. They've open sourced Llama and that family of models and several other fairly transformative projects. Faiss is one, Segment Anything, Detectron, Detectron 2. DensePose. I mean, it's-

Swyx [00:52:52]: Seamless. Yeah, seamless.

Soumith [00:52:53]: Like, the list is just so long that we're not gonna cover it all. So I think Meta comes into that category where we spend a lot of CapEx and OpEx, and we have a high talent density of great AI people, and we open our stuff. And the thesis for that: I remember when FAIR was started, the common thing was like, wait, why would Meta wanna start an open AI lab? What exactly is the benefit from a commercial perspective? And back then the thesis was very simple. It was: AI is currently rate-limiting Meta's ability to do things. Our ability to build various product integrations, moderation, various other factors. AI was the limiting factor, and we just wanted AI to advance more, and we didn't care if the IP of the AI was uniquely in our possession or not. However the field advances, that accelerates Meta's ability to build a better product. So we just built an open AI lab and we said, if this helps accelerate the progress of AI, that's strictly great for us. Very easy, rational, right? It's still the same to a large extent with the Llama stuff. And it's the same values, but the argument is a bit more nuanced. And then there's a second kind of open source, which is, oh, we built this project nights and weekends, and we're very smart people, and we open sourced it, and then we built a community around it. This is the Linux kernel and various software projects like that. So I think about open source as both of these things being beneficial and both of these things being different. They're different and beneficial in their own ways. The second one is really useful when there's an active arbitrage to be done. If someone's not really looking at a particular space because it's not commercially viable or whatever, a band of volunteers can just coordinate online and do something and then make that happen. And that's great.

Open Source LLMs

I wanna cover a little bit about open source LLMs maybe. So open source LLMs have been very interesting because I think we were trending towards an increase in open source in AI from 2010 all the way to 2017 or something, where more and more pressure within the community was to open source their stuff so that their methods and stuff get adopted.
And then the LLM revolution kind of had the opposite effect. OpenAI stopped open sourcing their stuff, and DeepMind kind of didn't either; like all the other cloud and other providers, they didn't open source their stuff. And it was not good, in the sense that, first, science done in isolation probably will just form its own bubble where people believe their own b******t or whatever. So there's that problem. And then there was the other problem, which was the accessibility part. Like, okay, I again always go back to, I'm a student in India with no money. What is my accessibility to any of these closed models? At some scale I have to pay money. That makes it a non-starter and stuff. And there's also the control thing. I strongly believe if you want human-aligned stuff, you want all humans to give feedback. And you want all humans to have access to that technology in the first place. And I actually have seen, living in New York, whenever I come to Silicon Valley, I see a different cultural bubble. Like all the friends I hang out with talk about some random thing like Dyson Spheres or whatever, that's a thing. And most of the world doesn't know or care about any of this stuff. It's definitely a bubble, and bubbles can form very easily. And when you make a lot of decisions because you're in a bubble, they're probably not globally optimal decisions. So I think open source, the distribution of open source, powers a certain kind of non-falsifiability that I think is very important.

I think on the open source models, it's going great in the fact that LoRA, I think, came out of the necessity of open source models needing to be fine-tunable in some way. Yeah, and I think DPO also came out of the academic open source side of things. So do any of the closed source labs, did any of them already have LoRA or DPO internally? Maybe, but that does not advance humanity in any way. It advances some company's probability of doing the winner-takes-all thing that I talked about earlier in the podcast.

Open Source and Trust

I don't know, it just feels fundamentally good. Like when people try to, you know, people are like, well, what are the ways in which it is not okay? I find most of these arguments, and this might be a little controversial, but I find a lot of arguments based on whether closed source models are safer or open source models are safer very much related to what kind of culture they grew up in, what kind of society they grew up in. If they grew up in a society that they trusted, then I think they take the closed source argument. And if they grew up in a society that they couldn't trust, where the norm was that you didn't trust your government, obviously it's corrupt or whatever, then I think the open source argument is what they take. I think there's a deep connection to people's innate biases from their childhood and their trust in society and governmental aspects that push them towards one opinion or the other. And I'm definitely in the camp of open source is definitely going to actually have better outcomes for society. Closed source to me just means centralization of power, which, you know, is really hard to trust. So I think it's going well.
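On the LoRA point mentioned above, here is a minimal sketch of the low-rank adaptation idea in PyTorch, assuming an arbitrary rank, scaling factor, and layer size. It is an illustration of the general technique that made open-weights models easy to fine-tune, not Meta's or any particular library's implementation.

```python
# Minimal sketch of the LoRA idea (illustration only, not a specific library's API):
# freeze a pretrained linear layer and learn a small low-rank additive update.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False          # frozen pretrained weights
        self.lora_a = nn.Parameter(torch.randn(base.in_features, rank) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(rank, base.out_features))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Base output plus the low-rank correction; only lora_a / lora_b get gradients.
        return self.base(x) + (x @ self.lora_a @ self.lora_b) * self.scale

# Toy usage: wrap a "pretrained" projection and fine-tune only the adapter.
layer = LoRALinear(nn.Linear(512, 512))
opt = torch.optim.AdamW([layer.lora_a, layer.lora_b], lr=1e-4)
x = torch.randn(4, 512)
loss = layer(x).pow(2).mean()                # stand-in objective for the sketch
loss.backward()
opt.step()
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(f"trainable parameters: {trainable}")  # a tiny fraction of the frozen base
```

Only the two small low-rank matrices receive gradients, which is what makes fine-tuning open-weights models feasible on modest hardware.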

Masmorra Cine
BPM #96 LIVE Godzilla Minus One X Avatar The Last Airbender

Masmorra Cine

Play Episode Listen Later Feb 29, 2024 95:59


Angélica Hellish and Marcos Noriega talk about the film Godzilla Minus One, which was a great surprise, and also about the series Avatar: The Last Airbender, available on Netflix. LIVE STREAMS EVERY WEDNESDAY AT 9 PM ON YOUTUBE, TWITCH AND FACEBOOK. Search for and subscribe to us in podcast apps and also on SPOTIFY, AMAZON MUSIC and APPLE PODCASTS! – Just search for MASMORRACINE. *Our e-mail: contato.cinemasmorra@gmail.com FOLLOW US ON OUR TWITCH CHANNEL @AngelMasmorra https://www.twitch.tv/angelmasmorra HELP OUR PODCAST SHOW UP MORE ON SPOTIFY BY GIVING US 5 STARS! Look for us there as Cineclube da Masmorra or as MasmorraCine. Our channel on OK.RU! https://ok.ru/video/c2356516 https://youtu.be/kz4ovOZkY8M

TerraSpaces
Restaking in Cosmos: Persistence One x Terraform Labs

TerraSpaces

Play Episode Listen Later Feb 20, 2024 59:58


Today on the Ether we have Persistence One hosting a chat on restaking and Alliance in the Cosmos with Terra. You'll hear from Mikhil Pandey, 0xAllen, Zion, Marc, Roman Nedved, BackBone Labs, and more! Recorded on February 20th 2024. Make sure to check out the newest tracks from Finn and the RAC FM gang over at ImaginetheSmell.org! The majority of the music at the end of these spaces can be found streaming over on Spotify, and the rest of the streaming platforms. Thank you to everyone in the community who supports TerraSpaces.

DECKED UP: A Tech and Gaming Podcast
The OneXPlayer X1 is a 3-in-1 BEAST with Intel Core Ultra | DeckedUP EP 70

DECKED UP: A Tech and Gaming Podcast

Play Episode Listen Later Jan 25, 2024 27:02


OneX has announced a new handheld offering 3-in-1 functionality, a 2K screen, and more, called the OneXPlayer X1. This has definitely got my attention, so let's talk about it! Get MoonWlkr HERE and use code "MekelKasanova" at checkout: https://lddy.no/1izj0
SUPPORT THE SHOW AND FOLLOW US
YOUTUBE: http://youtube.com/MekelKasanova
TWITCH: http://twitch.tv/MekelKasanova
TWITTER: http://twitter.com/MekelKasanova
INSTAGRAM: https://www.instagram.com/MekelKasanova
PATREON: https://www.patreon.com/MekelKasanova
Be sure to visit www.MekelKasanova.com for updates, news, podcasts, and much more. All clips of audio and video used in this work are used for entertainment or education purposes under the fair use clause found in sections 107 through 118 of the copyright law (title 17, U. S. Code).
--- Support this podcast: https://podcasters.spotify.com/pod/show/deckeduppodcast/support

Oral Arguments for the Court of Appeals for the Eighth Circuit
SBFO Operator No. 3, LLC v. Onex Corporation

Oral Arguments for the Court of Appeals for the Eighth Circuit

Play Episode Listen Later Jan 11, 2024 26:28


SBFO Operator No. 3, LLC v. Onex Corporation

The Automotive Troublemaker w/ Paul J Daly and Kyle Mountsier
GM's New Year's Layoff, Carvana Touts Same-Day, UPS Paycheck Share

The Automotive Troublemaker w/ Paul J Daly and Kyle Mountsier

Play Episode Listen Later Dec 15, 2023 14:14


It's Friday and we're ready to toe the line with Retail Auto as we talk about GM layoffs, Carvana's expansion of its same-day services, and one UPS driver who shares his paycheck and sparks some second thoughts from commenters.
General Motors has announced a temporary halt in Chevy Bolt EV production, leading to layoffs of 1,314 workers at two Michigan plants.
GM has ceased production of the Chevy Bolt EV and EUV, resulting in 945 job cuts at the Lake Orion assembly plant. The plant is set for a transformation to produce future EV models.
An additional 369 jobs will be lost at the Lansing Grand River Assembly plant with the end of Chevy Camaro Coupe production.
GM plans a significant shift to its Ultium EV platform, with investments and retooling of the plant leading to the new Chevy Bolt EV reappearing on this new platform in 2025.
In a notice to the state, GM noted that the reason it is announcing the layoffs so late is the slow ratification process of the UAW contract.
Affected workers, particularly at the Orion plant, are being offered alternative opportunities in Michigan. This restructuring follows a new contract with the UAW, which included significant wage adjustments.
Carvana is expanding its same-day delivery service to new markets in the South and Midwest.
After introducing its same-day delivery service in November to Atlanta, Dallas-Fort Worth, Tampa, and Orlando, it has now extended the service to Cincinnati and Columbus, Ohio.
The service allows customers to buy a vehicle online and receive it the same day, emphasizing convenience and speed during the busy holiday season.
Carvana senior director of market operations and expansion Jacqueline Hearns said in a news release, "Especially in the middle of a busy holiday season, customers value speed and convenience, and we couldn't be happier to make car buying faster and easier than ever before for Columbus and Cincinnati-area residents with our new same day delivery offering."
Customers also have the option for same-day drop-off when selling their vehicles to Carvana at designated locations.
A UPS driver's TikTok video revealing his weekly paycheck gained nearly 12 million views, sparking discussions about pay transparency and career choices in various industries.
Skyler Stutzman, an Oregon-based UPS driver, shared his earnings of approximately $2,004 pre-tax for one week, leading to widespread surprise over the high pay rate of UPS drivers.
The Teamsters union secured a total compensation package for UPS drivers worth $170,000, including benefits, highlighting the value of union negotiations.
Just too good: One X user @battleangelviv posted a screenshot of an iPhone update screen with the caption "breaking: apple recalls almost all iphones".
Hosts: Paul J Daly and Kyle Mountsier
Get the Daily Push Back email at https://www.asotu.com/
JOIN the conversation on LinkedIn at: https://www.linkedin.com/company/asotu/
Read our most recent email at: https://www.asotu.com/media/push-back-email
ASOTU Instagram: https://www.instagram.com/automotivestateoftheunion

Regal Movie Masters Unlimited
Episode 65: Godzilla Minus One x Dream Scenario x Silent Night x The Shift

Regal Movie Masters Unlimited

Play Episode Listen Later Dec 12, 2023 98:33


Four new movies covered this week:
• GODZILLA MINUS ONE [00:01:30] - Toho's follow-up to the universally-praised breakout "Shin Godzilla." Could the Japanese studio make lightning strike twice?
• DREAM SCENARIO [00:30:52] - In a landscape where any given number of new Nicolas Cage movies are playing or streaming at any given moment, this was the one that many people claimed they "heard was a great one." At least one Movie Master was not nearly as smitten. Hear why.
• SILENT NIGHT [00:50:02] - Not "Violent Night," released last year at this time; this is a new John Woo movie. Experience the gang pontificating on why this was more like a John Boooooo movie.
• THE SHIFT [01:22:14] - The Angel Films bait and switch is back, baby. The latest faith-based feature-length movie from one of the more prolific faith-based studios on the planet drew in at least a couple of your Movie Masters. How does this one stack up against the ungodly (wait... godly?) amount of other films this experiment has endured to date?
As always: email moviemasters760@gmail.com with any/all questions, concerns, comments, and movie recommendations.
--- Support this podcast: https://podcasters.spotify.com/pod/show/rmmu/support

Built To Go! A #Vanlife Podcast
190 Winter Realities, 2FA Problem, Insta360 One X, Treasures of Our Town, Canal Crying, Big Rob's List

Built To Go! A #Vanlife Podcast

Play Episode Listen Later Nov 30, 2023 38:51


This podcast is also available on YouTube: https://youtu.be/y1sOShiy-1w
Winter vanlife can be great! But you need to be prepared. We'll help. We'll also solve a 2FA problem, review a strange and wonderful camera, visit another podcast, hear a heartwarming tale from Panama, and read Big Rob's List.
FIND US: We're on Facebook (Built to Go Group), Instagram (@CollegeOfCuriosity)
Travel with us on the Danube River through Hungary, Slovakia, Austria, Germany and the Czech Republic! Details at: bit.ly/CofCDanube24
If you'd like to support this podcast, please visit BuyMeACoffee.com/BuiltToGo
Home. Briefly.
TECH TALK
Google Authenticator
Wired: https://www.wired.com/story/how-to-use-google-authenticator-app/
PRODUCT REVIEW
Insta360 One X 2 (or 3) Camera
https://amzn.to/4aaw24O
A PLACE TO VISIT
Treasures of our Town Podcast (Jeff as guest on 27NOV2023)
https://treasuresofourtown.buzzsprout.com/
NEWS
Space Coast's Only “Vanlife” Festival To Take Over Space Coast Daily Park in Viera, Jan 27-28
https://spacecoastdaily.com/2023/11/space-coasts-only-vanlife-festival-to-take-over-space-coast-daily-park-in-viera-jan-27-28/
Rivian Will Now Sell You Its Electric Cargo Van
https://www.caranddriver.com/news/a45783032/rivian-electric-cargo-van-fleet-sales/
FORD E-SERIES SALES BEAT PROMASTER, EXPRESS IN Q3 2023
https://fordauthority.com/2023/11/ford-e-series-sales-numbers-figures-results-third-quarter-2023-q3/
RESOURCE RECOMMENDATION
Big Rob's XMas List (http://bigrobsvan.com)
https://www.youtube.com/watch?v=dcYHdKl9Syo
Some links are affiliate links. If you purchase anything from these links, the show will receive a small fee. This will not impact your price in any way.

Flip the Record
Three Days Grace Part 1: Self-Titled Through Transit of Venus

Flip the Record

Play Episode Listen Later Nov 13, 2023 65:00


We talk about all the Three Days Grace albums with Adam Gontier, including Self-Titled, One X, Life Starts Now, and Transit of Venus.

No Hype
SEHER ONE X NO HYPE

No Hype

Play Episode Listen Later Nov 1, 2023 68:24


Even though it's very real that Mexico has a cultural wealth and heritage in graphic art in whatever format, and that much of it has a genuinely close connection with street culture, it can seem like we're talking about different worlds. Graffiti in Mexico has a very long history, one we can trace almost from its beginnings in the United States; that closeness allowed this way of life to quickly permeate our country across the border and slowly filter down to Mexico City, which has been an important breeding ground for heavyweights of this art. Although this is not a history of graffiti, it is an interview we wanted and owed ourselves, because Seher One is, hands down, one of the most complete artists of the current scene (and has been for many years). Maybe you've seen some of his work without even knowing it was his. We took advantage of his latest collaboration with American Eagle to talk about many things; others were left out for lack of time, but this is the first serve of a second round that needs to happen.

Crypto Kid
Layer One X The fastest interoperable blockchain with Matiu Rudolph

Crypto Kid

Play Episode Listen Later Oct 25, 2023 67:02


Matiu Rudolph, Chief Operating Officer at LayerOneX, connects with Coin Edition on the application of cross-chain technology in the crypto industry. Layer One X (L1X) enables the full utilisation of blockchain technology by providing unrivalled interoperability, allowing data to flow securely between different blockchains while remaining completely decentralised. Incredibly, this is achieved without compromising the network's scalability or speed, through its proprietary micro-validation technology.
As the digital world shifts to a decentralised model, L1X eliminates the distraction of protocol supremacy and creates a multi-chain reality of interconnected Web3 communities and commerce, unlocking endless new possibilities.
Links: https://linktr.ee/layeronex
Layer One X (L1X) Website: https://www.l1x.foundation/
Follow me on! https://linktr.ee/CryptoKidPodcast
Affiliate link: https://kitcaster.com/cryptokid/
Website: www.cryptokidpodcast.com
--- Send in a voice message: https://podcasters.spotify.com/pod/show/cryptokidpodcast/message
Support this podcast: https://podcasters.spotify.com/pod/show/cryptokidpodcast/support

Dj Dev NYC
Jey One X Dj Dev NYC - Onana (Bendecedo Rfx) - Outro

Dj Dev NYC

Play Episode Listen Later Oct 25, 2023 2:27


Jey One X Dj Dev NYC - Onana (Bendecedo Rfx) - Outro by Dj Dev NYC

Built to Sell Radio
Ep 410 Fifty Percent Sooner, for 25% Less: How Mark Ferrier Grew his Marketing Agency to $2M EBITDA and Scored an 8-Figure Exit

Built to Sell Radio

Play Episode Listen Later Oct 6, 2023 65:34


Mark Ferrier founded the marketing agency TRAFFIKGROUP and grew it to over $2M in EBITDA before agreeing to be acquired by the private equity group Onex in an eight-figure exit.  Mark decided to sell when a friend revealed that most founders end up wishing they had sold 50% sooner for 25% less.

Built to Sell Radio
Ep 409 Minority vs. Majority Partners with Mark Ferrier

Built to Sell Radio

Play Episode Listen Later Sep 29, 2023 50:29


Mark Ferrier built the marketing agency TRAFFIKGROUP to more than $2 million of EBITDA before it was acquired by the private equity group Onex in an eight-figure exit. In this first of a two-part interview, Mark shares the story of how he got started in the marketing agency world and how a rift with his former partners left him on the wrong end of a $2 million lawsuit.

Fused Marriages w/ Tristin & Michael
Can You be a Plus One x Tristin & Michael Colter l Fused Marriages

Fused Marriages w/ Tristin & Michael

Play Episode Listen Later Sep 18, 2023 30:23


Are you and your partner approaching the relationship with the same level of intensity? Does ego play a part in finances, responsibilities or plans? How do we handle alpha personalities and what each partner needs? Let's talk about it!
QOTD: What is an alpha female or an alpha male, and how do you perceive the term? How do they or have they influenced your relationship?
Thanks for listening!
=================================================
Follow Us!
Want to Watch?: https://bit.ly/Fused_Marriages
https://www.instagram.com/fusedmediagroup
https://www.instagram.com/mc2colter/
https://www.instagram.com/tristinraquel/
website: https://www.fusedmarriages.com/
email: info@fusedmarriages.com
=================================================
About Fused Marriages: Welcome to Fused Marriages :) Real topics, real life, real issues on how to stay in or build a successful marriage. Join Tristin & Michael for transparent conversations about marriage, family & relationships, discussing perspectives that are thought about but not talked about. We hope you've subscribed! If you haven't, please consider subscribing if you like our content. Our motto is to create incredible videos that make an impact in your life :)

Locked On Sports Minnesota
The Minnesota Football Party (9/6): Minnesota Vikings' Biggest Week One X-Factors

Locked On Sports Minnesota

Play Episode Listen Later Sep 6, 2023 33:29


The Minnesota Vikings gear up for their regular season kickoff at home versus the Tampa Bay Buccaneers Sunday. So which players will turn out to be the biggest x-factors on both teams and dictate the outcome the most? Plus, will the front office manage to get an extension done with Justin Jefferson before it's too late, and which NFC North week one matchup should fans dial into the most? It's the weekly Mailbag Edition of The Minnesota Football Party with Kare 11's Reggie Wilson. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Locked On Sports Minnesota
The Minnesota Football Party (9/6): Minnesota Vikings' Biggest Week One X-Factors

Locked On Sports Minnesota

Play Episode Listen Later Sep 6, 2023 35:44


The Minnesota Vikings gear up for their regular season kickoff at home versus the Tampa Bay Buccaneers Sunday. So which players will turn out to be the biggest x-factors on both teams and dictate the outcome the most? Plus, will the front office manage to get an extension done with Justin Jefferson before it's too late, and which NFC North week one matchup should fans dial into the most? It's the weekly Mailbag Edition of The Minnesota Football Party with Kare 11's Reggie Wilson. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Hampton Hoops Podcast
One X-Factor for every NFC team..

Hampton Hoops Podcast

Play Episode Listen Later Aug 28, 2023 49:17


On today's show, Coop and I give one x-factor for every NFC team!

Devious Remix Blends and Mixes
Jey One X Bennie Man - Onana (Devious Mashup Remix)

Devious Remix Blends and Mixes

Play Episode Listen Later Jul 19, 2023 2:38


Devious Mashup Remix

100% Reverse Bass Podcast
MKN | 100% Reverse Bass | Episode 89 (J-Trax Guestmix)

100% Reverse Bass Podcast

Play Episode Listen Later Jun 29, 2023 64:12


This month we have an epic guestmix from J-Trax. Been a huge fan of his Reverse Bass sound from the Onex & Trax days & LOVE his latest stuff! ✘

Why Tho? A Personal Journey Through my Record Collection

In today's episode Roberto Toro of Tsar Power and Benjamin Jacobs of Why Tho: A Personal Journey Through My Record Collection begin Part 1 of a two-part crossover. In this crossover Ben reviews an album chosen by Roberto, and vice versa. Those albums are One X by Three Days Grace and Hot by the Squirrel Nut Zippers. In this episode we discuss album first loves, cool school bus drivers, and the difference between a jewel case and a digipak. Today's episode was edited by Ben instead of Andrew. I am so, so sorry. For anyone curious, the $26.00 Roberto used to purchase this album in 2006 is equivalent to $38.93 in 2023 money.
One X - Three Days Grace: https://youtube.com/playlist?list=PLQwPvYVIiaGNTIh8WcBvvDI7fxzWye-Ly
Hot - Squirrel Nut Zippers: https://youtube.com/playlist?list=PL64ioV6xIjLZPWBnqWKysQwWvdyW3LN3S
Musical Candy Documentary: https://vimeo.com/100220051
Hosted on Acast. See acast.com/privacy for more information.

Why Tho? A Personal Journey Through my Record Collection

In today's episode Roberto Toro of Tsar Power and Benjamin Jacobs of Why Tho: A Personal Journey Through My Record Collection present Part 2 of a two-part crossover. In this crossover Ben reviews an album chosen by Roberto, and vice versa. Those albums are One X by Three Days Grace and Hot by the Squirrel Nut Zippers. In today's episode our dynamic duo discuss candy, heavy metal, publication rights, the MAPL system, and the proper way to cook a steak.
One X - Three Days Grace: https://youtube.com/playlist?list=PLQwPvYVIiaGNTIh8WcBvvDI7fxzWye-Ly
Hot - Squirrel Nut Zippers: https://youtube.com/playlist?list=PL64ioV6xIjLZPWBnqWKysQwWvdyW3LN3S
13th Floor Elevators: https://youtube.com/playlist?list=PLiN-7mukU_RHffmSQiUY-wrIPJJu5Ppzn
Gary Glitter (Glam Rocker): https://youtu.be/2r8Ac7JtEaU
Led Zeppelin - Ramble On: https://youtu.be/LzGBQerkvWs
Led Zeppelin, sounding more metal - Whole Lotta Love: https://youtu.be/HQmmM_qwG4k
Evanescence, doing the contrasting verse chorus thing: https://youtu.be/3YxaaGgTQYM
70s Classic Metal
Black Sabbath: https://youtu.be/5s7_WbiR79E
Alice Cooper: https://youtu.be/jXZcJojTucg
Judas Priest: https://youtu.be/WtuoFv4dcwM
Iron Maiden: https://youtu.be/X4bgXH3sJ2Q
80s Glam Metal
Kiss: https://youtu.be/iZq3i94mSsQ
Bon Jovi: https://youtu.be/KrZHPOeOxQQ
Motley Crue: https://youtu.be/d2XdmyBtCRQ
Poison: https://youtu.be/j2r2nDhTzO4
80s and 90s Underground Metal Experiments
Motorhead: https://youtu.be/3mbvWn1EY6g
Metallica: https://youtu.be/6xjJ2XIbGRk
Rollins Band: https://youtu.be/0oB1HT7C9kI
Faith No More: https://youtu.be/ZG_k5CSYKhg
Primus: https://youtu.be/LBQ2305fLeA
Red Hot Chili Peppers: https://youtu.be/QpHj0IdN4DA
Nirvana: https://youtu.be/PbgKEjNBHqM
Pearl Jam: https://youtu.be/aDaOgu2CQtI
Nu Metal
Korn: https://youtu.be/jRGrNDV2mKc
Limp Bizkit: https://youtu.be/JTMVOzPPtiw
Sugar Ray: https://youtu.be/ZOv09aNPYek
Alt Metal
Nickelback: https://youtu.be/T3rXdeOvhNE
Puddle of Mudd: https://youtu.be/I0-lENIRHaM
Hosted on Acast. See acast.com/privacy for more information.

Disruptive Successor Podcast
Episode 121 - Leading with Grit and Grace - Ashleigh Walters' Journey from a Family-Owned Business to an Employee-Owned Business

Disruptive Successor Podcast

Play Episode Listen Later Apr 19, 2023 39:11


In this episode of The Disruptive Successor Show, Jonathan chats with Ashleigh Walters, a business executive with a proven track record of leading transformational change. Ashleigh is the author of Leading with Grit and Grace, and she turned around a 50-year-old industrial furnace manufacturing and service business owned by her husband's family.
During her tenure at Onex, Ashleigh became the president and led the effort to turn the company into an employee-owned ESOP business. She attributes part of her success to her coach-approach leadership style, which came about because, as she herself admits, she did not know everything when she started leading. Ashleigh left the company in December 2022 to become the Chairperson.
The ESOP allowed the employees to buy the company without having to come up with any money. The employees funded the purchase of the business through the tax savings they made, since they were no longer taxed on the income of the business. Ashleigh explains that the ESOP culture allowed for more employee engagement, which only increased after the employees became owners.
HIGHLIGHT QUOTES
ASHLEIGH: Ownership created employee engagement and self-policing
"I think that culture piece is really important because now you have owners in the company. And so what happened was they just had a stronger engagement. Once they became owners, they took even more pride and responsibility and wanted to make things better and they wanted to learn more about the business and how they could impact different pieces."
"But the other thing that happened is we started seeing what I'll call policing within the employees. And so if you weren't doing what they felt like you should be doing, you were told to pick up the pace. We're all owners, we're in this together. And if you're not going to pick up the pace, get out of here."
ASHLEIGH: Do a gut check about doing something differently
"Anytime I have that gut check, I'm like, ah, I don't like that. Then I find another way. For instance, performance appraisals and yearly annual reviews. I got rid of them. I did a gut check. I thought my managers hate doing them. People hate having them done. They hate being rated. It creates chaos and turmoil in our organization, and I thought, why are we talking about the past? That's how we came to the coach approach. Why aren't we looking to the future?"
Connect with Ashleigh and get her book: LinkedIn | Website | Amazon
If you enjoyed today's episode, please subscribe, review, and share with a friend who would benefit from the message. If you're interested in picking up a copy of Jonathan Goldhill's book, Disruptive Successor, go to the website at www.DisruptiveSuccessor.com

TOXIC SICKNESS RADIO SHOWS & LABEL RELEASES
ACCESS ONE / X-TINCTION PODCAST ON TOXIC SICKNESS / APRIL / 2023

TOXIC SICKNESS RADIO SHOWS & LABEL RELEASES

Play Episode Listen Later Apr 9, 2023 45:56


ACCESS ONE / X-TINCTION PODCAST ON TOXIC SICKNESS / APRIL / 2023 by TOXIC SICKNESS OFFICIAL

Хоба!
113. Changing jobs in Sweden • Learning Turkish • A bot for the nearest coffee shops in Yerevan • And a little about coffee

Хоба!

Play Episode Listen Later Mar 12, 2023 39:00


Video version: https://youtu.be/efdBDkkMkf4
Mentioned in the episode:
— Busuu, https://www.busuu.com
— Babbel, https://www.babbel.com
— Onex, https://onex.am/ru
— The liberica coffee variety — https://trends.rbc.ru/trends/green/63ab0c409a7947546cee7231
— James Hoffmann — a YouTube channel about coffee, https://youtube.com/@jameshoffmann
— Lilly Drip, https://www.youtube.com/watch?v=2DY_075VQBE
Participants in this episode:
— Adel Mubarakshin, https://t.me/exarg
— Vanya Zvyagin, https://t.me/brokitchen
— Daler Aliyorov, https://t.me/dalerblog
❤️ Thanks to our patrons!
- Vadim Galushko
- Sailor Moon
- Борóвский Богдан
- Firdavs Murodov
- ультрасобаки
- Pavel Ustimenko
- antonpimnev
- Yurik Gagarin
- Бака! Подкаст об аниме
- Я из Назарово
- Чарли Свон
- afreddison
- Альпáка
- Тот, кто покинул Омск
- Дмитрий Логвинов
- Грачик
- karina dudka
- Артем Шевченко
- Whisper
- ПермякCоленыеУши
- Максим Сафонов
- Артур Балицкий
- Раньше я был другой АК
- Савл Апостолов
- el_en
- Длинное имя, которое не помещается в табличку
How to reach us:
— Boosty, https://boosty.to/hoba/
— Telegram channel, https://t.me/hoba_channel
— Bot for your messages, https://t.me/hobacast_bot
— Email, hobacast@gmail.com
— VK group, https://vk.com/hobacast

Generation One
How to LOVE Well (The Heart of a Servant) - Generation One x Travis Doodles

Generation One

Play Episode Listen Later Mar 8, 2023 53:30


Generation One Family, G.O. Squad, what's good?! Welcome back to another podcast episode. Today we are joined by our good friend and brother  @TravisDoodles  for a conversation on kindness, serving and having the right heart posture behind what you're doing. Travis is someone who has exemplified incredible acts of kindness and the love of Jesus to all he encounters, and has shared incredible, uplifting content capturing these moments. Let us know how this conversation inspires you, we would love to hear from you. Like, Comment and Subscribe for more weekly Gen One content. Love you! JOIN OUR GROUP CHAT: https://discord.gg/bnNhfFk6a4 GET PLUGGED IN HERE: https://linktr.ee/GenOne_ PARTNER WITH GENERATION ONE THROUGH GIVING: https://pushpay.com/g/tphlosangeles?fnd=xPsRXtmqqy0-Sa_GfFLbqQ&fndv=Hide&r=No&rcv=False&lang=en&src=qrcode Follow on all social media (@generationoneclips) & subscribe to our Generation One YouTube channel for additional content. https://www.youtube.com/GenerationONE

Heart of Dating
HOD Select: The One, X Factor, Ghosting, Love at First Sight, The Friendzone, and So Much More with JP Pokluda

Heart of Dating

Play Episode Listen Later Feb 22, 2023 64:41


JP and Kait discuss issues such as "the one" and the "x factor" as it relates to his book "Outdated."  Want to join the Singles Ministry your church doesn't have while getting access to monthly masterclasses? Join TSA today!  https://thesinglesacademyhod.com/plans/224595?bundle_token=aac55bc380a323b776655e1b717c0ef6&utm_source=manual Want to WATCH the podcast? We're now on YouTube!  https://www.youtube.com/channel/UCJ1PswEXEyeSddMmOSiRKGw Crushing on a cutie? Download this FREE Resource on how to show interest: https://www.heartofdating.com/resource/how-to-show-interest Want to further your dating knowledge? Check out our ultimate dating library! https://www.heartofdating.com/resource/ultimate-dating-library Kait wrote a book! Snag Thank You For Rejecting Me on Amazon: https://amzn.to/3E59cLQ Want to meet some epic Christian Singles? Join our huge HOD Family on FB: https://www.facebook.com/groups/heartofdatingpodcast Come hang with us on the gram: http://instagram.com/heartofdating http://instagram.com/kaitness . . . . .  A quick thank you to some of our friends! Faithful Counseling: Our #1 resource for affordable, reliable, Christian therapy. You can get 10% off your first month by going to http://faithfulcounseling.com/heartofdating Compassion International: Do you have a burning desire to be a parent but feel stuck in singleness? Do you want to make a lasting, powerful impact in your life as a single? We are a proud partner of Compassion International. Our community of singles has sponsored hundreds of kids all around the world, and we'd love to invite you to join us on this compelling mission. http://compassion.com/heartofdating

The MMA Hour with Ariel Helwani
Petr Yan, Julianna Peña, Demetrious Johnson, Ryan Garcia, Dricus Du Plessis, and more

The MMA Hour with Ariel Helwani

Play Episode Listen Later Apr 4, 2022 223:50 Very Popular


Ariel Helwani and Eric Jackman around (13:47) discuss this past weekend's WrestleMania 38 and more. GC and Helwani around (40:20) look back at WrestleMania 38 bets and ahead to early action for UFC 273. Demetrious Johnson around (1:00:59) discusses his nickname issues, what it was like competing at ONE X, his special rules bout against Rodtang, what made the difference in the fight, if he's getting a title rematch, dealing with time zones, when he might return, Deiveson Figueiredo vs. Brandon Moreno 4, and more. Ryan Garcia around (1:38:39) discusses his anticipated return this Saturday night, his new tattoos, if he might not return after taking his break, what happened with Canelo Alvarez, why he picked Joe Goossen as a trainer, what he learned from the Luke Campbell fight, Emmanuel Tagoe's comments, and more. Petr Yan around (2:06:00) discusses his rematch against Aljamain Sterling at UFC 273, what has changed since his last fight, Aljamain Sterling's recent comments, Sterling's latest video, if Henry Cejudo and Sean O'Malley will be in his corner, who might be next if he wins, and more. Dricus Du Plessis around (2:27:53) discusses his difficulty getting an opponent at UFC 273, why he thinks Kelvin Gastelum dropped out of the fight, what he wants to do next, his nickname, and more. Julianna Peña around (3:00:13) discusses why she went to WrestleMania 38, Ronda Rousey's legacy, why she wants to fight Rousey, contract negotiations for her rematch with Amanda Nunes, what TUF coaching was like against Nunes, and more. If you or someone you know has a gambling problem, crisis counseling and referral services can be accessed by calling 1-800-GAMBLER (1-800-426-2537) (IL/IN/MI/NJ/PA/WV/WY), 1-800-NEXT STEP (AZ), 1-800-522-4700 (CO/NH), 888-789-7777/visit http://ccpg.org/chat (CT), 1-800-BETS OFF (IA), 1-877-770-STOP (7867) (LA), 877-8-HOPENY/text HOPENY (467369) (NY), visit OPGR.org (OR), call/text TN REDLINE 1-800-889-9789 (TN), or 1-888-532-3500 (VA). 21+ (18+ NH/WY). Physically present in AZ/CO/CT/IL/IN/IA/LA/MI/NH/NJ/NY/OR/PA/TN/VA/WV/WY only. New customers only. Min. $5 deposit required. Eligibility restrictions apply. See http://draftkings.com/sportsbook for details. Learn more about your ad choices. Visit podcastchoices.com/adchoices

The MMA Hour with Ariel Helwani
Paul Heyman, Aljamain Sterling, Kai Kara-France, Matt Brown, Yoshihiro Akiyama, and more

The MMA Hour with Ariel Helwani

Play Episode Listen Later Mar 30, 2022 258:44 Very Popular


Paul Heyman around (16:44) discusses life not being by Brock Lesnar's side anymore, the new version of Brock Lesnar for the WWE audience, Roman Reigns, AEW's MJF, his old promos, WrestleMania 38, and more. Yoshihiro Akiyama around (43:40) discusses his recent big win over Shinya Aoki at ONE X, how he bounced back, what division he plans on fighting in the future, how his wife feels about his nickname, and more. Matt Brown around (1:05:52) discusses his performance at UFC Columbus, if he thought he won the decision, what he could have done differently, his thoughts on open scoring, fighter pay, and more. In the latest On the Nose around (1:35:07): the fighter he would pick to be the face of a new promotion, inspirational fighter quotes, Nate Diaz vs. Khamzat Chimaev, the biggest fight for you at UFC 273, who would win a 1:1 battle with GC, his response to Graham Boylan, and more. Aljamain Sterling around (2:29:02) discusses his upcoming title defense at UFC 273, his preparation, why Petr Yan is upset, his issue with refs, his training camp, the odds for the fight, if Matt Serra is back in his corner, if he's Team Will Smith, and more. Kai Kara-France around (3:05:21) discusses his UFC Columbus win, his current contract situation with the UFC, why he wants to fight Deiveson Figueiredo, Joe Burrow, a recent meeting with one of his childhood bullies, and more. GC and Helwani around (3:44:47) look at GC's best bets at WrestleMania 38. If you or someone you know has a gambling problem, crisis counseling and referral services can be accessed by calling 1-800-GAMBLER (1-800-426-2537) (IL/IN/MI/NJ/PA/WV/WY), 1-800-NEXT STEP (AZ), 1-800-522-4700 (CO/NH), 888-789-7777/visit http://ccpg.org/chat (CT), 1-800-BETS OFF (IA), 1-877-770-STOP (7867) (LA), 877-8-HOPENY/text HOPENY (467369) (NY), visit OPGR.org (OR), call/text TN REDLINE 1-800-889-9789 (TN), or 1-888-532-3500 (VA). 21+ (18+ NH/WY). Physically present in AZ/CO/CT/IL/IN/IA/LA/MI/NH/NJ/NY/OR/PA/TN/VA/WV/WY only. New customers only. Min. $5 deposit required. Eligibility restrictions apply. See http://draftkings.com/sportsbook for details. Learn more about your ad choices. Visit podcastchoices.com/adchoices

MORNING KOMBAT WITH LUKE THOMAS AND BRIAN CAMPBELL
What Demetrious Johnson vs. Rodtang PROVED | Morning Kombat Extra Credit Ep. 18

MORNING KOMBAT WITH LUKE THOMAS AND BRIAN CAMPBELL

Play Episode Listen Later Mar 29, 2022 28:26 Transcription Available


Luke Thomas is back with Episode 18 of Morning Kombat Extra Credit. Luke breaks down a couple of fights from UFC Fight Night: Blaydes vs. Daukaus and ONE X that he didn't get to on episode 282 of Morning Kombat. (1:45) - Demetrious Johnson vs. Rodtang (9:13) - Adriano Moraes vs. Yuya Wakamatsu (13:00) - Yoshihiro Akiyama vs. Shinya Aoki (17:05) - Alexa Grasso vs. Joanne Wood (20:32) - Chris Gutierrez vs. Danaa Batgerel (24:40) - Honorable Mentions. 'Morning Kombat' is available on Apple Podcasts, Spotify, Stitcher, Castbox, Google Podcasts, Bullhorn and wherever else you listen to podcasts. For more Combat Sports coverage subscribe here: youtube.com/MorningKombat. Follow our hosts on Twitter: @BCampbellCBS, @lthomasnews, @MorningKombat. For Morning Kombat gear visit: morningkombat.store. Follow our hosts on Instagram: @BrianCampbell, @lukethomasnews, @MorningKombat. Learn more about your ad choices. Visit megaphone.fm/adchoices. See omnystudio.com/listener for privacy information.

MMAjunkie Radio
Ep. #3246: Oscars FC result & ONE X results, Khabib wants Colby blacklisted, more

MMAjunkie Radio

Play Episode Listen Later Mar 29, 2022 69:43


On Episode 3,246, the guys had two big cards to go over in ONE X and the UFC, plus the Oscars had a 'match' of their own as well.

You're Welcome! With Chael Sonnen
Comparing McGregor and Paddy the Baddy, ONE Championship Preview & Blaydes v. Daukaus Prediction

You're Welcome! With Chael Sonnen

Play Episode Listen Later Mar 25, 2022 53:43


Chael begins today's show by discussing comparisons between Conor McGregor and Paddy Pimblett and whether the comparisons are even worth making. Then, Chael talks about Tom Aspinall's rise, tomorrow's highly anticipated ONE X event from ONE Championship (22:35), as well as tomorrow's UFC heavyweight main event between Curtis Blaydes and Chris Daukaus (42:07).

Oral Sessions with Renée Paquette
121. Rich Franklin

Oral Sessions with Renée Paquette

Play Episode Listen Later Mar 24, 2022 50:33


Rich Franklin went from the Octagon to the C-suite and lived to tell the tale on an all-new episode of The Sessions! The former UFC Middleweight Champion turned ONE Championship executive goes deep on his uncomfortable relationship with fame, transitioning to an executive role without ever meeting his producer, and what goes into creating an MMA-infused traveling show. Plus, find out what separates the upcoming ONE X card from anything else on the MMA market. Don't forget to subscribe to The Sessions podcast, and follow Renee and The Volume on Twitter for the latest updates. Also subscribe to Renee's YouTube channel, and check out FanDuel for the best wagering and daily fantasy action! Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

The Schaub Show
Episode 278: Feat. Jon Anik | McGregor Wants Usman | UFC London RECAP

The Schaub Show

Play Episode Listen Later Mar 21, 2022 101:58 Transcription Available


Jon Anik joins Brendan to talk UFC London, Tom Aspinall's dominating performance, Paddy Pimblett and Meatball Molly McCann stealing the show, Jon Jones vs Stipe Miocic, Khamzat Chimaev's rise, this weekend's Curtis Blaydes vs Chris Daukaus and more. Also, Brendan breaks down and makes picks for ONE Championship's ONE X, featuring John Wayne Parr's retirement fight, Demetrious Johnson vs Rodtang in a special rules fight, Angela Lee's title fight vs Stamp Fairtex and more. We also discuss current events including Chael Sonnen saying Israel Adesanya needs to fight Colby Covington, Adesanya and Darren Till beefing, Conor McGregor wanting to return to fight Kamaru Usman for the 170 belt, Sean Strickland going off on Khalil Rountree and much more. Join us on Supercast: https://thicccboy.supercast.com/ See omnystudio.com/listener for privacy information.
